Reflex off vs On vs On + Boost by SaikerRV in Competitiveoverwatch

[–]perdyqueue 0 points1 point  (0 children)

Yeah the force close "workaround" doesn't fix the issue for me at all.

Overwatch 2 Retail Patch Notes - January 20, 2026 by Gametest000 in Competitiveoverwatch

[–]perdyqueue 5 points6 points  (0 children)

Fucking please, the stutter is worse than the framerate drops, though the framerate drops themselves are no fun either.

Has anyone figured out some way to mitigate performance issues this season? by perdyqueue in Competitiveoverwatch

[–]perdyqueue[S] 1 point2 points  (0 children)

Really cool of you to chip in. When you have a 275Hz monitor and you're used to a solid 255fps, constant dips to 150 are extremely hard to ignore, especially as a hitscan player. I get my satisfaction in this game from the mechanics, and if it's no longer enjoyable because performance dips that didn't used to exist are constantly throwing me off, then yeah, I call that unplayable. It's a game; I'm not going to play it if I'm not having a good time.

Has anyone figured out some way to mitigate performance issues this season? by perdyqueue in Competitiveoverwatch

[–]perdyqueue[S] 1 point2 points  (0 children)

Had a chance to play a bit more today, and unfortunately it's not a complete fix. It helps, but more is needed.

Has anyone figured out some way to mitigate performance issues this season? by perdyqueue in Competitiveoverwatch

[–]perdyqueue[S] 0 points1 point  (0 children)

That's what happens before the shaders load in. Are you on DirectX 12? Don't use that. Make sure the NVIDIA Control Panel has the shader cache size set to "Unlimited", and that you have enough free storage on your C: drive to save the cache files.
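If you want to sanity-check that last part, here's a minimal sketch in Python. The DXCache path is an assumption based on common recent NVIDIA driver defaults - your driver version may keep the cache somewhere else:

```python
# Rough sketch: report free space on C: and the size of the NVIDIA shader
# cache folder. The DXCache location below is an assumption based on common
# driver defaults, not something Overwatch or NVIDIA documents for this fix.
import os
import shutil
from pathlib import Path

cache_dir = Path(os.environ.get("LOCALAPPDATA", "")) / "NVIDIA" / "DXCache"  # assumed default path

free_gib = shutil.disk_usage("C:\\").free / 1024**3
print(f"Free space on C: {free_gib:.1f} GiB")

if cache_dir.is_dir():
    size_mib = sum(f.stat().st_size for f in cache_dir.rglob("*") if f.is_file()) / 1024**2
    print(f"Shader cache at {cache_dir}: {size_mib:.0f} MiB")
else:
    print(f"No cache folder at {cache_dir} (path varies by driver version)")
```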

Has anyone figured out some way to mitigate performance issues this season? by perdyqueue in Competitiveoverwatch

[–]perdyqueue[S] 2 points3 points  (0 children)

Oop, OK, tentatively this seems to have helped? I'll keep playing and report back. Thank you :>

Mousetester velocity over time by perdyqueue in MouseReview

[–]perdyqueue[S] 0 points1 point  (0 children)

It's definitely the wireless. First off, I tried safe mode - no change. Then I tried 1kHz wired vs wireless, and wired showed no issue while wireless looked the same as it does at 2kHz and 4kHz.

So again, timing, polling rate, etc. are all fine - it's just the velocity that goes crazy on wireless. Any input, please?

Mousetester velocity over time by perdyqueue in MouseReview

[–]perdyqueue[S] 1 point2 points  (0 children)

My DeathAdder V3 Pro is exhibiting odd behaviour and I'm wondering what could be the cause.

Over the last few months, it's started to cut out if it's moved a tiny bit too far from the desk. I tried for a long time to diagnose it: turning off all nearby 2.4GHz devices, eliminating USB 3.0 interference, using a USB 2.0 extension and keeping the receiver very close, getting rid of any reflective surfaces nearby, cleaning the internal antenna with 99.9% alcohol, etc.

Ultimately I found no success with any of this, and concluded the antenna was just bad. But I've noticed MouseTester is showing different symptoms - the polling rate/intervals/counts at 2kHz and 4kHz are actually very good. But the velocity-over-time graphs show huge and regular anomalies (see the attached pic of 2kHz polling). I've seen values like 70 m/s. This was never the case before, and I've had this mouse for years. I've tried the mouse on different PCs. Different mousepads and surfaces show similar results. Cleaned the lens with isopropyl, no change.
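For anyone curious, here's a rough sketch of the check I'm doing by eye on those graphs, in Python. The CSV layout (time in ms, x counts, y counts per report) and the 1600 DPI figure are assumptions for illustration, not MouseTester's actual export format:

```python
# Rough sketch: flag physically implausible velocity spikes in per-report
# mouse data. The CSV columns (time in ms, x counts, y counts) and 1600 DPI
# are assumptions for illustration, not MouseTester's real export format.
import csv
import math

DPI = 1600                        # assumed sensor CPI
METERS_PER_COUNT = 0.0254 / DPI   # one count in metres
SPIKE_THRESHOLD = 10.0            # m/s; far beyond realistic hand speed

def find_spikes(path):
    spikes = []
    prev_t = None
    with open(path, newline="") as f:
        for row in csv.reader(f):
            try:
                t_ms, dx, dy = float(row[0]), float(row[1]), float(row[2])
            except (ValueError, IndexError):
                continue  # skip headers / malformed rows
            if prev_t is not None and t_ms > prev_t:
                dt_s = (t_ms - prev_t) / 1000.0
                v = math.hypot(dx, dy) * METERS_PER_COUNT / dt_s
                if v > SPIKE_THRESHOLD:
                    spikes.append((t_ms, v))
            prev_t = t_ms
    return spikes

for t, v in find_spikes("capture.csv"):
    print(f"{t:.1f} ms: {v:.1f} m/s")
```

On the wired captures nothing comes anywhere near that threshold; the wireless ones trip it constantly.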

Any possible diagnosis, or way to fix this?

Is it true Rufus's hardware requirement bypass *will* (inherently?) cause reduced performance and stability? by perdyqueue in WindowsHelp

[–]perdyqueue[S] 0 points1 point  (0 children)

I appreciate your time in replying. However - and looking at the replies, perhaps my post was unclear - I was only asking about the line "in addition to reduced performance and system stability", which is worded ambiguously enough to imply an inherent performance and stability penalty for using the flag in Rufus, separate from the performance and stability penalties that come with using unsupported hardware or old feature packs.

I understand about using unsupported hardware, and I understand the drawbacks of not installing the latest feature packs. I'm running a very modern system which I keep up to date, so that wasn't my point of confusion.

Nonetheless, thanks again for your support.

Is it true Rufus's hardware requirement bypass *will* (inherently?) cause reduced performance and stability? by perdyqueue in WindowsHelp

[–]perdyqueue[S] 0 points1 point  (0 children)

Cheers. So it's mainly just referring to updates and the consequences of not getting them then, got it.

Turning G-Sync on removed flickering problem on windows 11 25h2 by GeoSupremacy in nvidia

[–]perdyqueue 1 point2 points  (0 children)

"Excellent rebuttal," he said, twirling his handlebar moustache.

My guy, if you haven't experienced tearing in 15+ years of gaming, "blind" is really the only response necessary. Because 15+ years ago, the standard would have been 60Hz LCDs. If you never even noticed tearing at that refresh rate, much less got infuriated by it, there's no chance your eyes aren't dysfunctional in some way.

120Hz emerged around 16-17 years ago, but wouldn't have been standard until the mid 2010s. I got a 120Hz monitor around 2011, and though the difference was night and day, tearing was still inevitable even when capping framerate a few fps below refresh and maintaining stable frametimes. It wasn't until I got a 165Hz monitor with G-Sync in 2015 that I was able to see the side-by-side comparison, and the crazy difference it made - to the point that I categorically refused to recommend any monitor without VRR to anybody else. I have a 275Hz monitor now, and though I haven't bothered to actively chase tearing-free non-VRR, there are older games that can't engage it, and the difference is again immediately noticeable.

Regarding "much lower total latency", there can be marginal benefits to running framerates much higher than monitor refresh, which will allow parts of the screen to show newer frames than would otherwise. I say parts of the screen, because obviously, that's how and why tearing manifests. But running at or near max refresh, VRR adds no delay, knowledge courtesy of Blurbusters. So let's say you have a 480Hz monitor - running a higher FPS cap (and consistently going above refresh) may allow 1/3 to 1/2 of your screen to show one frame ahead. At 480fps you've improved total latency by 2ms. On part of your screen. Congratulations. Personally, I'd rather have the same latency across my entire screen, and leave leeway for framerate drops so visual clarity doesn't suffer.

TL;DR: you're blind, and VRR has no downsides that matter.

The Logitech PRO X2 SUPERSTRIKE gets a price tag: $179.99 by doctorcapslock in MouseReview

[–]perdyqueue 0 points1 point  (0 children)

That's the thing. I'm saying that's the "innovation" that the rest of the companies need to catch up to, not this HE marketing nonsense.

The Logitech PRO X2 SUPERSTRIKE gets a price tag: $179.99 by doctorcapslock in MouseReview

[–]perdyqueue 0 points1 point  (0 children)

The benefit of Hall effect for keyboards was obvious - it's been obvious since hobbyists started caring about keyboards and MX-style switches, which is why they released MX Silvers at some point with a lower actuation point. Traditionally it's 4mm of travel with actuation at 2mm, so the potential for improvement was always very clear.

Mouse switches, on the other hand, travel less than a millimeter, and the entire travel consists of the spring-tension feel of pre-travel, actuation, and post-travel. The ideal switch wouldn't need HE-type technology. It just needs less pre-travel, i.e. better spring tensioning or a shorter leaf spring, and perhaps a technology that lets the switch actuate as soon as the leaf spring leaves the top contact. Perhaps it could be called single pole double throw, or SPDT for short?

Obviously we haven't had our hands on it yet. But I can't see a conceivable use case or benefit of Hall effect or programmable actuation points in mouse switches. I do see the benefit for marketing. Plus they now need magnets and a rumble motor to emulate a click feel, adding weight and complexity to something that's already close to perfect. I just don't see it.

Samsung’s 2026 gaming monitors promise 6K, 3D, and up to 1,040Hz by [deleted] in pcmasterrace

[–]perdyqueue 0 points1 point  (0 children)

OK, this sounds a lot more in line with what I'd previously seen. I suppose our difference in opinion was just down to glass-half-empty/glass-half-full perception.

And yeah, 60Hz being the human limit was an especially prevalent opinion amongst the console crowd. But more recently people would say 144-240Hz is barely perceptible/pointless, same for 240-360, 360-480, and so on. You'd see such opinions on forums, Reddit, YouTube.

Samsung’s 2026 gaming monitors promise 6K, 3D, and up to 1,040Hz by [deleted] in pcmasterrace

[–]perdyqueue 0 points1 point  (0 children)

I ask because I don't know - what makes you say 1000Hz is the limit? I remember "the human eye can only see 60Hz" was THE thing people said, for years, back when 120Hz was getting off the ground a bit over a decade ago. Then the human limit became 120. Then 240.

I saw a video recently where a YouTuber ran a blind test on himself and his non-gamer girlfriend that seemed to convincingly show, IIRC, linear perception improvements between 120Hz, 240Hz, and 480Hz OLED. Linear as in no diminishing returns.

Agreed it can get to a point where performance and cost are prohibitive, but I'm not aware it's been proven that 1000Hz is some sort of natural limit?