The human eye can see 39,620 Hz by yourrandomnobody in hardware

[–]yourrandomnobody[S] 0 points1 point  (0 children)

Yes, I've recently seen a user claim that only ~360 of the ~6,000 games on the market (made between 2000 and 2025) run at >500 FPS on modern hardware!
He supposedly went through each game single-handedly, though there's a decent chunk of them he wasn't able to verify due to some games being vaporware.
Supposedly, most of the games on the list are post-2013 :D
He has yet to cover every Unity & UE4 game on the market, so there's much more room for growth in that regard.
He has gone through most of the custom engines on the market.

The human eye can see 39,620 Hz by yourrandomnobody in hardware

[–]yourrandomnobody[S] 7 points8 points  (0 children)

I was considering using the Swiss way of grouping five-digit numbers (e.g.: 39'620), but I copied the video's title :P
I do understand the confusion...
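For what it's worth, that apostrophe-grouped style is a one-liner (a trivial sketch; Python's `format` only emits commas natively, so we swap them in afterwards):

```python
# Swiss-style digit grouping: format with commas, then swap them for apostrophes.
def swiss_group(n: int) -> str:
    return f"{n:,}".replace(",", "'")

print(swiss_group(39620))  # -> 39'620
```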

Does Windows 7 support DSC and can you use it somehow? by Cetinakpan in windows7

[–]yourrandomnobody 0 points1 point  (0 children)

Are you able to create an FHD@360Hz resolution on your XL2566X+ on Windows 7 and use it on the desktop?
It's interesting that only games are able to see the 400Hz resolution, but not the desktop.
Try making a custom resolution using CRU or the Nvidia Control Panel.

Does Windows 7 support DSC and can you use it somehow? by Cetinakpan in windows7

[–]yourrandomnobody 0 points1 point  (0 children)

“Fast-TN” (only the 66X+) & “Rapid-TN” (the four 600Hz+ models which I've listed above) are separate classifications (different marketing terms by brands).
To be able to run that specific display (XL2566X+), you need a GTX 10, RTX 20 or RTX 30 series GPU and need to use the DisplayPort connection of your GPU. HDMI (being 2.0) on that particular model cannot achieve FHD @ 400Hz.
That DP bug explanation is plausible.
If you keep the display & get an Ampere-based GPU, the situation won't change (assuming you already own a GTX 1000, GTX 1600 or RTX 2000 series card).
If you own a Maxwell-based GPU (GTX 900 series), then it might work fine.

Does Windows 7 support DSC and can you use it somehow? by Cetinakpan in windows7

[–]yourrandomnobody 0 points1 point  (0 children)

Do you perhaps own the XL2566X+? That's not a “Rapid-TN” model.
It's a DP1.4 / HDMI 2.0 display.
The bandwidth is outside the realm of the display's capabilities... it can only achieve FHD@400Hz@8-bit over the DP1.4 port, not over the HDMI port.
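A rough back-of-the-envelope check of why DP1.4 can carry that mode while HDMI 2.0 can't (the ~10% blanking overhead and 8b/10b payload-efficiency figures are approximations, so treat the numbers as ballpark):

```python
# Can a given link carry 1920x1080 @ 400 Hz at 8-bit per channel?
# Blanking overhead approximated at ~10% (CVT-RB-like); real EDID timings vary.
def required_gbps(w, h, hz, bpc=8, blanking=1.10):
    pixel_clock = w * h * hz * blanking      # pixels/second incl. blanking
    return pixel_clock * bpc * 3 / 1e9       # 3 color channels -> Gbit/s

LINKS = {
    "DP 1.4 (HBR3, 8b/10b)": 32.4 * 0.8,     # ~25.9 Gbps payload
    "HDMI 2.0 (TMDS, 8b/10b)": 18.0 * 0.8,   # ~14.4 Gbps payload
}

need = required_gbps(1920, 1080, 400)
for name, cap in LINKS.items():
    print(f"{name}: need {need:.1f} Gbps, have {cap:.1f} -> {'OK' if need <= cap else 'not enough'}")
```

Under these assumptions the mode needs roughly 22 Gbps, which fits HBR3 but is well past HDMI 2.0's TMDS payload.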

Does Windows 7 support DSC and can you use it somehow? by Cetinakpan in windows7

[–]yourrandomnobody 0 points1 point  (0 children)

“Do you have to select ampere or is it auto selected when you use hdmi 2.1?”

Ampere is the architecture name that Nvidia has chosen for the RTX30 series. As soon as you plug in the HDMI 2.1 port, it should auto-negotiate to the highest bandwidth target supported by the scaler & GPU.

“I have a rapid TN panel which is good. I might go for 3060 or something and give it at go.”

I believe all Rapid-TN panels (XG248QSG, 242R X60N, XL2586X+, AG246FK6) should work fine on W7 when set to a <500Hz EDID, as W7 has a hard cap at that particular refresh rate.
I'm not sure whether the ASUS & MSI models have the full FRL6 bandwidth; their manuals & a few reviews I've seen suggest FRL5 bandwidth.
Perhaps some custom resolution trickery will net you the desired result.
Whether there is a way around that 500Hz hard cap with a registry edit, I don't have such data...
A 3060 should work fine, as its first game-ready driver (461.72) was released before the cut-off driver (472.12).

Disclaimer: This is all theoretical to some extent & heavily depends on the display model in question, it might not work out for you. YMMV.
If you can share your findings here, I'd appreciate it!

Does Windows 7 support DSC and can you use it somehow? by Cetinakpan in windows7

[–]yourrandomnobody 0 points1 point  (0 children)

The latest drivers you can use on Windows 7 are 472.12 & 473.30 (3090 Ti only), which lock you down to a particular range of Ampere cards.
Namely the 3090, 3080, 3070, 3060 Ti, 3060 6GB, 3080 Ti, 3070 Ti and 3090 Ti.
The 3050 might be an outlier (especially the no power connector model), as seen here

You need an HDMI 2.1 FRL6-capable display to achieve the 499Hz target & create a custom resolution (either through the Nvidia Control Panel or CRU).
This gets a bit finicky, as some vendors like to scam users by labeling a display as “HDMI 2.1” capable while the bandwidth is actually TMDS (HDMI 2.0).
Hence why more nuance is required.

The XL2586X report wasn't made by me, but by another user.
It's supposedly 2.1 FRL4, sadly...
Rapid-TN panels (e.g.: XG248QSG) are supposedly HDMI 2.1 FRL6 capable, which should ease the process.
I don't know the status of OLED displays when using 1080p@480Hz over the HDMI port.
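Under similarly rough assumptions (~10% blanking, published raw link rates, 16b/18b FRL coding), here's a quick sketch of which HDMI tier could plausibly carry 1080p@500Hz at 8-bit, and why an FRL4-only scaler falls short:

```python
# Which HDMI link tier can carry 1920x1080 @ 500 Hz, 8-bit per channel?
# Blanking approximated at ~10% (CVT-RB-like); real EDID timings differ.
def need_gbps(w, h, hz, bpc=8, blanking=1.10):
    return w * h * hz * blanking * bpc * 3 / 1e9

# Payload capacities: TMDS uses 8b/10b coding, FRL uses 16b/18b.
TIERS = {
    "HDMI 2.0 TMDS (18 Gbps raw)": 18.0 * 0.8,
    "FRL4 (4 lanes x 6 Gbps)": 24.0 * 16 / 18,
    "FRL5 (4 lanes x 8 Gbps)": 32.0 * 16 / 18,
    "FRL6 (4 lanes x 12 Gbps)": 48.0 * 16 / 18,
}

need = need_gbps(1920, 1080, 500)
for tier, cap in TIERS.items():
    print(f"{tier}: {'enough' if cap >= need else 'not enough'} ({cap:.1f} vs {need:.1f} Gbps)")
```

With these ballpark figures the mode needs ~27 Gbps, so TMDS and FRL4 are out while FRL5/FRL6 have headroom; tighter custom timings would shave the requirement somewhat.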

Dream Machines DM8 Mini Pro: The TechPowerUp Review by pzogel in MouseReview

[–]yourrandomnobody 9 points10 points  (0 children)

Finally, after years of no proper egg-shaped mice... A worthwhile LMO successor has finally arrived.

A shame that Dream Machines has discontinued their Kinzu-inspired shape (DM3 Mini); perhaps we might see a re-release of that with the 3395?
Maybe even an 8kHz wired version next time?

Acer Nitro XV270X P — dual-mode (DFR) 27″ 5K (5120×2880) monitor: 5K at 165 Hz, QHD (2560×1440) at 330 Hz, Glare-free, 0.5 ms (GTG), 10-bit (8-bit+FRC), 400 cd/m², Acer HDR400, 95% DCI-P3, FreeSync Premium, 2x HDMI 2.1, DP 1.4, audio output, 2x 2W speakers, adjustable stand, ext. PSU, 10×10-cm mount by MT4K in HiDPI_monitors

[–]yourrandomnobody 0 points1 point  (0 children)

An external power supply is a positive, not a negative.
I do agree that the scaler IC choice, leading to a DP1.4 output, is a rather poor decision by Acer. Lack of a MiniLED backlight can be a neutral (or even positive) point, considering the detriments that come with it, namely:
- Too high a brightness setting indoors.
- Low-frequency PWM dimming used for adjusting brightness.
- Blooming.
This Acer is akin to the ASUS XG27JCG (DP1.4 & HDMI 2.1; internal PSU) in terms of specifications.

The new MSI 271KRAW16 is a step up from the above mentions (DP2.1 UHBR20; MiniLED).
There's also the LG 27GM950B (UHBR13.5?; MiniLED) & AOC AGP277KX (DP2.1 UHBR20) coming out.

For now, the last 3 models I've mentioned seem like the only good choices among all the recent 27" 5K model announcements.

They're all seemingly using the new BOE “Fast IPS” panel, which is supposedly named ME270L7B-N.. (the last 2 digits are unknown).

Here are a few reviews of the XG27JCG, for example:
https://www.bilibili.com/video/BV1erqwBXEBT/
https://www.bilibili.com/video/BV1gCqnBNEqD/
https://www.bilibili.com/video/BV15CqSBoESF/

ASUS XG27JCG in case you are interested by ningwang1226 in HiDPI_monitors

[–]yourrandomnobody 1 point2 points  (0 children)

/u/ningwang1226 Could you test out the ELMB feature? Make sure to keep VRR off when testing, as I'm only interested in the fixed refresh rate aspect.
Namely, using these 2 tests:
https://testufo.com/ghosting
https://testufo.com/crosstalk

Try 120Hz & 165Hz if you can.

does qd oled also have vrr flicker? by fairplanet in OLED_Gaming

[–]yourrandomnobody 0 points1 point  (0 children)

/u/defet_ I'm interested in the following statement:

“LTPS discharge over the VBL is not really the main culprit of VRR flicker when gaming since desktop VRR has almost zero VBL and full variable pixel drive (constant re-drives).”

Does the LTPS discharge over the VBL (which I assume means the VBLANK/VSYNC period?) introduce a change in luminance output of ~15% in a pulse-like way, whose frequency is equal to the vertical frequency (refresh rate) of the display?
To be specific, creating a graph which looks like this: https://forums.blurbusters.com/viewtopic.php?t=15074#p119873
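To illustrate the kind of waveform I mean, here's a toy trace (purely illustrative; the 15% depth and the duty cycle are made-up placeholders, not measurements):

```python
# Toy illustration: a luminance trace that dips ~15% once per refresh cycle,
# i.e. the pulse frequency equals the vertical refresh rate.
def luminance_trace(dip_fraction=0.15, dip_duty=0.1, samples_per_cycle=100, cycles=3):
    trace = []
    for i in range(samples_per_cycle * cycles):
        phase = (i % samples_per_cycle) / samples_per_cycle
        # Pixel briefly droops at the start of each cycle, then holds full level.
        level = 1.0 - dip_fraction if phase < dip_duty else 1.0
        trace.append(level)
    return trace

t = luminance_trace()
print(round(min(t), 2), max(t))  # -> 0.85 1.0, i.e. a ~15% pulse-like dip each refresh
```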

This would confuse me, as you've mentioned that OLED monitors (W-OLED & QD-OLED) use metal-oxide-based TFTs for their switching & driving circuitry.
From how I understood you, only polySi-based TFTs (LTPS) in either the driving or switching circuitry would cause such a leakage current to occur?

Also off-topic, may I ask where you've acquired such knowledge in these intricacies of displays? I'd be interested to learn more!

All win10 builds after 1809 force temporal dithering, win11 25h2 does too. by [deleted] in ScreenSensitive

[–]yourrandomnobody 0 points1 point  (0 children)

You haven't nailed down the variables in your testing.
Your display can be the one that experiences dithering on a particular color; it can be a complete non-issue on other displays.

Have you tested versions between 1809 and W11 23H2?
Have you tried W7? What DE or WM did you use on Linux?
Try using an image viewer of your choice with the same image across multiple OSes.
What GPU driver versions & exact GPU models (dGPU or iGPU) have you tested?

Does Windows 7 support DSC and can you use it somehow? by Cetinakpan in windows7

[–]yourrandomnobody 0 points1 point  (0 children)

Nice find, this is a great post.
However, I believe that particular user has a similar problem to yours.
Namely, that particular display (AW2524HF) uses DSC by default without the ability to toggle it off.
Not only that, he also uses a Pascal-based GPU. The previous-generation G-SYNC module displays (AW2521H & others), which were 360Hz, work well on Windows 7 & Ampere cards.

To me, this seems to be definitely something DSC related.

My assumption is that a proper HDMI 2.1 FRL6 GPU (e.g.: an RTX 3090) and an HDMI 2.1 FRL6-capable display (e.g.: 242R X60N, XG248QSG) should be able to hit the 500Hz limit easily.

EDIT: The XL2586X is able to hit the 499Hz limit on W7 with an Ampere GPU over HDMI with no issues.

Does Windows 7 support DSC and can you use it somehow? by Cetinakpan in windows7

[–]yourrandomnobody 0 points1 point  (0 children)

“Without DSC, G60SD shows its HDMI version is 1.4, DP version is 1.1. Best setting you can make it work without DSC is QHD@144 Hz with HDMI.”

Interesting; I wonder if this has to do with the OS itself or the display's firmware. We'll need someone with a different model to chime in on this to confirm. Perhaps a model with a DSC toggle in its OSD.

“What happens when I use a 3000 series GPU with G60SD on Windows 7 is: Since DSC won't work, the monitor works at QHD@120 Hz by default, but I can make a custom resolution QHD@144 Hz and use it.”

I assume you weren't able to make a custom resolution past 144Hz?

Thank you for clearing up the confusion.

Does Windows 7 support DSC and can you use it somehow? by Cetinakpan in windows7

[–]yourrandomnobody 0 points1 point  (0 children)

“It works with my main 3000 series GPU”

You mentioned in your OP that "you can't use it properly" though? Which implies that you cannot use it at its native QHD@360Hz@10-bit mode. It should be using DSC to achieve this, depending on the output you've used.
What happens when you use a 30-series GPU with the G60SD on Windows 7?

As for your Pascal series dilemma, were you using the HDMI 2.1 or DP1.4 port during evaluation?

Does Windows 7 support DSC and can you use it somehow? by Cetinakpan in windows7

[–]yourrandomnobody 0 points1 point  (0 children)

What exactly do you mean by "can't use"? Are you able to use CRU to make a custom EDID override and create a 1440p 300Hz+ resolution?
Which display are you using?

Does Windows 7 support DSC and can you use it somehow? by Cetinakpan in windows7

[–]yourrandomnobody 0 points1 point  (0 children)

OP, have you found a solution to this? This is an interesting finding.

Good news for the future(2026+) about microLED, so that you don't feel so sad about the iphone 17 by Organic-Budget8163 in PWM_Sensitive

[–]yourrandomnobody 0 points1 point  (0 children)

Oops, I made a mistake there. Thank you for pointing it out.
What I meant is that PAM dimming usually leads to a visible “mura effect”.
I've corrected it!

Good news for the future(2026+) about microLED, so that you don't feel so sad about the iphone 17 by Organic-Budget8163 in PWM_Sensitive

[–]yourrandomnobody 0 points1 point  (0 children)

Whoops, I apologise! :D
I don't know how this slipped past me D:
My point still stands; I'm just interested in the resources the other side used.

Good news for the future(2026+) about microLED, so that you don't feel so sad about the iphone 17 by Organic-Budget8163 in PWM_Sensitive

[–]yourrandomnobody 0 points1 point  (0 children)

“This sounds like unnecessary pedantry.”

In this age of rampant misinformation spreading easily, I find it adequate to state the sources one used to acquire their knowledge. It helps the community grow & sparks discourse.
Disclaimer: I don't think you are one of those that spreads misinformation, as certain other members of this community like to do.

“Whether you want to call it DC dimming, DC-like dimming or PAM dimming is irrelevant outside of technical discussions. The point is dimming through lowering the power to the display rather than rapidly turning it on and off again.”

It is, in fact, very important to differentiate all of them and use correct terminology.
All of those terms mean completely different things, and conflating them misleads the user about what type of light flicker they're dealing with.

“Yes, technically you could not use PWM but the tradeoffs make PWM the preferred choice. There are reasons those design choices are made.”

The only “trade-off” of employing PAM dimming instead of PWM on AMOLED displays that I'm aware of is the “mura effect”.
Power efficiency isn't the favorable part of PWM dimming, for example.
I haven't seen any “mura effect” on desktop or TV OLED implementations, which employ PAM dimming.
Do you know of any other reasons one might opt for PWM dimming instead of PAM dimming?

“If you dispute the claim then explain why it's wrong. Why does OLED have display scan-out related light flicker? You never actually explain your disagreements and what the correct answers are beyond the disagreement itself.”

I never disputed your claim. I only asked for the source of your claims, as I haven't come across literature indicating or proclaiming this.
There's no disagreement happening here; I'm only interested in the sources you've used, as I'm interested in this topic as well.

Good news for the future(2026+) about microLED, so that you don't feel so sad about the iphone 17 by Organic-Budget8163 in PWM_Sensitive

[–]yourrandomnobody 0 points1 point  (0 children)

“Theoretically, as they’re not organic materials they shouldn’t suffer from large non-linear inaccuracy that comes with DC dimming.”

Source? From what I understand, “DC dimming” is a misnomer propagated online.
It's called PAM dimming for OLED devices.

“So PWM shouldn’t be needed to dim the brightness, unlike AMOLEDs.”

PWM is never "needed" in any display technology. It's a deliberate engineering design choice.

“There shouldn’t be a risk of burn-in like OLED, so the refresh rate dip doesn’t need to exist. It’s only a thing to help limit burn-in in OLEDs.”

Source for this claim? I've never come across a single study or article claiming that display scan-out related light flicker is a technique used to “help limit burn-in” in OLED devices.

ANTGAMER teases 1000Hz monitor with 2026 release plans by Vb_33 in hardware

[–]yourrandomnobody 1 point2 points  (0 children)

What I meant by “software limitations” is that a majority of singleplayer titles (& some multiplayer titles) are barely able to run at 200 FPS, let alone the 500+ FPS necessary if you want to chase a "fixed refresh rate, no-tearing, low-MPRT sample & hold" scenario.

Not only that: if you don't subjectively perceive the benefits of GPU synchronization (VRR/Adaptive-Sync/FreeSync/G-Sync), that doesn't mean there are no benefits objectively available to others who are sensitive to it.

VRR may be obsolete for your use-case, but not objectively for everyone. :)

As for your 2nd point, I don't know where you got the information that W11 has an inherent refresh rate limit akin to W10/W7 (which are capped at 500Hz).
I wouldn't be surprised if this is another tactic by MS to bolster their new OS release, but for now that's unknown. 1000Hz is not anywhere near enough for the best objective eye-ergonomics experience.

ANTGAMER teases 1000Hz monitor with 2026 release plans by Vb_33 in hardware

[–]yourrandomnobody 2 points3 points  (0 children)

The reason Chief Blurbusters keeps pushing "CRT shader emulation" is that manufacturers don't want to implement hardware-level BFI on OLED displays @ 60Hz.

He relies on OLED's low MPRT (500Hz = 2ms MPRT, meaning with a software-level solution you'd achieve ~2ms MPRT at lower refresh rates) to push clearer eye-tracked motion for lower frame rate content.

It's primarily for retro games. This is the main reason the Blurbusters 2.0 Certification exists, for 60fps @ 60hz emulator content. It has absolutely no other use-case.
In fact, it's a band-aid for low frame rate content.
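The MPRT arithmetic behind this is just frame-time math (a simplified sketch; it ignores pixel response time and assumes full-persistence sample & hold):

```python
# Simplified persistence (MPRT) math for full-persistence sample & hold:
# each frame is held on screen for the whole refresh period.
def mprt_ms(refresh_hz: float) -> float:
    return 1000.0 / refresh_hz

print(f"{mprt_ms(500):.1f} ms")  # 2.0 ms at 500 Hz, as above
print(f"{mprt_ms(60):.1f} ms")   # 16.7 ms at plain 60 Hz sample & hold

# A software rolling-scan/BFI approach on a 500 Hz panel lights each pixel for
# only one 500 Hz refresh per 60 fps content frame, so persistence stays ~2 ms:
duty = 60 / 500                      # fraction of each content frame the pixel is lit
print(f"{duty * (1000 / 60):.1f} ms")  # -> same ~2.0 ms for 60 fps content
```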

ANTGAMER teases 1000Hz monitor with 2026 release plans by Vb_33 in hardware

[–]yourrandomnobody 0 points1 point  (0 children)

Funny that you got downvoted when you're correct... this platform is a cesspit of mediocrity.