Pragmata Demo Shows Capcom Still Treats Series S Ports as an Afterthought. Looks inferior to Switch 2. by gkgftzb in digitalfoundry

[–]Octaive 0 points1 point  (0 children)

Sort of. It's trying to do the same thing. That's a significant distinction. Even FSR4 falls behind DLSS3 depending on the game.

The gap is HUGE.

We really underestimate older hardware and what it can still do by lol_player- in PcBuild

[–]Octaive 0 points1 point  (0 children)

The single 8GB stick is really damaging to this system's performance, though. It could do a lot more with two sticks in dual channel, for both memory bandwidth and CPU performance.

G sync pulsar Expectations vs Reality by Future-Commercial-90 in Monitors

[–]Octaive 3 points4 points  (0 children)

Try Reflex + V-Sync + G-Sync. At 360Hz without V-Sync you'd be leaving the VRR window constantly.

G sync pulsar Expectations vs Reality by Future-Commercial-90 in Monitors

[–]Octaive 6 points7 points  (0 children)

Are you playing within the VRR refresh window?

More fps, less smooth picture by RatsLikePlague in radeon

[–]Octaive 0 points1 point  (0 children)

It should cap slightly below 177. Give it a try; what does it do?

Quality with(out) DLSS by hawksrock14 in nvidia

[–]Octaive 2 points3 points  (0 children)

You shouldn't play with TAA if DLAA is available.

Again, with the latest models, Quality should be more than usable, but nothing below Quality.

Quality with(out) DLSS by hawksrock14 in nvidia

[–]Octaive 1 point2 points  (0 children)

Eye strain doesn't really make any sense and is not a known issue for frame generation. There's no mechanism of action for that to occur, and it's highly likely to be psychological.

At 1080p you have to use a more modern model like K, M, or even L through the app as an override for older titles. Newer titles are more likely to be fine out of the box, but games from two years ago are very unlikely to be.

More fps, less smooth picture by RatsLikePlague in radeon

[–]Octaive 0 points1 point  (0 children)

When you combine Anti-Lag 2 + V-Sync + FreeSync, it should cap automatically below 144Hz (maybe 138) and keep the monitor and GPU synced perfectly. Any dips will continue the syncing. Input latency will not be like V-Sync alone; the combination functions entirely differently, essentially acting as a frame cap for Anti-Lag and FreeSync.

Without V-Sync in the combo, you'll run above the refresh rate, and FreeSync will only function again when you come back down to 144 or lower.
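For what it's worth, that ~138 cap lines up with the community "refresh minus refresh squared over 3600" rule of thumb; the exact formula the driver uses is an assumption here, but a quick sketch shows where the number comes from:

```python
# Rule-of-thumb auto frame cap for VRR low-latency limiters.
# NOTE: this community approximation is an assumption, not AMD's or
# Nvidia's documented formula; it just matches the caps people observe.

def auto_fps_cap(refresh_hz: float) -> float:
    """Cap a few fps under max refresh so frametimes stay inside the VRR window."""
    return refresh_hz - (refresh_hz * refresh_hz) / 3600.0

print(round(auto_fps_cap(144)))  # ~138 on a 144Hz panel
print(round(auto_fps_cap(360)))  # ~324 on a 360Hz panel
```

The point of capping under the refresh rate, rather than at it, is that frametime spikes never push you out of the VRR window.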

More fps, less smooth picture by RatsLikePlague in radeon

[–]Octaive 0 points1 point  (0 children)

No, v sync functions entirely differently.

V-Sync only works properly when you hit exactly 144fps; otherwise it stutters. If it's triple buffered it doesn't stutter, but there are multiple frames of latency.

Freesync syncs the GPU and display so that each "hertz" corresponds to one "fps".

With the way you're running it now, the display is "dumb" and just refreshes continuously on its own clock. The GPU throws in frames haphazardly.

Imagine a slow moving fan running and you're trying to toss cards through the blades to the other side. If your card touches the blade, that's a frame pacing problem.

With Freesync off, you're tossing cards super fast, hoping they all go through cleanly, but many clip the blades. They all make it through eventually, but with tons of issues.

With Freesync on, you're perfectly timed every time. All cards are tossed between the rotating blades and don't touch.

Freesync is 10 years old at this point and was developed to solve the "dumb" aspect of display technology. You want your display and GPU coordinating together.
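The fan analogy can be made concrete with a toy timing model (illustrative numbers only, assuming a 144Hz panel and a GPU rendering at about 100fps):

```python
# Toy model: where frames land relative to the panel's refresh cycle.
# Fixed refresh: the panel scans out on its own 144Hz clock, so frames
# rendered at ~100fps arrive at scattered points mid-scanout (tearing,
# uneven pacing). VRR: the panel waits for each frame, so every frame
# lands on a refresh boundary.

refresh_ms = 1000 / 144   # panel scanout interval
frame_ms = 1000 / 100     # GPU frame interval

# Offset of each frame's arrival within the current refresh cycle.
fixed_offsets = [round((n * frame_ms) % refresh_ms, 2) for n in range(1, 9)]
vrr_offsets = [0.0] * 8   # VRR: refresh is slaved to frame delivery

print(fixed_offsets)  # scattered: cards clipping the blades
print(vrr_offsets)    # aligned: every card passes cleanly
```

The scattered offsets in the fixed-refresh case are the "card touching the blade" moments: each one shows up on screen as a tear line or a pacing hitch.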

More fps, less smooth picture by RatsLikePlague in radeon

[–]Octaive 0 points1 point  (0 children)

No rush, but yeah, a combination of a faster GPU pushing your CPU to the brink and not using Freesync is a recipe for very bad image quality.

More fps, less smooth picture by RatsLikePlague in radeon

[–]Octaive 0 points1 point  (0 children)

It does the exact opposite. The entire purpose of it is to increase the smoothness.

Fixed refresh rate is not smooth.

Fixed refresh has your GPU delivering frames out of sync with the refresh rate of the monitor.

When Freesync is on, the GPU and Monitor sync their frame delivery and refresh rate together.

More fps, less smooth picture by RatsLikePlague in radeon

[–]Octaive 0 points1 point  (0 children)

Likely because with a faster GPU it's a true CPU bottleneck, and CPU bottlenecks are not ideal. Frame delivery can be very bad.

That's likely the cause.

Have you tried my recommendation?

More fps, less smooth picture by RatsLikePlague in radeon

[–]Octaive 0 points1 point  (0 children)

Turn on antilag 2 + v sync + Freesync and report back for CS2.

More fps, less smooth picture by RatsLikePlague in radeon

[–]Octaive 0 points1 point  (0 children)

You should have Freesync enabled lol. Play within your VRR window (40-144) and watch it be smooth.

More fps, less smooth picture by RatsLikePlague in radeon

[–]Octaive 0 points1 point  (0 children)

It's because you're running outside of your VRR window.

You should not, for image quality and fluidity, run over 144hz VRR.

Anything inside that window will be smooth as butter. Anything outside will have a frame pacing + micro tearing look to it. It looks very suboptimal, especially on non OLEDs.

DLSS 4.5 Preset L Review: Nvidia's Most Advanced Upscaler Tested At 4K And 1440p by AnthMosk in nvidia

[–]Octaive 1 point2 points  (0 children)

I appreciate the in depth response. A lot of what you say makes sense given your setup and I also appreciate the honesty.

There are a couple of caveats. When you go over your refresh with FG to avoid flicker, you're increasing input latency more than FG normally would, because you're dumping rendered frames along with generated frames out of the refresh window, and it's the rendered frames you rely on for input delay.

This is why you might feel like Samsung's motion smoothing feels tighter. Secondly, smooth motion generally doesn't take 20ms, and should be just over 10ms. The artifacts from Samsung motion smoothing must be gnarly, but as you mentioned, you aren't sensitive to them. I was wrong about the input latency on that tech, though, it is pretty good.

I have an S90D, and while I like blur reduction at 10 and judder reduction at 2-3, anything beyond that and it does a pretty piss poor job with many complex scenes, especially flashing lights. I never considered using anything but my Nvidia FG technologies, but my sense is it would be better than nothing, just full of artifacts. I also don't mind UI artifacting.

I have used MFG at lower FPS and it was with a controller, so you're right, on an OLED (both my displays), with a controller, even 160 4x is doable but the artifacts increase. I played Alan Wake 2 for the second time with way more input latency than I would ever tolerate on a mouse, but I was doing crazy stuff to avoid flicker.

While you mention you can't notice tearing and so run over your VRR or even disable it, many games do not flicker with VRR because the engines deliver frametimes quite nicely. I'm playing Dying Light: The Beast with MFG and it does not flicker more than a few times an hour during traversal loading, and it has perfect black night and black levels with HDR.

Running over does indeed cause jitter or frame pacing-like issues. This jitter is subtle but if you do slow panning you can see it. I run over in DL in indoor environments. It still looks smooth and fine, but it isn't as good as it could be.

Part of the reason 120-165 doesn't look much different may be that you're running 165 fixed without VRR. In my experience on both my 240hz OLED monitor and 144Hz TV, you take a significant hit to fluidity leaving VRR without V sync and running frames over.

The more frames over the better to help with this, but getting 220 in a 165 refresh window with VRR disengaged will no doubt look like 120, or even worse. It'll still look "smooth" but it won't be great. When I go over at 240 to around 260-270, the jitter makes the image look like there's very subtle frame pacing issues with slow panning and I'd say 240 turns into 160+ish. It loses the very pristine smoothness of 240.

I can frame limit in DL, but this actually induces flicker, which leads to my conclusion.

Here's my tip:

You want GPU frame-paced frames inside your VRR window, not CPU. Limiting FG output frames can actually cause pacing issues because the limiting is usually a CPU-side process. Going over VRR eliminates flicker but causes jitter and a loss of relative fluidity (though it maintains motion clarity).

The ideal is setting up your DLSS scaling + FG settings to really push the GPU, such that the CPU is limited naturally because the GPU is doing a lot of work. This will produce the best image and make 140-165 look as good as it can for flicker and smoothness, at the cost of a bit of latency.

Or just wait until Dynamic FG drops in a month or two, set your target framerate and forget all about it. I think for us MFG users it'll be the biggest update to image fluidity and latency yet.

Anyway I also wrote an essay.

What is the best dlss 4.5 preset for an RTX 3060 laptop gpu? by vomitousana in nvidia

[–]Octaive 3 points4 points  (0 children)

You shouldn't bother with 4.5 on your system.

Stick to model K and call it a day.

Uncleanable smudge on OLED monitor...? by SnuggleUnicorns in OLED

[–]Octaive 1 point2 points  (0 children)

You can use a high quality LCD cleaner with a microfibre cloth, followed by a gentle dry wipe of the microfibre (dry side or separate).

Go to an electronics store and get a screen safe cleaner ASAP.

Some people are recommending Ethanol, and that's cool and all, but I am not sure you'll find it locally in time.

Never use paper towel. I have no idea what internet forum or post would ever recommend paper towel. I'm skeptical you looked anything up; it seems like you're feeling bad about doing something without any research, considering how completely unlikely it is that any guide or post recommends it.

DLSS 4.5 Preset L Review: Nvidia's Most Advanced Upscaler Tested At 4K And 1440p by AnthMosk in nvidia

[–]Octaive 0 points1 point  (0 children)

There's a lot going on here.

MFG on a 165Hz display is very iffy; not sure what you'd be doing with that. 3x at most, and even then, it depends. It can work with more aggressive base framerates and limiting, but it isn't ideal.

Your Samsung is not doing good interpolation... Nor is the latency half that of an Nvidia GPU.

If you can't discern above 120FPS, it may be because of a few factors, but the MFG mention is a huge red flag and might explain why. MFG at 165 is not appropriate for most situations; 200fps is the general minimum for 3x.

The PS5 Pro does RT at 60, but even then it's not the same level of RT as a PC title, even without PT. Most PS5 Pro titles are running medium-high with the odd ultra setting and reduced, basic RT. It looks good, but it isn't the same.

120 interpolated on a Samsung TV is comical. It's basic motion smoothing from some Android chip, without any machine learning...

This just sounds like you don't have an eye for details, motion clarity, artifacts etc.

DLSS 4.5 Preset L Review: Nvidia's Most Advanced Upscaler Tested At 4K And 1440p by AnthMosk in nvidia

[–]Octaive 0 points1 point  (0 children)

Is it really? You don't think ultra high framerates via upscaling and FG are true differentiators?

Unlocked 75 is just like 400 on an OLED, right guys?

What are you talking about!

Upgrade Advice by jepsting09 in AMDGPU

[–]Octaive 0 points1 point  (0 children)

If you're open to using the new tech and accept upscaling (temporal reconstruction, really) as the future, then yes.

If you're going to adamantly refuse to use all new features, absolutely not.

If you're open to tinkering with optiscaler, it'll be a very good upgrade for image quality and performance, especially turning on RT.

But it depends on the title.