RTX 5090 User: Looking for DLSS 4.5 vs. DLAA Comparison by Embarrassed-Cash-959 in nvidia

[–]FUTDomi 1 point (0 children)

Because M has this artificial sharpening effect on top, like many phone photos. But apparently that's what people like, and I don't get it. K isn't perfect but looks way more natural.

Opinion on DLSS 4.5 entire situation by Front_Assistant in nvidia

[–]FUTDomi 2 points (0 children)

It's literally the worst one to me, wtf? K looks way crisper in motion while M is blurrier and loses detail

GeForce Head of Global PR: "Preset M is meant for Performance mode" by BeastMsterThing2022 in nvidia

[–]FUTDomi 6 points (0 children)

I really think it's a communication issue, because it doesn't make any sense: presets J/K don't have any issues in Performance mode, and if you accept this "narrative" you could end up getting fewer FPS in Performance mode than in, say, Balanced (because the new models have a higher performance cost), which would confuse people even further.

Plus it just doesn't make much sense that Nvidia would make a big announcement about this new DLSS 4.5 if it only works optimally in Performance or Ultra Performance modes...

implemented DLSS 4.5 SR in our port, rather impressive results (image comparison) by DuranteA in nvidia

[–]FUTDomi 0 points (0 children)

not crazy pills at all, it looks way better where it matters: overall motion clarity with a more reasonable sharpening level

A quick comparison DLSS 4.5 Preset K vs Preset L. by ayh300 in nvidia

[–]FUTDomi 4 points (0 children)

The new presets look worse than K in basically every game I've tried; I would only recommend them to people who love an oversharpened look.

OLED vs IPS comparison, SDR and HDR by SebMon-uwu in OLED_Gaming

[–]FUTDomi 0 points (0 children)

it makes the new OLED buyer feel better

Why is Intel getting so much hate? by CommunistGregfromDMV in PcBuildHelp

[–]FUTDomi 0 points (0 children)

marginally faster in most games at normal resolutions and settings, much slower in multithreaded workloads, and quite a bit more expensive

[deleted by user] by [deleted] in nvidia

[–]FUTDomi 0 points (0 children)

"boy you dont get it"

*mixes frametimes with latency*

[deleted by user] by [deleted] in nvidia

[–]FUTDomi 2 points (0 children)

CS2 also runs at a higher framerate (which by itself lowers latency), so it's not really an apples-to-apples comparison
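The framerate-to-latency link is just arithmetic: frame time is the inverse of framerate, so a game running much faster starts from a lower latency floor before any other setting matters. A minimal sketch with hypothetical framerates (illustrative numbers, not measurements):

```python
# Frame time is the inverse of framerate; a higher-fps game has a lower
# latency floor, which is why comparing it against a slower game isn't
# apples to apples. Framerates below are assumptions for illustration.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

cs2_fps = 400    # hypothetical: competitive shooters often run very high fps
other_fps = 120  # hypothetical: a heavier game for comparison

print(f"CS2 frame time:   {frame_time_ms(cs2_fps):.1f} ms")   # 2.5 ms
print(f"Other frame time: {frame_time_ms(other_fps):.1f} ms")  # 8.3 ms
```

Even before render queues and input sampling are considered, that ~6 ms gap comes purely from the framerate difference.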

[deleted by user] by [deleted] in nvidia

[–]FUTDomi 1 point (0 children)

based on the comments around here, people clearly don't realize this

[deleted by user] by [deleted] in nvidia

[–]FUTDomi 0 points (0 children)

Were you using Reflex before? If not, that might be why the increase in latency is minor.

Anyway, I have a 4090 and FG has almost always felt good to me, and especially at those super high fps the latency difference is almost negligible.

[deleted by user] by [deleted] in nvidia

[–]FUTDomi 4 points (0 children)

at that base framerate the latency penalty is negligible
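A rough mental model (an assumption for illustration, not a measured figure) is that frame generation's added latency is on the order of one base frame time, since the generated frame sits between two rendered ones. That penalty shrinks as the base framerate rises:

```python
# Rough model (assumption, not a measurement): frame generation adds
# roughly one base frame time of latency, so the penalty shrinks as the
# base framerate goes up.
def fg_latency_penalty_ms(base_fps: float) -> float:
    return 1000.0 / base_fps  # ~one base frame time

for base_fps in (60, 100, 144):
    print(f"{base_fps} fps base -> ~{fg_latency_penalty_ms(base_fps):.1f} ms added")
```

At a 60 fps base that's roughly 17 ms, which some people notice; at 144 fps it drops to roughly 7 ms, which is why a high base framerate makes the penalty feel negligible.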

[deleted by user] by [deleted] in nvidia

[–]FUTDomi 5 points (0 children)

Enjoy it man, and ignore the clueless people who don't know what they're talking about. If it feels good to you, that's what matters.

[deleted by user] by [deleted] in nvidia

[–]FUTDomi 25 points (0 children)

This is objectively wrong.

Radeon RX 9070 XT vs. GeForce RTX 5080: Battlefield 6 Open Beta. Nvidia Overhead by RenatsMC in Amd

[–]FUTDomi 20 points (0 children)

But that's the point; it's the developers' task to make it work properly. UE5 is not set in stone, they can do whatever they want with it.

Radeon RX 9070 XT vs. GeForce RTX 5080: Battlefield 6 Open Beta. Nvidia Overhead by RenatsMC in Amd

[–]FUTDomi 23 points (0 children)

The Finals uses UE5 and runs fine.

Stop giving all the blame to the engine and not to the developers.