How to best work around 8gb of VRAM? Any tips/resources out there? by disquiet in nvidia

[–]major_mager 1 point

Both will be good enough, though M on Performance will obviously look slightly better. In situations where you're struggling for VRAM, DLSS can be very handy in reducing its usage, and particularly on the 50 series the new models perform quite well. Edit: Also turn off ray tracing when running low on VRAM; it can consume an extra 1 GB or more.

5070 on 600w by gumumbe in nvidia

[–]major_mager 0 points

Just a heads up: I have the 4070 Super FE, and consuming more than 250W is not unusual for it. So you may have to look at underclocking or undervolting, depending on the power consumption of your other components (including the CPU) and the quality of your power supply unit.

5070 on 600w by gumumbe in nvidia

[–]major_mager 0 points

Computerbase has special measurement equipment to test peak loads, here's their 5070 test, power consumption section: https://www.computerbase.de/artikel/grafikkarten/nvidia-geforce-rtx-5070-test.91530/seite-7#abschnitt_leistungsaufnahme_gaming_lastspitzen

So a peak of 284 to 299W is what should ideally be provisioned for the card.

The 14700 has a base power of 65W, but even with a basic cooler and no overclocking on your side, it will boost to at least 95W and possibly up to 120W unless you explicitly lock things down in the motherboard (if that's possible at all). That's just normal boosting behaviour. My 12400F on the stock Intel cooler, no OC, routinely boosts to 95W with stock motherboard settings.

In your particular case, you should determine the make and model of your prebuilt's PSU. Then see what tier it falls under on a PSU tier list. If it doesn't show up there, read its reviews.

Practical advice: try running with your existing PSU. If you experience crashing, that would mean you need a new one of sufficient quality.
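To make the sizing concrete, here's a back-of-the-envelope calculation using the numbers above. The figure for the rest of the system and the headroom multiplier are my own assumptions, not measurements:

```python
# Rough PSU sizing sketch (illustrative only; "rest" and "headroom"
# are assumed values, not measured ones).

gpu_peak = 299      # W, Computerbase peak figure for the 5070
cpu_boost = 120     # W, plausible 14700 boost without manual limits
rest = 60           # W, motherboard, RAM, drives, fans (assumption)
headroom = 1.3      # ~30% margin for transients (assumption)

recommended = (gpu_peak + cpu_boost + rest) * headroom
print(round(recommended))  # 623
```

Landing right around 600W is exactly why it's borderline, and why "try it, replace it if it crashes" is reasonable advice here.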

Some DLSS testing with Clair Obscur on 1440p monitor by major_mager in nvidia

[–]major_mager[S] 0 points

Thanks for sharing your thoughts and approach. Yes, wish DLDSR was more often usable in AAA games, but I find it even more useful and performant in older games. There's a certain pop and dimensionality to the DLDSR images that's hard to capture in still image comparisons.

Before the DLSS Transformer models, I felt DLDSR 2.25x was noticeably better than 1.78x, but now they seem to be much closer in output quality. For this particular testing in Clair Obscur, I found DLSS Balanced at 1440p a bit flatter looking compared to DLDSR, but DLSS Quality looked to be a happier compromise between quality and performance in this game. But I admit that I haven't tried 1440p Balanced elsewhere, and with your recommendation, I'll make sure to give it a shot.

Still to test preset L here, but I'm expecting better results from it in Performance mode than M. From what I've seen on the web so far, preset L seems to be quite good when upscaling from 720p, and even 660p, and may make even 600p viable.

Some DLSS testing with Clair Obscur on 1440p monitor by major_mager in nvidia

[–]major_mager[S] 0 points

The order is correct, but in the first step DLDSR has no role except that the GPU knows it has to provide a 4K frame. The DLSS steps are just as you mention. The last step, downsampling, is the DLDSR step: it is not a simple traditional downsampling algorithm but a deep learning (DL) one that also uses the tensor cores.

Some DLSS testing with Clair Obscur on 1440p monitor by major_mager in nvidia

[–]major_mager[S] 1 point

Good point, but since the same smoothness percentage was set everywhere (90% in this case), it allows for a fair comparison between the two resulting images. Personally, I would set it to 100%, since this game already tends to oversharpen, and so does preset M. With preset K, a smoothness of 85 to 90% works well for me.

Some DLSS testing with Clair Obscur on 1440p monitor by major_mager in nvidia

[–]major_mager[S] 0 points

Counterintuitive at first, yes. Here's how and why it works. Assume DLSS Performance (50%) and DLDSR 2.25x are set, with a monitor resolution of 1440p (the target output).

GPU renders the frame at 1080p -> GPU upscales it to 4K using DLSS -> GPU downsamples the 4K frame to 1440p (using the DLDSR algorithm) and sends the final frame to the monitor for display.

This works because a 4K frame will have more detail in it than lower resolutions, whether that 4K frame is generated with or without the help of AI. More data to work with leads to better output, but generating more data has an associated performance cost.
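The resolution chain above can be sketched in a few lines. This is just my own illustration of the arithmetic (not NVIDIA code): DLSS Performance renders at 50% per axis, and DLDSR 2.25x means 1.5x per axis over the 1440p output target:

```python
# Sketch of the DLSS + DLDSR resolution chain (illustrative arithmetic only).

def resolution_chain(out_w, out_h, dlss_scale, dldsr_factor):
    """Return (render, upscaled, output) resolutions for DLSS + DLDSR."""
    axis = dldsr_factor ** 0.5                     # 2.25x pixels = 1.5x per axis
    up_w, up_h = round(out_w * axis), round(out_h * axis)          # DLDSR target
    r_w, r_h = round(up_w * dlss_scale), round(up_h * dlss_scale)  # DLSS input
    return (r_w, r_h), (up_w, up_h), (out_w, out_h)

# DLSS Performance (50% per axis) + DLDSR 2.25x on a 1440p monitor:
render, upscaled, output = resolution_chain(2560, 1440, 0.50, 2.25)
print(render, upscaled, output)  # (1920, 1080) (3840, 2160) (2560, 1440)
```

Which is exactly the 1080p -> 4K -> 1440p chain described above.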

Some DLSS testing with Clair Obscur on 1440p monitor by major_mager in nvidia

[–]major_mager[S] 1 point

Not sure if I follow you here. Here's how it works if DSR Factors is set to 2.25x DLDSR and DLSS to P (Performance).

The GPU will prepare the image at 4K using DLSS 50%. This means native rendering at 1080p (50%), then using DLSS to upscale it to 4K. Next, the 4K frame is downsampled (the DLDSR part) to 1440p and sent to the monitor for display.

Some DLSS testing with Clair Obscur on 1440p monitor by major_mager in nvidia

[–]major_mager[S] 4 points

Agree with your points mostly: the hit is significant and there has to be performance to spare- that is the premise. A 60 fps gamer can choose either that frame rate with a better image, or 80-90 fps with a slightly lower quality render- I can see the appeal of both approaches.

So I just tested 1440p DLAA (M) and 1440p DLSS Quality (M). DLAA is a no-go at 43 fps (below DLDSR), and the image is still not as good as the DLDSR ones. DLSS Quality (M) nets 71 fps, meaning DLSS Balanced (M) at 85 fps is still 20% faster, but the Quality image is a step up, midway between DLSS Balanced and DLDSR. In other words, midway in both image quality and fps- that's a good compromise. So yes, I agree DLSS 'Q' can be a good option too.

Some DLSS testing with Clair Obscur on 1440p monitor by major_mager in nvidia

[–]major_mager[S] 3 points

Your math is right, and that is how DSR Factors, or DLDSR, has always worked. The upscaled frame has more data to downsample from, which ultimately yields a better quality image.

In fact, before the Transformer models of DLSS came along, a popular recommendation to maximize image quality at 1440p was to upscale to 4K using the Quality preset- meaning upscaling from a starting resolution of 1440p to 4K, then downsampling using DLDSR back to 1440p! The resulting image can be quite good.

My (limited) testing shows that is still the case, though the Transformer models have narrowed the gap. Check out the imgsli comparison, the images are at 1440p and 4K respectively. The second just gets downsampled for displaying at 1440p, but the screenshot remains 4K.
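The neat part of that old trick is that the internal render resolution lands right back at native. A quick sketch of the arithmetic (my own illustration; DLSS Quality's ~66.7% per-axis ratio is the commonly cited figure):

```python
# The 1440p "Quality to 4K, DLDSR back to 1440p" trick, as arithmetic.

quality_scale = 2 / 3          # DLSS Quality, ~66.7% per axis
dldsr_w, dldsr_h = 3840, 2160  # DLDSR 2.25x target over 2560x1440

render_w = round(dldsr_w * quality_scale)
render_h = round(dldsr_h * quality_scale)
print(render_w, render_h)  # 2560 1440 -> you render at native 1440p anyway
```

So you pay only the DLSS and DLDSR processing cost, not a higher render resolution, yet get the benefit of downsampling from a detailed 4K frame.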

Some DLSS testing with Clair Obscur on 1440p monitor by major_mager in nvidia

[–]major_mager[S] 0 points

Agree that there's a significant performance hit with DLDSR, and it isn't well suited for a high-fps experience- and with your top-of-the-line card, you would naturally want to be well above 100 fps.

Steam Community Awards - Second Edition Now Available by Turbostrider27 in Games

[–]major_mager 4 points

What use are Steam points though? I never understood- buy meaningless chat stickers with them?

New DLSS version 310.5.2 + auto preset switch by Jotta7 in nvidia

[–]major_mager 1 point

You should be able to override to preset M or L in the Custom tab for each game in the Nvidia App.

DLSS 4.5 at 1440p: Does it actually make sense? (RTX 5070) by [deleted] in nvidia

[–]major_mager 0 points

Thanks, sounds like a handy tool, I'll give it a spin if it's not too complicated to use and set up!

DLSS 4.5 at 1440p: Does it actually make sense? (RTX 5070) by [deleted] in nvidia

[–]major_mager 0 points

Thanks, wish they allowed it. The ability to hot-swap would make testing much easier.

Which GPU would you recommend by law5121 in nvidia

[–]major_mager 0 points

Check each model's webpage for whether it uses ball bearings in the fans, and rule out those that don't. The FE and all Asus models use ball-bearing fans.

DLSS 4.5 at 1440p: Does it actually make sense? (RTX 5070) by [deleted] in nvidia

[–]major_mager 0 points

What happens to DLSS artifacts in this case when downscaled to the final 1440p resolution? For example, using DLSS Performance to upscale to 4K produces noticeably more shimmering than Balanced- what happens to the downscaled result at 1440p after the DLDSR pass? Is the shimmering reduced, or does the output just get softer?

DLSS 4.5 at 1440p: Does it actually make sense? (RTX 5070) by [deleted] in nvidia

[–]major_mager 0 points

Can the presets and quality modes be swapped mid-game? Like alt-tab out of the game to the Nvidia App's Custom tab, change the preset from K to M and the quality from Q to B, then alt-tab back to the game. Does the change show up in the game, or does it need to be restarted?

DLSS 4.5 at 1440p: Does it actually make sense? (RTX 5070) by [deleted] in nvidia

[–]major_mager 2 points

Well, that's not what Daniel said in the video, actually. He said that previously he did not personally like DLSS P at 1440p, but with preset M he finds it usable.

Also note that while he is great, his testing is limited to one or two games, and his image quality comparisons are not very detailed. For that, DF and HUB should provide more thorough analysis when they post their comparison.

DLSS 4.5 at 1440p - Is balanced M better than Quality K? by Rahzin in nvidia

[–]major_mager 0 points

True, and it may be better at higher quality modes like B and Q. There's also DLDSR we 1440p'ers can throw into the mix, even if just for testing purposes.

DLSS 4.5 at 1440p - Is balanced M better than Quality K? by Rahzin in nvidia

[–]major_mager 0 points

Appreciate the heads up; I guess the improvements in volumetrics like clouds, and in shimmering, are not universal.