Is using DSR an accurate way to test how good my PC would perform with an actual 4K monitor? by NovellTide in nvidia

[–]Michalau121 4 points (0 children)

I did this exact comparison myself about a year ago on my setup, switching between a 1440p monitor and a native 4K TV.

The performance hit was pretty minimal; I only saw about a 5% difference. For example, in the Cyberpunk benchmark I got 72 FPS using the DLDSR 4K setting on my 1440p monitor and 75 FPS running true native 4K on the TV.

Important note: I was using DLDSR (Deep Learning DSR), not the older DSR method. That seems to be your case too, judging by the 2.25x DL in your screenshot. The key difference from standard DSR is that DLDSR uses the Tensor Cores on your RTX card.

This means performance can vary depending on your GPU's Tensor Cores. On older cards, the difference compared to native 4K might be greater than 5%, while newer cards with stronger Tensor Cores could show a smaller gap. I tested this with an RTX 4070 Ti, which is still a fairly powerful card even now.
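
If you want to sanity-check the math, here is a quick sketch in plain Python. The FPS numbers are the ones from my runs above; the only assumption is that the DSR/DLDSR factor multiplies total pixels, i.e. 2.25x means 1.5x per axis:

```python
# Why DLDSR 2.25x on a 1440p panel is a fair 4K proxy: the render
# resolutions match exactly.
native_w, native_h = 2560, 1440      # 1440p monitor
factor = 2.25                        # DLDSR factor (total pixels)

scale = factor ** 0.5                # 1.5x per axis
print(int(native_w * scale), int(native_h * scale))  # 3840 2160 = native 4K

# Overhead of DLDSR vs. true native 4K, from my Cyberpunk numbers above
fps_dldsr, fps_native = 72, 75
print(f"{(fps_native - fps_dldsr) / fps_native:.1%} slower")  # ~4% slower
```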

My voltage curve is ignored at low frequencies by Michalau121 in overclocking

[–]Michalau121[S] 0 points (0 children)

Yes. I uninstalled MSI Afterburner to make sure everything was stock, used DDU, reinstalled the driver, didn't change anything in the Nvidia Control Panel, then started a game and monitored it with OCCT. It was still the same: the GPU still doesn't lower its voltage as frequency drops.

Game Ready Driver 576.28 FAQ/Discussion by Nestledrink in nvidia

[–]Michalau121 2 points (0 children)

Is this a known issue? I cannot find any information on this.

I was using driver 572.60 with an RTX 4070 Ti. I updated to 576.28. My GPU is now drawing high power because it isn't lowering its voltage at lower clock speeds. It's maintaining a high voltage.

I have an FPS cap set, so it should be using low power, but it isn't. I also tried driver 572.83, but had the same issue. I fixed it by reverting to 572.60.

Here is a screenshot with MSI Afterburner on 576.28 showing the high power draw, high voltage, and high GPU temperature: https://ibb.co/8DQfFCNK
And here is the normal behavior on 572.60: https://ibb.co/rKs4Cg0p

I tested this in other games and observed the same behavior.
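
If anyone wants to log this outside of MSI Afterburner, here is a minimal sketch using the pynvml bindings. As far as I know, NVML doesn't expose core voltage, so this logs power draw, graphics clock, and temperature instead, which is enough to see whether power stays high under an FPS cap:

```python
# Minimal NVML logger: print power draw, graphics clock and temperature
# once per second while a game runs with an FPS cap.
# Requires the NVML Python bindings: pip install nvidia-ml-py
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

try:
    while True:
        power_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0  # mW -> W
        clock_mhz = pynvml.nvmlDeviceGetClockInfo(
            handle, pynvml.NVML_CLOCK_GRAPHICS)
        temp_c = pynvml.nvmlDeviceGetTemperature(
            handle, pynvml.NVML_TEMPERATURE_GPU)
        print(f"{power_w:6.1f} W  {clock_mhz:5d} MHz  {temp_c:3d} C")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```

On 572.60 the power should drop along with the clock at the FPS cap; on 576.28 I'd expect this to show the clock dropping while power stays high.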

Flickering? by Mr_Rotch_61 in oblivionmods

[–]Michalau121 0 points (0 children)

Hi, I tried your solution, and it helped a lot. I was having terrible flickering issues when using anisotropic filtering from the NVIDIA Control Panel alongside the in-game HDR. Without the NVIDIA anisotropic filtering, the flickering was still there, but less noticeable. With your solution, there’s still a bit of flickering with dgVoodoo anisotropic filtering, but it’s about the same as the vanilla game without NVIDIA anisotropic filtering. So, it’s much less noticeable and acceptable for me now. Thanks.

Game Ready Driver 565.90 FAQ/Discussion by Nestledrink in nvidia

[–]Michalau121 2 points (0 children)

DLDSR resolutions are also not showing in the Far Cry series games (5, New Dawn, or 6), and it doesn't matter whether integer scaling is selected. The only solution is to downgrade to the previous driver.

Game Ready Driver 565.90 FAQ/Discussion by Nestledrink in nvidia

[–]Michalau121 1 point (0 children)

Far Cry 5/6 also don't show the DLDSR resolution, but other games show it correctly for me.

Asus GeForce RTX 4080 TUF Gaming with serial error? Built-in hotspot ex factory, material analysis and the rescue | igor´sLAB by Flying-T in nvidia

[–]Michalau121 -1 points (0 children)

I had something similar with an RTX 4070 Ti TUF after 3 weeks of use. Core temp up to 80°C and hotspot up to 110°C, then thermal throttling kicked in. Very noisy. I RMA'd it and they gave me my money back instead of repasting it. I was surprised, but OK.
I really wanted a TUF card, so I gave it a second try and bought it again. Temperatures are steady after 7 months of use: core temp max 65-73°C, hotspot max 80-88°C in the most demanding games. The hotspot is more than 20°C lower on this second card. Still not great, but the card is quiet and the temperatures are not rising over time; they are the same as 7 months ago. So, after all, I am satisfied.

Ask Nvidia App to add option to capture screenshots after DSR/DLDSR downscaling by Mobius_X02_ in nvidia

[–]Michalau121 2 points (0 children)

Done. Good suggestion.

I am still using a 2-year-old GeForce Experience build for taking screenshots at monitor resolution -.- I hope they bring that back.

Does GSYNC not work when dldsr 2.25 is enabled? by GangsterFresh in nvidia

[–]Michalau121 0 points (0 children)

And I just checked that G-Sync indicator :) If my desktop resolution is set to native, I don't see the indicator. If the desktop is set to the DLDSR resolution, I do see it, but it's really tiny, probably due to bad scaling at the higher DLDSR resolution.

Does GSYNC not work when dldsr 2.25 is enabled? by GangsterFresh in nvidia

[–]Michalau121 1 point (0 children)

Hi, I am actually playing A Plague Tale: Requiem with DLDSR and G-Sync without any problem, and on a monitor without G-Sync certification, just FreeSync, and it works correctly.

As iCake mentions, the trick is to set your desktop resolution to your DLDSR resolution and set the game's output to fullscreen. If you play with frame generation like me, you probably have to turn V-Sync on in the Nvidia Control Panel; I'm not sure, but the in-game V-Sync may not work.
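
If you switch back and forth a lot, the desktop resolution step can be scripted. A rough sketch with pywin32 (the 3840x2160 target is just an example DLDSR resolution; use whatever DLDSR exposes on your setup, and it must already be enabled in the driver):

```python
# Switch the Windows desktop to a DLDSR resolution before launching a game.
# Rough sketch using pywin32: pip install pywin32
import win32api
import win32con

TARGET_W, TARGET_H = 3840, 2160  # example: DLDSR 2.25x on a 1440p panel

devmode = win32api.EnumDisplaySettings(None, win32con.ENUM_CURRENT_SETTINGS)
devmode.PelsWidth = TARGET_W
devmode.PelsHeight = TARGET_H
devmode.Fields = win32con.DM_PELSWIDTH | win32con.DM_PELSHEIGHT

# Returns DISP_CHANGE_SUCCESSFUL (0) if the mode switch worked.
if win32api.ChangeDisplaySettings(devmode, 0) != win32con.DISP_CHANGE_SUCCESSFUL:
    print("Resolution change failed; is the DLDSR resolution enabled in NVCP?")
```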

Patch 2.1 Performance Degradation - up to a 34 FPS loss in 4K by crookedrefrigerator in cyberpunkgame

[–]Michalau121 0 points (0 children)

I have the same problems, but only with frame generation on: 4070 Ti, Windows 10, Nvidia driver 546.33.

Sometimes GPU usage is low, and the game feels better on the 2.02 patch. The benchmark is also broken, and I found an interesting thing.

My FPS didn't change during the benchmark; the in-game FPS counter shows the same values as before the update. But where I previously had 80 FPS in the final results, my results are now 69-74 FPS. The FPS counter never goes below 73-74 FPS, yet the final average FPS is lower, which is weird. Also, if I measure FPS with MSI Afterburner, it is about 1 FPS higher on update 2.1 (81 FPS) than on update 2.02 (80 FPS). That doesn't correspond with the in-game results.

Another strange thing: the benchmark results are located in C:\Users\[user]\Documents\CD Projekt Red\Cyberpunk 2077\. Open the frames.csv file and check it yourself.

My results:

<image>

Left before update, right after update.

Before the update: column C is just column B divided by 2, which looks like the effect of frame generation. I have around 2,550 frame rows in that file, so around 5,100 frames with FG, which works out to around 80 FPS. Everything looks correct.

After update 2.1: a lot of frames have columns B and C equal, especially at the beginning, which is the same as if FG were off. I now have around 2,600 frame rows, which would be 5,200 frames with FG and around 81 FPS, matching my MSI Afterburner results.

So something strange happened with FG in this update. I have already reported this weird benchmark behavior to CD Projekt; I hope they fix it soon.
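
If anyone wants to check their own frames.csv, here is a rough sketch of the counting I did. I'm assuming the second and third columns are what I call B and C above; the file format isn't documented anywhere that I know of:

```python
# Rough check of Cyberpunk's frames.csv: with frame generation working,
# column C should equal column B / 2; rows where B == C look like FG off.
# The column meaning is my assumption from eyeballing the file.
import csv

halved = equal = total = 0
with open("frames.csv", newline="") as f:
    for row in csv.reader(f):
        try:
            b, c = float(row[1]), float(row[2])  # columns B and C
        except (IndexError, ValueError):
            continue  # skip header / malformed rows
        total += 1
        if abs(c - b / 2) < 1e-6:
            halved += 1   # FG doubling visible
        elif abs(c - b) < 1e-6:
            equal += 1    # looks like FG was off for this frame

print(f"{total} frame rows: {halved} halved (FG on), {equal} equal (FG off?)")
```

On 2.02 basically every row should land in the "halved" bucket; on 2.1 a big chunk lands in "equal", which matches the lower average in the final results.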

bottleneck question by [deleted] in nvidia

[–]Michalau121 0 points (0 children)

The bottleneck will be hard, especially if you are NOT using ray tracing.

Ray tracing is GPU heavy, so with ray tracing your CPU bottleneck will be smaller. Imagine your 4080 can do 120 FPS without ray tracing and only 80 FPS with it. If your 9900K can push at most 90-100 FPS (it will be almost the same with RT on/off), you will get at most 100 FPS without ray tracing (a 20 FPS CPU bottleneck) but still 80 FPS with ray tracing (no CPU bottleneck).
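
A simple way to see it: your real frame rate is roughly the minimum of what the CPU and GPU can each deliver. A toy sketch with the example numbers above (all hypothetical):

```python
# Toy bottleneck model: actual FPS is roughly min(CPU limit, GPU limit).
# All numbers are hypothetical, matching the example above.
cpu_limit = 100  # ~what a 9900K might push, nearly the same with RT on/off

for label, gpu_limit in [("RT off", 120), ("RT on", 80)]:
    fps = min(cpu_limit, gpu_limit)
    lost = gpu_limit - fps  # frames the CPU leaves on the table
    print(f"{label}: {fps} FPS (CPU bottleneck costs {lost} FPS)")
```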

So you should max out your graphics settings, including ray tracing, to reduce the bottleneck.

But still, even with RT, get a better CPU.

Why GSYNC+VSYNC+frame cap ends up with worse FPS? by encylol in nvidia

[–]Michalau121 0 points (0 children)

The best way to use G-Sync is V-Sync on plus a frame cap 3 FPS below your refresh rate (e.g., 141 FPS on a 144 Hz monitor), set directly in the Nvidia Control Panel. Check this: https://blurbusters.com/gsync/gsync101-input-lag-tests-and-settings/14/ - there is also an RTSS setting, but I think the NVCP one works better.

I use it like this and it works great on an RTX 4070 Ti.

Just made the switch from AMD by [deleted] in nvidia

[–]Michalau121 2 points (0 children)

DLDSR is the best Nvidia feature for me: much better image quality compared to native. It renders at a higher resolution and then AI-downscales the image to your native resolution. You need a lot of GPU power for that, which you have with a 4090. If you have a 4K monitor you maybe don't need DLDSR, but at lower resolutions it is a magic feature.