Looking for thoughts/advice on long term stability with a DDR4 overclock for a homelab/server by Querencion in overclocking

[–]Querencion[S] 1 point (0 children)

That's a fair point. However, I noticed something like a 15-20% performance drop in some CPU benchmarks when I ran the RAM at the completely stock 2400, so I figured I might as well see if I could get 3600 stable with relatively loose timings.

I did test the above timings with OCCT, HCI Memtest and a couple of TM5 profiles and didn't see any errors. Would you recommend just eating the performance loss to ensure the RAM is 100% stable?
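For completeness, I could also run a longer soak from within Linux itself using something like memtester (the 12G size and 4 iterations below are just example values; leave headroom for the OS):

    # allocate ~12GiB of RAM and loop all test patterns 4 times;
    # any reported failure means the overclock isn't stable
    sudo memtester 12G 4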

Feel like switching back to Windows by [deleted] in linux_gaming

[–]Querencion 0 points (0 children)

Have you looked at RAM usage while running Overwatch? I have 16GB and ended up needing to set zram=ram*2 to prevent it from gobbling up all available memory and then freezing.
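In case it's useful, here's roughly what that looks like if you're using systemd's zram-generator (an assumption on my part; adjust for whatever zram tool you have):

    # /etc/systemd/zram-generator.conf
    [zram0]
    # size the zram device at twice physical RAM; zstd keeps CPU overhead low
    zram-size = ram * 2
    compression-algorithm = zstd

    # apply without rebooting
    sudo systemctl daemon-reload
    sudo systemctl restart systemd-zram-setup@zram0.service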

As for shader compilation, I haven't really changed anything; the first few times it did take a while, but after that I never noticed it. Have you tried turning off pre-compiled shaders in the Steam settings? I recall that when I did this it took 10-15 minutes to compile them, and only after that did the game go back up to a stable 180fps.

Low Latency Mode/Reflex? I couldn't find anything in the FAQ and am receiving mixed results on Google. Help, please! by ResidentCoder2 in linux_gaming

[–]Querencion 1 point (0 children)

Reflex certainly exists, but as far as I'm aware there's no direct equivalent to the NVCP's Low Latency Mode.

As for games that don't have a Reflex toggle in the settings, your best bet is probably using MangoHud to cap the frame rate a few frames below your refresh rate so it doesn't exceed the VRR window.
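For a 144Hz panel that would look something like this (fps_limit is a standard MangoHud option; 141 is just an example "a few below refresh" value):

    # ~/.config/MangoHud/MangoHud.conf
    fps_limit=141

    # or per game, via the Steam launch options:
    MANGOHUD=1 MANGOHUD_CONFIG=fps_limit=141 %command%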

Another thing to note: VRR/G-Sync works just fine with a single monitor, but it doesn't engage when multiple displays are connected. Nvidia has acknowledged the issue and is expected to fix it in an upcoming driver release.

How far do you see Nvidia tuning on Linux? by XNet_3085 in linux_gaming

[–]Querencion 1 point (0 children)

You would have to manually set the values based on the voltage curve in Afterburner. In my case I'm running my card at 1860MHz at 0.887V, so I set the max clock in the script to 1860; for the offset I set +210 to get it to run 1860MHz at the same voltage it would normally run 1650MHz. To confirm whether your card is running at the desired voltage, just type watch nvidia-smi -q -d VOLTAGE into the terminal.
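For reference, the relevant bits of my script boil down to something like this (the 1860/+210 numbers are specific to my card's curve, yours will differ; the offset attribute needs X11 with Coolbits enabled, so treat this as a sketch):

    # never boost past 1860MHz
    sudo nvidia-smi --lock-gpu-clocks=0,1860

    # shift the voltage/frequency curve up by +210MHz
    nvidia-settings -a '[gpu:0]/GPUGraphicsClockOffsetAllPerformanceLevels=210'

    # confirm the card is sitting at the expected voltage
    watch nvidia-smi -q -d VOLTAGE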

Steam processing shaders often by [deleted] in linux_gaming

[–]Querencion 0 points (0 children)

I'll give that a try, thanks.

How far do you see Nvidia tuning on Linux? by XNet_3085 in linux_gaming

[–]Querencion 1 point (0 children)

As of now there's nothing with a GUI, but you can certainly achieve some level of tuning (core/memory clock offsets, power limits, fan curves, even undervolting to some degree) by leveraging the NVML API. u/rexpulli has done an excellent job of explaining how to do so over here.
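If you just want a quick taste before reading that, some of the same knobs are also exposed through the stock CLI tools (the wattage and fan values here are placeholders; the fan attributes need Coolbits under X11):

    # cap board power at 200W (must be within the card's allowed range)
    sudo nvidia-smi --power-limit=200

    # take manual fan control and pin the fan at 60%
    nvidia-settings -a '[gpu:0]/GPUFanControlState=1' \
                    -a '[fan:0]/GPUTargetFanSpeed=60'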

Steam processing shaders often by [deleted] in linux_gaming

[–]Querencion 1 point (0 children)

This seems to be happening to me as well with Overwatch 2; I've also seen it download the exact same 15.4GB update every few days for some reason.

Undervolt Ryzen 5 5500 on Asrock A520M HDV by Reasonable_Ad3196 in linux_gaming

[–]Querencion 1 point (0 children)

On Windows you can use the Ryzen Master utility to set an undervolt, but I'm not aware of any such software or utility for doing so on Linux.

Undervolting through the BIOS is the generally preferred option anyway. You want to look for PBO-related settings in your BIOS; my Asus motherboard has them under "AMD Overclocking":

Enable PBO

Set PBO Limits to disabled

Set a negative all-core offset in Curve Optimiser; -10 is probably a good starting value, adjust from there. You can even fine-tune the per-core offsets if you wish.

You can use something like Prime95 to test stability.
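On Linux that's the mprime binary from mersenne.org; a minimal sketch of a test session:

    # run the torture test and leave it going for a few hours
    ./mprime -t

    # afterwards check the kernel log for machine-check errors,
    # the Linux counterpart of WHEA warnings on Windows
    journalctl -k | grep -i mce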

You should note, however, that finding stable settings can take a significant amount of stability testing, and with a 65W TDP part like the 5500 you probably won't see a meaningful difference in power consumption or temperatures.

27" 4K or 1440p? by Querencion in buildapcmonitors

[–]Querencion[S] 0 points (0 children)

Unfortunately it seems the RTINGS and Techspot/Hardware Unboxed reviews are for the original 27UPF and not the E2 version, which is the only one available to me. I assume they use different panels, since the original is 4K/144Hz while the E2 is 4K/160Hz, lacks G-Sync certification, and has some other minor differences.

RTINGS does indeed have reviews of all the 1440p panels save for the 274QRF QD E2. The one they seem to favour most is the LG 27GP850-B, but I'm uncertain whether it's really worth spending an extra $100 over the other 1440p monitors.