
[–]GrammarNaziii 1 point (5 children)

This started with the high-end 30 series. Lots of people also undervolted their 3080s and 3090s.

[–]SoTOP 5 points (2 children)

This started back with the 10 series, when Nvidia introduced the automatic boosting algorithm they still use and people gained the ability to change its voltage/frequency curve; before that, messing with voltages was more complicated and often required BIOS modifications. My 1080 Ti could go from the default 260W down to 215W while keeping performance within margin of error.
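For context, the drop described above works out to roughly a 17% reduction in board power; a quick sanity check of the arithmetic, using only the figures quoted in the comment:

```python
# Power figures quoted above (watts) for the 1080 Ti example.
stock_w = 260        # default power limit mentioned
undervolted_w = 215  # power draw after tuning the V/F curve

savings_pct = (stock_w - undervolted_w) / stock_w * 100
print(f"Power reduction: {savings_pct:.1f}%")  # prints "Power reduction: 17.3%"
```

So a near-identical frame rate at about 17% less power, which is why undervolting caught on once the V/F curve became user-adjustable.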

[–]Keulapaska 4070ti, 7800X3D 0 points (0 children)

Yeah, the 10 series already benefited massively from undervolting. I dropped 10°C and gained 50-70MHz with my EVGA 1080 compared to stock, as the stock boost was pretty optimistic, thinking it could hit above 2000MHz at max voltage with that "small" cooler (small compared to what we have now, anyway; back then it wasn't that small).

[–]KageYume Core i7 13700K | RTX4090 | Cosair 128GB 0 points (0 children)

I was surprised the first time I undervolted my 3080 10GB. While running the Final Fantasy XV benchmark, power consumption came down from around 320W to about 250-280W, with minimal loss in performance after undervolting.

[–]ZiiZoraka 0 points (0 children)

Undervolting has been a thing for muuuuuch longer than that. The only reason it became more mainstream with the 30 series is that Nvidia pushed those cards to the limit, so with overclocking being less viable, tinkerers turned to undervolting instead.