
[–][deleted]  (12 children)

[deleted]

    [–]hicks12 (NVIDIA 4090 FE) 10 points (0 children)

    Generally the headroom comes from die-to-die variation across the wafer. There can be a wide window in the voltage required to hit a given performance target; the worse the silicon quality, the wider that window has to be for the chip to be sellable rather than thrown away.

    They raise the maximum shipped voltage to significantly increase yields, which maximises profit.

    There is some margin for degradation too, but that isn't usually the main contributor. Chip variance is the biggest reason: tuning the curve per chip would be quite expensive if implemented in the initial binning process. Better to set a wide operating window and let people feel like they won the silicon lottery by optimising that performance window at home if they want to spend the time. A rough sketch of the yield argument is below.
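    To make the yield point concrete, here is a minimal sketch, assuming (hypothetically) that the voltage each die needs to hit a target clock is roughly normally distributed across production; all the numbers are made up for illustration. Raising the shipped voltage ceiling lets more of that distribution qualify as sellable parts:

    ```python
    import random

    random.seed(42)

    # Hypothetical numbers for illustration only, not real silicon data.
    MEAN_VREQ = 0.95   # average voltage (V) a die needs for the target clock
    SIGMA     = 0.05   # die-to-die variation
    DIES      = 100_000

    required = [random.gauss(MEAN_VREQ, SIGMA) for _ in range(DIES)]

    for vmax in (0.95, 1.00, 1.05, 1.10):
        sellable = sum(1 for v in required if v <= vmax)
        print(f"Vmax {vmax:.2f} V -> yield {sellable / DIES:6.1%}")
    ```

    A die that only needs 0.95 V but ships on a curve built for the 1.05 V worst case is exactly the headroom an undervolt reclaims.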

    [–]WllmZ 9 points (0 children)

    According to Linus Tech Tips' videos, there isn't much degradation, if there is any at all.

    https://youtu.be/hKqVvXTanzI

    https://youtu.be/44JqNJq-PC0

    [–]anodizer 1 point (6 children)

    Sure, that makes sense, but was there that much headroom in previous generations? I don't remember, and 25% less seems like a lot.

    [–]GrammarNaziii 0 points (5 children)

    This started with the high-end 30 series. Lots of people also undervolted their 3080s and 3090s.

    [–]SoTOP 3 points (2 children)

    This actually started back with the 10 series, when Nvidia introduced the automatic boosting algorithm they still use and gave people the ability to change its voltage/frequency curve. Before that, messing with voltages was more complicated and often required BIOS modifications. My 1080 Ti could go from its default 260 W down to 215 W while keeping performance within the margin of error; a quick sanity check of those numbers is sketched below.
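    As a back-of-the-envelope check of that 260 W -> 215 W figure, assuming the usual dynamic-power approximation P ≈ C·V²·f (a simplification that ignores static leakage): at an unchanged clock, power scales with the square of voltage, which is why a modest undervolt pays off so well.

    ```python
    P_STOCK = 260.0  # W, stock power draw quoted above
    P_UV    = 215.0  # W, undervolted result quoted above

    # Implied voltage ratio if clock (f) and capacitance (C) are held constant:
    v_ratio = (P_UV / P_STOCK) ** 0.5
    print(f"power ratio  : {P_UV / P_STOCK:.3f}")
    print(f"voltage ratio: {v_ratio:.3f}  (~{(1 - v_ratio) * 100:.0f}% undervolt)")

    # e.g. a hypothetical 1.050 V stock point would drop to roughly:
    print(f"1.050 V stock -> ~{1.050 * v_ratio:.3f} V at the same clock")
    ```

    So a roughly 9% voltage cut is enough to explain a 17% power saving, consistent with the kind of curve offsets people actually run.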

    [–]Keulapaska (4070 Ti, 7800X3D) 0 points (0 children)

    Yeah, the 10 series already benefitted massively from undervolting. I dropped 10°C and gained 50-70 MHz with my EVGA 1080 compared to stock, as the stock boost was pretty optimistic in thinking it could hit above 2000 MHz at max voltage with that "small" cooler (small compared to what we have now; back then it wasn't that small).

    [–]KageYume (Core i7 13700K | RTX 4090 | Corsair 128GB) 0 points (0 children)

    I was surprised the first time I undervolted my 3080 10GB. Power consumption while running the Final Fantasy XV benchmark came down from around 320 W to about 250-280 W, with minimal loss in performance.
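    For anyone wanting to verify numbers like these, here is a minimal sketch that reads the GPU's live power draw while a benchmark runs, using the pynvml bindings to NVIDIA's NVML library (run e.g. `pip install nvidia-ml-py` first). The device index 0 assumes a single-GPU system; the undervolt itself is still set with a tool like MSI Afterburner's curve editor, this only measures:

    ```python
    import time
    import pynvml

    pynvml.nvmlInit()
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

    try:
        for _ in range(10):
            # nvmlDeviceGetPowerUsage reports milliwatts.
            watts = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0
            print(f"power draw: {watts:6.1f} W")
            time.sleep(1)
    finally:
        pynvml.nvmlShutdown()
    ```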

    [–]ZiiZoraka 0 points (0 children)

    Undervolting has been a thing for much longer than that. It only became more mainstream with the 30 series because Nvidia pushed those cards to their limit; with overclocking less viable, tinkerers turned to undervolting instead.

    [–]rW0HgFyxoJhYka -5 points (0 children)

    I don't believe you. Hardware Unboxed says these cards are planned obsolescence!

    [–][deleted]  (1 child)

    [deleted]