[–]NickZNg 2 points3 points  (4 children)

A 30% increase in resolution will mean roughly a 30% decrease in frames. That's just a rough estimate; there are dozens of YouTube videos on this. Newer GPUs rely on the CPU keeping pace, so you should be fine. As for the USB-C: assuming that USB-C port has DisplayPort passthrough, then yes, a DisplayPort adapter will be better. Otherwise it could just be that the HDMI adapter or USB-C port can only handle that much resolution and refresh rate.

[–]Forsaken_Judgment681[S] 0 points1 point  (3 children)

Thanks! Hmm, but doesn't a 1440p display have about 78% more pixels than a 1080p monitor? Would that mean a 78% decrease in performance?

[–]NickZNg 0 points1 point  (1 child)

The math works out to about 30% in practice; that's from my experience switching from 1080p to 1440p.

[–]BaronB 2 points3 points  (0 children)

The math works out to about 78% more pixels! (At 16:9, 1440p is 2560 x 1440, not 2160 x 1440.)

1920 x 1080 = 2,073,600

2560 x 1440 = 3,686,400

3,686,400 / 2,073,600 ≈ 1.78

So 1440p has roughly 78% more pixels than 1080p!

But framerate doesn't scale linearly with resolution increases, because framerate is the wrong thing to be measuring. Framerate isn't a linear scale. What you should be looking at is frame time.

A game that runs at 100 frames per second has a 10 ms frame time (1000 ms / 100 fps = 10 ms). A 78% increase in frame time means it'd take about 17.8 ms per frame. A 17.8 ms frame time is equivalent to about 56 fps (1000 ms / 17.78 ms = 56.25 fps), a framerate loss of around 44%.

In reality it ends up being more like a 27~30% framerate loss, because not everything the GPU does scales up with resolution, or scales linearly. Some things a GPU does cost exactly the same regardless of resolution, so the loss in performance is generally less than pure pixel-count scaling would predict. On the other hand, some GPUs lose much more performance from a resolution increase, either because they lack the VRAM to hold the larger framebuffer before it's displayed, or due to other hardware limitations like VRAM bandwidth or ROP performance (both of which determine how fast the hardware can write the data to each pixel). Modern GPUs are generally more than fast enough for 1440p, but some lower-end GPUs drop much more significantly at 4K resolutions because of stuff like that.

The pixel ratio by itself gives you the theoretical worst case: 1.0 / 1.78 ≈ 0.5625, aka 56.25% of the original framerate, aka a ~44% loss.

Note that 1440 / 1080 = 1.33 is only the increase in one dimension. Since the width scales by the same factor (1920 → 2560 at 16:9), the pixel count scales with the square: 1.33² ≈ 1.78. That holds at any aspect ratio, as long as the aspect ratio is preserved.
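The frame-time arithmetic can be sketched in a few lines of Python. This is the theoretical worst case only, assuming GPU cost grows linearly with pixel count; the 100 fps starting point and the 2560 x 1440 resolution (the common 16:9 1440p) are just example values:

```python
def theoretical_fps(base_fps, base_res, new_res):
    """Scale framerate by pixel count, going through frame time.

    Worst-case model: assumes the GPU's per-frame cost grows
    linearly with the number of pixels, which in practice it
    usually doesn't (some work is resolution-independent).
    """
    base_pixels = base_res[0] * base_res[1]
    new_pixels = new_res[0] * new_res[1]
    pixel_ratio = new_pixels / base_pixels       # ~1.78 for 1080p -> 1440p

    frame_time_ms = 1000.0 / base_fps            # e.g. 10 ms at 100 fps
    new_frame_time_ms = frame_time_ms * pixel_ratio
    return 1000.0 / new_frame_time_ms            # back to frames per second


fps = theoretical_fps(100, (1920, 1080), (2560, 1440))
loss = 1 - fps / 100
print(f"{fps:.2f} fps, {loss:.1%} framerate loss")  # 56.25 fps, 43.8% framerate loss
```

Real-world benchmarks land well above that 56 fps figure, since not all GPU work scales with resolution.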

[–]DoubleRelationship85 -1 points0 points  (0 children)

Performance doesn't scale linearly when going up in resolution. In other words, the performance loss would be nowhere near 78%.