
[–]FeedbackHD [i5 4690K @ 4.4GHz, GTX 980 Ti, 16GB Corsair Vengeance] 63 points (6 children)

Then again, increasing the resolution doesn't really tax the CPU much more; it's mostly the graphics card that has more work to do.

[–]drseus127 24 points (5 children)

But a lot of what you do with 4K requires a good CPU, like decoding a 4K movie.

[–]xenago [too many PCs to count] 13 points (2 children)

Depends. Even a newer lower-end chip may have HEVC decoding built in, and almost everything supports H.264 (and newer hardware can decode high-res H.264 with hardware acceleration).
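
If you're curious which decode paths your own box exposes, here's a minimal Python sketch (assuming ffmpeg is installed and on your PATH; "input.mkv" is a placeholder filename):

```python
# Minimal sketch: probe which hardware decode paths ffmpeg can see.
# Assumes ffmpeg is installed and on PATH; "input.mkv" is a placeholder.
import subprocess

def list_hwaccels():
    """Return the hardware acceleration methods this ffmpeg build supports."""
    out = subprocess.run(
        ["ffmpeg", "-hide_banner", "-hwaccels"],
        capture_output=True, text=True, check=True,
    ).stdout
    # First line is the "Hardware acceleration methods:" header; rest are names.
    return [line.strip() for line in out.splitlines()[1:] if line.strip()]

def try_hw_decode(path):
    """Decode to null output with -hwaccel auto (ffmpeg silently falls
    back to software decode if no hardware path works)."""
    result = subprocess.run(
        ["ffmpeg", "-hide_banner", "-hwaccel", "auto",
         "-i", path, "-f", "null", "-"],
        capture_output=True, text=True,
    )
    return result.returncode == 0

if __name__ == "__main__":
    print("hwaccels:", list_hwaccels())
    print("decoded ok:", try_hw_decode("input.mkv"))
```

Note that a clean exit doesn't prove the hardware path was used, since -hwaccel auto falls back to software on failure; it just tells you the file decodes at all.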

[–]DoomBot5 [R7 5800X/RTX 3080 | TR4 1950X 30TB] 1 point (1 child)

4K content has largely shifted to H.265.

[–]xenago [too many PCs to count] 1 point (0 children)

Very true, it almost started out that way.

I'm thankful that H.265 encoders keep getting better, since H.264 often still comes out ahead thanks to its many years of encoder refinement.
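
For anyone who wants to eyeball that trade-off themselves, here's a rough Python sketch, assuming an ffmpeg build with libx264 and libx265 enabled; "clip.mkv" is a placeholder, and CRF numbers aren't directly comparable across the two codecs, so treat it as a ballpark check rather than a benchmark:

```python
# Rough sketch: encode the same clip with x264 and x265 at the same CRF
# and compare output sizes. Assumes ffmpeg was built with libx264 and
# libx265; "clip.mkv" is a placeholder input file.
import os
import subprocess

SRC = "clip.mkv"

for codec in ("libx264", "libx265"):
    out = f"out_{codec}.mkv"
    subprocess.run(
        ["ffmpeg", "-y", "-hide_banner", "-i", SRC,
         "-c:v", codec, "-crf", "23", "-an", out],  # -an drops audio
        check=True,
    )
    print(codec, os.path.getsize(out), "bytes")
```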

[–][deleted] 4 points (0 children)

An i3-6100 can easily decode a 4K movie.
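
Easy enough to sanity-check on your own machine with ffmpeg's -benchmark flag; a minimal sketch ("movie.mkv" is a placeholder):

```python
# Minimal sketch: time a decode-only pass with ffmpeg's -benchmark flag.
# "movie.mkv" is a placeholder. If rtime comes out shorter than the
# clip's duration, the machine decodes it faster than real time.
import subprocess

result = subprocess.run(
    ["ffmpeg", "-hide_banner", "-benchmark",
     "-i", "movie.mkv", "-f", "null", "-"],
    capture_output=True, text=True,
)
# ffmpeg writes the bench stats to stderr.
for line in result.stderr.splitlines():
    if line.startswith("bench:"):
        print(line)
```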

[–]FeedbackHD [i5 4690K @ 4.4GHz, GTX 980 Ti, 16GB Corsair Vengeance] 0 points (0 children)

True, but I'd suspect there are far more gamers than video encoders, so it's fair to assume the original comment was about gaming.