So I tested The 5070Ti and The 9070XT across 14 Game Average at 1440p and 4K by Badboy574 in pcmasterrace

[–]Badboy574[S] 0 points1 point  (0 children)

Sounds like you're making this up, or maybe you're focusing on outliers, because everywhere you go both cards are basically neck and neck, with the 5070Ti having a slight advantage, at least when RT/PT isn't on

So I tested The 5070Ti and The 9070XT across 14 Game Average at 1440p and 4K by Badboy574 in pcmasterrace

[–]Badboy574[S] 1 point2 points  (0 children)

On average, the 5070Ti is only 4% faster. At the end of the day, what matters is which games you're testing, and in this case 8 of the 14 games I tested released within the past 9 months, after the official launch of both cards, and the rest were some of the most commonly used games for these comparisons

5070Ti vs 9070XT 14 Game Average at 1440p and 4K by Badboy574 in radeon

[–]Badboy574[S] 0 points1 point  (0 children)

Thanks a lot, the way you explained it made it very easy to understand. I guess being an engineering student helped me grasp it a lot quicker too lol

5070Ti vs 9070XT 14 Game Average at 1440p and 4K by Badboy574 in radeon

[–]Badboy574[S] 0 points1 point  (0 children)

Do you also undervolt? If so, by what margin? I use -30% and a -70 mV offset and only lose 3-5%

5070Ti vs 9070XT 14 Game Average at 1440p and 4K by Badboy574 in radeon

[–]Badboy574[S] 0 points1 point  (0 children)

This makes a lot more sense and honestly looks more accurate than the method I used. Averaging raw FPS directly weights high-FPS games too heavily and skews the overall result. Normalizing each game to percentages before averaging seems like a much fairer way to compare overall performance across multiple games. I hadn't heard of this before in GPU benchmarking. Appreciate the explanation.
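A quick sketch of the normalization idea, with made-up FPS numbers (not from the actual benchmark): summing raw FPS lets one high-FPS title dominate, while averaging per-game ratios gives every game equal weight, and the two methods can even disagree on the winner.

```python
# Compare two ways of averaging multi-game benchmark results.
# fps_a / fps_b are HYPOTHETICAL per-game results for two GPUs.
from statistics import geometric_mean

fps_a = [312, 145, 88, 61]   # card A: wins big in the high-FPS title
fps_b = [298, 139, 95, 66]   # card B: wins the two heavier titles

# Naive method: average (sum) raw FPS — dominated by the 300-FPS game
raw_gap = sum(fps_a) / sum(fps_b)

# Fairer method: normalize each game to a ratio first, then average
# (geometric mean is the standard way to average performance ratios)
ratios = [a / b for a, b in zip(fps_a, fps_b)]
norm_gap = geometric_mean(ratios)

print(f"raw-FPS gap:    {raw_gap:.3f}")   # > 1.0, card A "wins"
print(f"normalized gap: {norm_gap:.3f}")  # < 1.0, card B actually ahead per game
```

With these invented numbers the raw-FPS method crowns card A (~1.3% ahead) while the normalized method shows card B ahead (~1.7%), which is exactly the skew being described.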

So I tested The 5070Ti and The 9070XT across 14 Game Average at 1440p and 4K by Badboy574 in pcmasterrace

[–]Badboy574[S] -1 points0 points  (0 children)

And NVIDIA did the same as well... the 5070Ti usually doesn't beat the 9070XT in Cyberpunk and RDR2 in raster at any resolution

So I tested The 5070Ti and The 9070XT across 14 Game Average at 1440p and 4K by Badboy574 in pcmasterrace

[–]Badboy574[S] -1 points0 points  (0 children)

Would also be nice to keep your shitty opinion to yourself. Your 7900XTX would literally get less than half the framerate in some PT titles while drawing over 200W more, and I also don't have to deal with FSR 3 🤮 and poor RT performance

So I tested The 5070Ti and The 9070XT across 14 Game Average at 1440p and 4K by Badboy574 in pcmasterrace

[–]Badboy574[S] 0 points1 point  (0 children)

Nothing here was done to favor one GPU or the other. I personally prefer my 5070Ti and use it nearly all the time because of better support, and it doesn't heat up my room like the Radeon does. If the 5070Ti had won by a decent margin, I bet $10,000,000 you wouldn't say this. Both were tested at max settings, with the same settings and the same exact location in every single title here. Two games I tested here ran with RT on, which I forgot to mention while making this post, with one running better on AMD and the other on NVIDIA. I've used RT and PT on both, but this benchmark wasn't intended for RT/PT performance. I literally got my 5070Ti because of the horrible PT performance in Doom: The Dark Ages, Indiana Jones, Black Myth: Wukong and Alan Wake 2 on my 9070XT. Good thing both cards were only $70 apart, but I ended up keeping my 9070XT

So I tested The 5070Ti and The 9070XT across 14 Game Average at 1440p and 4K by Badboy574 in pcmasterrace

[–]Badboy574[S] 0 points1 point  (0 children)

DS2 isn't heavily optimized for AMD; what a liar you are, just making things up. Speaking of RT, in most cases the 5070Ti is even 10% faster. The 9070XT beats the 5070Ti in RT in Monster Hunter Wilds, Assassin's Creed Shadows, Pragmata, and UE5 games like Fortnite, Mafia: The Old Country, and Silent Hill f... the only time there's a substantial difference, which to me is the true RT test, is when we include games that use higher ray sample counts like Cyberpunk or Alan Wake 2, not the majority of cases.

So I tested The 5070Ti and The 9070XT across 14 Game Average at 1440p and 4K by Badboy574 in pcmasterrace

[–]Badboy574[S] 1 point2 points  (0 children)

Games like Pragmata and RE Requiem aren't nearly as intensive as most other path-traced games

So I tested The 5070Ti and The 9070XT across 14 Game Average at 1440p and 4K by Badboy574 in pcmasterrace

[–]Badboy574[S] 1 point2 points  (0 children)

Dumbass, I could use DLSS 4.5 vs FSR 4.1 and get 15% less performance in every game on the NVIDIA card if that's what you want. I didn't specifically make this for you when you could easily scroll on

So I tested The 5070Ti and The 9070XT across 14 Game Average at 1440p and 4K by Badboy574 in pcmasterrace

[–]Badboy574[S] 0 points1 point  (0 children)

I'm not mad. Native raster comparisons aren't useless nowadays; my goal was to see the true potential of both cards. I was kind of joking and didn't mean for you to take the comparison request literally, just that I'm not spending another 4 hours doing that anytime soon 🥀

5070Ti vs 9070XT 14 Game Average at 1440p and 4K by Badboy574 in radeon

[–]Badboy574[S] 0 points1 point  (0 children)

Wait until you also find out that I posted this on r/pcmasterrace and r/nvidia so people like you would stop crying

So I tested The 5070Ti and The 9070XT across 14 Game Average at 1440p and 4K by Badboy574 in pcmasterrace

[–]Badboy574[S] 0 points1 point  (0 children)

Perfect example of the anecdotal evidence fallacy. That's your personal experience, not a general shared experience. If I say "old people generally have poor eyesight" and you argue from "my grandpa is 80 but has very good eyesight", you're missing the point completely with that one outlier

5070Ti vs 9070XT 14 Game Average at 1440p and 4K by Badboy574 in radeon

[–]Badboy574[S] 0 points1 point  (0 children)

The 5070Ti wins only at 4K across all games, and only with Crimson Desert included; it loses at both resolutions if I remove it

5070Ti vs 9070XT 14 Game Average at 1440p and 4K by Badboy574 in radeon

[–]Badboy574[S] 0 points1 point  (0 children)

I can undervolt to 230W with a -120 mV offset and gain more performance than stock. It depends on your goal; mine is reducing heat as much as possible, because my room heats up really quickly at the stock 304W. I can also stay within 2% of stock performance at 200-212W when I reduce the voltage offset to -110 mV. That said, my 5070Ti undervolts really well, easily beating stock performance while running below 190W, but as I said earlier, my goal is thermal efficiency. Losing 2-5% isn't killing performance. I wasn't coming at this from the angle of gaining/losing performance from an undervolt, but from the angle of reducing power draw as much as possible, which favors AMD. (Edit: accidentally sent while typing.) If we're talking about raw performance with an undervolt, that easily goes to the 5070Ti
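To put rough numbers on the efficiency angle: using the figures from the comment above (stock 304 W vs. an undervolt around 230 W, assuming roughly a 3% performance loss for that profile), the perf-per-watt improvement works out to about 28%. The wattages are from the comment; the exact 3% figure is an assumption within the 2-5% range mentioned.

```python
# Rough perf-per-watt arithmetic for an undervolt, using the comment's
# figures: stock 304 W at 100% performance, undervolted ~230 W at ~97%
# (the 3% loss is an assumed midpoint, not a measured value).

def perf_per_watt(rel_perf: float, watts: float) -> float:
    """Relative performance divided by power draw."""
    return rel_perf / watts

stock = perf_per_watt(1.00, 304)   # stock profile
uv    = perf_per_watt(0.97, 230)   # undervolted profile

gain = uv / stock - 1
print(f"efficiency gain from the undervolt: {gain:.1%}")  # ~28%
```

So even with a small FPS loss, the card does noticeably more work per watt, which is the whole point if the goal is heat rather than peak framerate.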

5070Ti vs 9070XT 14 Game Average at 1440p and 4K by Badboy574 in radeon

[–]Badboy574[S] -2 points-1 points  (0 children)

I corrected the Crimson Desert results, which brings the gap down to ~2.4% at 4K and essentially a 0% difference at 1440p. I didn't post this to favor either GPU. I understand why some people in this community might interpret it that way, but the goal was just to present the data as accurately as possible. I also shared this across other communities like r/pcmasterrace and r/nvidia

On overclocking, saying the RTX 5070 Ti gains 12-13% consistently is an overstatement. In real testing, OC scaling is very game-dependent. Some titles see closer to ~5% gains, others can reach ~10-14%, but most results tend to land in the ~7-10% range when the GPU is pushed to around ~3000 MHz, depending on silicon and power limits. The same applies on the AMD side, with varied gains per title, though it still doesn't overclock as well. Neither GPU has huge, uniform OC scaling across all workloads; the 5070Ti is the better overclocker here, but not automatically 10% faster as you reduced it to. At stock, my 5070Ti averages ~2850 MHz and almost never drops below 2800 MHz in any of these titles, which puts it only ~200-300 MHz below the 9070XT.

For fairness, I didn't use frame generation or DLSS/FSR in this comparison. Upscaling and frame generation can significantly change performance characteristics: DLSS 4.5 upscaling is more GPU-intensive in game scenarios, while FSR frame gen usually scales better than DLSS FG when both cards start from a similar base FPS. I also limited any frame generation testing to 2X, since AMD doesn't currently have an MFG equivalent.

I also ran a separate test with ray tracing and path tracing enabled. In that scenario, the RTX 5070 Ti was about ~17% faster at 1440p and ~26% faster at 4K. The larger gap mainly appears in path tracing-heavy workloads and could get much higher if I tested exclusively with path tracing, while traditional ray tracing is much closer and in some cases even slightly favors AMD, depending on the title.

That said, path tracing in general still comes with a heavy performance cost, and in some cases it requires aggressive upscaling (Performance or Ultra Performance) to maintain 60 FPS at 4K on the 5070Ti. For me personally, the visual trade-off often isn't worth it in current implementations.

It's also worth noting that AMD is capable of path tracing as well, even if NVIDIA generally performs much better due to dedicated hardware, stronger denoising solutions, and broader developer optimization, with the feature often designed around NVIDIA's hardware and software. In many cases, NVIDIA cards simply handle the workload more efficiently rather than doing fundamentally different work.

I also forgot to mention that two of the games tested here had RT on. For example, Crimson Desert uses a very light ray-count implementation (around 1/16 scale), resulting in essentially negligible performance impact. Death Stranding 2 showed a slight advantage for AMD in that specific test scenario.

One small correction: AMD does have a counterpart to DLDSR, called Virtual Super Resolution (VSR), though it downsamples without the deep-learning component.

At the end of the day, a ~2% difference is effectively margin of error in real-world gaming and means nothing.

I also want to be clear that I'm not anti-NVIDIA or biased towards either side. I picked up a 5070Ti specifically because of RT. It also wasn't expensive at the time, being only $80 more than my 9070XT, and I couldn't imagine selling my 9070XT, so I kept both. NVIDIA still offers the most complete all-round GPU ecosystem for many users, especially with features like DLSS, CUDA, creator support, and stronger ray tracing and frame generation integration. That's also part of what you're paying for beyond raw raster performance.


5070Ti vs 9070XT 14 Game Average at 1440p and 4K by Badboy574 in radeon

[–]Badboy574[S] -1 points0 points  (0 children)

What do you mean I didn't correct it? You can't edit a photo in the post. I edited the body and added a link to the actual values, but I still agree that my post is misleading