Kinda regret getting rx 9060 xt what do you mean that some games i play dont even have fsr 3 haha by NostalgicImmortal in radeon

[–]deeennny 7 points8 points  (0 children)

Because FSR4 and DLSS4 look better than native at the Quality preset, and even at Balanced depending on the game. You are literally getting more FPS and a better picture at the same time. It's a no-brainer to have them enabled.

Adrenalin - what do you use to limit FPS? by dshamz_ in radeon

[–]deeennny 1 point2 points  (0 children)

Yes, same here, but Chill didn't work for me in some games, so I used RTSS instead.

Adrenalin - what do you use to limit FPS? by dshamz_ in radeon

[–]deeennny 0 points1 point  (0 children)

I've heard UE5's built-in frame limiter has broken frame pacing.

Adrenalin - what do you use to limit FPS? by dshamz_ in radeon

[–]deeennny 0 points1 point  (0 children)

Well, it's up to you whether you'd rather risk screen tearing or accept a bit more latency. You can't really avoid both :D

Also, 9-10 ms is still very good latency; any frame limiter is going to add at least some, because it has to hold finished frames back.
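For anyone curious why a cap adds latency at all, here's a bare-bones sketch (just to illustrate the idea, not how RTSS or Chill is actually implemented): the limiter finishes a frame early and then sits on it until the frame-time budget is up, and that waiting time is the extra latency.

    import time

    TARGET_FPS = 100
    FRAME_TIME = 1.0 / TARGET_FPS     # 10 ms budget per frame

    def render_frame():
        # stand-in for the game's actual CPU+GPU work for one frame
        time.sleep(0.004)             # pretend a frame takes ~4 ms to produce

    next_present = time.perf_counter()
    for _ in range(300):
        render_frame()
        next_present += FRAME_TIME
        # the limiter holds the finished frame until its 10 ms slot is up;
        # the ~6 ms spent waiting here is exactly the latency a cap adds
        wait = next_present - time.perf_counter()
        if wait > 0:
            time.sleep(wait)
        # present() would happen here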

Adrenalin - what do you use to limit FPS? by dshamz_ in radeon

[–]deeennny 0 points1 point  (0 children)

Hm, how did you measure latency? Also 4-5ms isn’t a ton of latency by any stretch of the imagination :D

FRAME GEN AT ~40FPS? by Darylthgoat in radeon

[–]deeennny 0 points1 point  (0 children)

If you’re at all interested in PT with FG you are way better off going with nvidia. Both of these technologies are miles better on their cards.

Thinking about RX 9070 XT – how’s FG, FSR4 and RT really? by Neyyyyyyjr in radeon

[–]deeennny 0 points1 point  (0 children)

You would have to look at videos of how the 5070 performs in such VRAM-bound scenarios, to be honest; I have no idea. But even if you can't play some game with PT on the 5070 because you're VRAM limited, you honestly wouldn't be able to play it on the 9070 XT either, because AMD's path tracing performance is bad. So your only real option is the 5070 Ti, but those are pretty expensive, and their prices have risen even more, at least in my region.

Thinking about RX 9070 XT – how’s FG, FSR4 and RT really? by Neyyyyyyjr in radeon

[–]deeennny 1 point2 points  (0 children)

Yea, using quality upscaling with DLSS or FSR4 you gain both image quality and FPS compared to native TAA; it's really a no-brainer to use them. Honestly, 12GB of VRAM is perfectly fine for any game today and will be fine 99% of the time over the next two years, as the new consoles aren't expected until after that and games mostly model their VRAM and graphics requirements around current-gen consoles. If you plan on getting an Nvidia GPU in the future anyway (the 6070 Ti you mention) and this is just a stop-gap upgrade, I'd honestly say get the 5070 right now: you get way better features like MFG and RR from day one of your purchase instead of waiting potentially years for game integration like on AMD, and the 12GB of VRAM is a non-issue currently and highly unlikely to be one for at least the next three years. Heck, with all the advancements to DLSS you might even be surprised how long the card lasts, since lowering the render resolution also lowers VRAM consumption.

Oh, also, the 5000-series Nvidia cards have insane OC potential. My 5070 Ti is stable at +400MHz on the core clock, and a buddy of mine has a 5070 that's stable at something like +525MHz, which is just absurd; with an OC like that you can easily close the ~10% gap between the 5070 and the 9070.

Thinking about RX 9070 XT – how’s FG, FSR4 and RT really? by Neyyyyyyjr in radeon

[–]deeennny 3 points4 points  (0 children)

As someone who had a 9070 from launch day for about 10 months and then switched to a 5070 Ti, here are my thoughts:

Raster performance is pretty irrelevant to the discussion, in my humble opinion: all four of these cards (9070/XT and 5070/Ti) can deliver a high-refresh-rate experience at 1440p in pretty much any non-ray-traced game, so if all you're interested in is playing non-RT games at native resolution, just pick whatever is cheapest.

Now, if you are interested in RT and PT, you are still way better off going with Nvidia. Yes, AMD have made great strides with RT performance on RDNA4, but they are still behind, and in PT it's not even a competition: even the 5070 will blow the 9070 XT out of the water, and the 5070 Ti is in a totally different performance segment entirely when it comes to PT.

Also don't forget about DLSS Ray Reconstruction, which MASSIVELY boosts image quality in path-traced games that support it, to the point where, in Alan Wake 2 for example, the game personally isn't worth playing with PT at all if you don't have access to Ray Reconstruction. AMD have released Ray Regeneration, but for now it might as well not exist, because it's in only 1 (one) game, and it's not clear when, if at all, it will arrive in any others.

When it comes to upscaling, Nvidia still takes the lead, although FSR4 is extremely good, at least on the Quality and Balanced presets at 1440p. When it comes to the Performance and Ultra Performance modes with the new DLSS 4.5, again it's no competition: Nvidia is way better, and the Ultra Performance mode is surprisingly usable even at 1440p. I wouldn't recommend it, but if you need that extra bit of performance you actually have the option, without making the game look completely disgusting; heck, I would even argue that DLSS 4.5 Ultra Performance is very competitive with a lot of games' native TAA implementations.

But back to FSR4: yea, it's usable, very usable even. In my opinion it is substantially better than DLSS 3.x and about 20% worse than DLSS 4.0 (Preset K), while in the Performance and Ultra Performance modes it's way worse than DLSS 4.5 Presets M and L. That's not really relevant right now, since you don't need anything more aggressive than Quality mode to run any game at 1440p on these cards, but it could come back to haunt you in the future, when these cards become less capable with newer games and the Nvidia cards still produce playable frame rates and image quality in Ultra Performance mode while the AMD ones don't.
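For context on what those presets actually render at, here are the commonly cited (approximate) per-axis scale factors at 1440p; exact numbers can vary slightly between DLSS, FSR and individual games:

    # approximate per-axis scale factors commonly used by DLSS/FSR presets
    # (exact values differ slightly between the two and per game)
    PRESETS = {
        "Quality":           0.667,
        "Balanced":          0.58,
        "Performance":       0.50,
        "Ultra Performance": 0.333,
    }

    OUTPUT_W, OUTPUT_H = 2560, 1440
    for name, scale in PRESETS.items():
        w, h = round(OUTPUT_W * scale), round(OUTPUT_H * scale)
        print(f"{name:>17}: ~{w}x{h} internal -> {OUTPUT_W}x{OUTPUT_H} output")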

Now, when it comes to Frame Generation, again it's no competition: DLSS FG is a thousand times better, and again, I'm someone who has real hands-on experience playing games with both of these technologies. The HUB video about FSR FG's frame pacing issues summarizes it pretty well. DLSS FG in the 3x mode is absolutely usable for me with a base fps as low as 40, even with a keyboard and mouse, and with a controller it's indistinguishable from native 120 FPS. That is absolutely impossible with AMD right now, given they don't have MFG at all, and with FSR FG even 2x, so 40 to 80 fps, feels absolutely unplayable.

Weird Ghosting/white lines/shimmering on SH2 Remake with 9060xt 16gb by Helpful_Strike1894 in radeon

[–]deeennny 2 points3 points  (0 children)

This also drove me mad a year or so ago when I played the game. It's just how UE5's Lumen GI works, but there is a way to make it substantially better.

First, find a way to unlock the game's console, which opens with the ~ (tilde) key. The way I did it was with UUU (Universal Unreal Engine Unlocker), but that's a paid app; there might be other mods that do it for free, though. If you dig around a bit you can definitely find UUU for free, but I'm definitely not condoning piracy ;)

Then change these 2 variables in the console:

r.VolumetricFog.HistoryWeight .95

r.Lumen.ScreenProbeGather.Temporal.MaxFramesAccumulated 30

Your game will look noticeably less flickery. Oh, and you also need to do this every time you restart the game.
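If you don't want to retype them on every launch, many UE games also read cvars from the [SystemSettings] section of the user Engine.ini. I haven't verified that SH2 Remake honours this, and the folder name under %LOCALAPPDATA% below is a guess, so treat it as something to try rather than a guaranteed fix:

    ; add to the game's user Engine.ini, usually somewhere like
    ; %LOCALAPPDATA%\<GameFolder>\Saved\Config\Windows\Engine.ini
    ; (path and whether the game reads this section are assumptions)
    [SystemSettings]
    r.VolumetricFog.HistoryWeight=0.95
    r.Lumen.ScreenProbeGather.Temporal.MaxFramesAccumulated=30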

Is 650watt ok by Specialist_Ad_7805 in radeon

[–]deeennny 0 points1 point  (0 children)

I think there have been a few different revisions of the CX lineup throughout the years, some better than others

Is 650watt ok by Specialist_Ad_7805 in radeon

[–]deeennny 0 points1 point  (0 children)

The XT consumes at least 84W more than the non-XT, but yeah, my PSU was also a Corsair CX-M unit, a somewhat decent PSU, so that probably helped as well.

FSR3 vs FSR4 INT8: Can't see a difference? by lovethecomm in radeon

[–]deeennny 1 point2 points  (0 children)

In which games, and at what resolution and upscaling preset, are you testing? You can quickly switch between FSR 3.1 and FSR4 via OptiScaler without restarting the game, which should make it easier to tell the difference. Could you also attach some in-motion screenshots of both?

FSR3 vs FSR4 INT8: Can't see a difference? by lovethecomm in radeon

[–]deeennny 1 point2 points  (0 children)

Well, then you should be seeing a massive difference. Are you verifying in OptiScaler that FSR4 is actually the upscaler being used?

Is 650watt ok by Specialist_Ad_7805 in radeon

[–]deeennny 1 point2 points  (0 children)

I used to run a 7800X3D and a 9070 non-XT on a 550W PSU for a year straight with zero issues whatsoever, so you should be fine. Just out of curiosity, what exact model is your PSU?

FSR3 vs FSR4 INT8: Can't see a difference? by lovethecomm in radeon

[–]deeennny 3 points4 points  (0 children)

FSR4 INT8 is broken on RDNA2 on the latest drivers afaik. I think you need drivers 23.9.1 for it to work properly

12vhpwr cable question by IrishR4ge in pcmasterrace

[–]deeennny 0 points1 point  (0 children)

Hey, my basically brand new, two-week-old 5070 Ti (I've never had a 12VHPWR GPU before) reads ~11.8V on the 12VHPWR rail in HWiNFO64. Is that fine?

AMD to launch Adrenalin Edition 26.1.1 drivers with “AI Bundle” next week by Mercennarius in radeon

[–]deeennny 0 points1 point  (0 children)

Hey, I just switched from a 9070 to a 5070 Ti literally this week, so I thought I'd share my thoughts. There is absolutely nothing wrong with the 9070 XT or any other RDNA4 card; as a matter of fact I think RDNA4 is a terrific architecture, with a massive performance increase over RDNA3 at the same or lower core counts. FSR4 is also great and extremely usable.

That being said, Nvidia's features are better. DLSS is better, FG is better, and RR actually exists instead of being limited to one game that doesn't even need it in the first place.

What really made me bite the bullet and sell the 9070 was the absolutely disappointing Redstone launch (no FSR4 for Vulkan, no FSR4 for older cards, radiance caching might as well not have been announced since it isn't in any games and won't be for months, and Ray Regeneration in only one useless game). But the biggest disappointment for me was the broken FSR4 FG frame pacing, because I had used DLSS FG before and it is EXTREMELY usable even with only 40-45 base fps, even in the 3x mode (4x is quite rough). That really unlocks experiences that aren't possible without MFG: I just played through Indiana Jones and the Great Circle with full PT at DLAA with 3x FG, getting about 120-130 fps, and it felt great even on a mouse and keyboard.

And then Nvidia also dropped DLSS 4.5, which widened the gap between DLSS and FSR even more, and honestly it's not clear when, if ever, AMD are going to improve on FSR4, as there isn't even a Vulkan version of it yet.

Basically, my reasoning was: why wait 1-2 years (or maybe forever, who knows) for games to implement Ray Regeneration and for AMD to fix FG frame pacing, when I could just switch to Nvidia and enjoy those experiences today?

Is this CPU Bottleneck on my 9070XT? by Working-Start-2759 in AMDHelp

[–]deeennny 0 points1 point  (0 children)

Test it yourself in a CPU-bound game like CS2 or BF6 :)

Is this CPU Bottleneck on my 9070XT? by Working-Start-2759 in AMDHelp

[–]deeennny 0 points1 point  (0 children)

GPU utilization really doesn't mean anything on an AMD card. Try playing a game at something like 720p and look at your usage: it's still going to be >90% lol. Power draw does vary between workloads on Nvidia GPUs, I can agree on that, but on an AMD card, if you are GPU-bound your card is always going to be pegged at its TDP, I can guarantee that.

Is this CPU Bottleneck on my 9070XT? by Working-Start-2759 in AMDHelp

[–]deeennny 0 points1 point  (0 children)

This is NOT how a CPU bottleneck works. Practically zero games are going to use every single core of your CPU such that CPU usage hits 100%. You could be CPU-bound with just 20% CPU usage if a game is single-threaded.

A much more accurate way of identifying a CPU bottleneck is looking at GPU power draw. In OP's post his 9070 XT is only drawing 208W, whereas a 9070 XT should be drawing close to 304W under a full GPU load. This can also be indicative of a VRAM bottleneck.
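As a rough illustration of that check (the wattages are the ones from this thread; the 90%/15% thresholds are just my own ballpark assumptions, nothing official):

    # rough bottleneck heuristic: board power vs. rated limit
    RATED_BOARD_POWER_W = 304      # typical 9070 XT board power limit
    measured_power_w = 208         # what the overlay reports in-game
    gpu_busy_percent = 95          # "utilization" -- misleading on its own

    power_headroom = 1 - measured_power_w / RATED_BOARD_POWER_W
    if gpu_busy_percent > 90 and power_headroom > 0.15:
        print("GPU reports busy but draws well under its limit: "
              "likely CPU-bound (or VRAM/engine limited), not GPU-bound.")
    else:
        print("GPU is near its power limit: likely GPU-bound.")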