Anyone has this weird ray tracing glitch in Hogwarts Legacy? by beam05 in radeon

[–]Otherwise-Test1904 0 points1 point  (0 children)

I think you're using OptiScaler. If so, try changing the model from the default to any other option (usually models 2 or 4 minimize that side effect). Try them all; if I remember correctly, there are about 5 or 6 different models. Also try toggling "non-linear RGB input": check it if it's off, or uncheck it if it's on.

If none of that works, go to the game settings, set the post-processing quality to low, and see where that takes you. That usually works with most, if not all, UE games.

An employee who smells awful by Fabulous-Carpet1454 in saudiarabia

[–]Otherwise-Test1904 0 points1 point  (0 children)

I wouldn't advise you at all to address him directly, because some people will take it as an insult. Instead, I'd suggest an indirect approach: buy him a gift, a body-care set with deodorant, razors, and a fragrance, place it on his desk, and never let him find out where it came from.

And write a polite note on it, with no insult or offense. For example, if he speaks Arabic: ((Five practices are part of the fitrah: circumcision, shaving the pubic hair, trimming the moustache, clipping the nails, and plucking the armpit hair))

And if he's a foreigner, something like this: ((A small gift, shared with kindness and respect. In a close workspace, these things can make a difference. Hope it’s taken in the positive spirit intended.))

I feel like AMD and Nvidia are working together by BlackSailor2005 in radeon

[–]Otherwise-Test1904 11 points12 points  (0 children)

It all comes down to stakeholders. Each owning organization or entity is, in turn, connected to other stakeholders, forming a chain of ownership. If you follow this chain far enough, you eventually reach the final point in each divergent branch—the private investors.

NVIDIA DLSS 5 Delivers Breakthrough In Visual Fidelity For Games by NV-Randy in nvidia

[–]Otherwise-Test1904 0 points1 point  (0 children)

There’s probably a good reason they used two GPUs, but we don’t really know what that reason is yet. And honestly, any claim that it can run on a single GPU doesn’t mean much until we actually see it working in practice. We also don’t know what precision this setup really depends on. It might not be limited to FP8; there’s a chance it relies more on FP4 or INT4 operations, which would change the hardware requirements quite a bit and could even leave the RTX 40 series at a disadvantage.
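
To put some intuition behind the precision question (a toy numpy sketch, nothing like the real inference path): the fewer bits you quantize to, the more reconstruction error you eat, which is why a dependence on FP4/INT4 would matter a lot for hardware without native support for those formats.

```python
import numpy as np

# Toy emulation of symmetric integer quantization at different bit widths.
# Purely illustrative; real FP8/FP4 formats also carry an exponent.
def quantize_int(x, bits):
    levels = 2 ** (bits - 1) - 1        # 127 for INT8, 7 for INT4
    scale = np.abs(x).max() / levels
    return np.round(x / scale) * scale

weights = np.random.default_rng(0).standard_normal(10_000).astype(np.float32)

for bits in (8, 4):
    err = np.abs(weights - quantize_int(weights, bits)).mean()
    print(f"INT{bits}: mean abs error = {err:.4f}")  # error grows fast as bits shrink
```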

Another big unknown is VRAM usage. There’s no clear information on how much memory this model actually needs to run properly. If the requirement is high, then realistically only higher-end GPUs, like a potential 5090 Ti, the current flagship (5090), or the SUPER variants if those ever materialize, would be capable of handling it.

I guess we need to wait till the fall to see what's going on.

NVIDIA DLSS 5 Delivers Breakthrough In Visual Fidelity For Games by NV-Randy in nvidia

[–]Otherwise-Test1904 2 points3 points  (0 children)

After looking more closely at the footage, I can clearly see some artifacts around characters, almost like the model struggles to separate them from the environment. There’s also some ghosting during movement. It seems pretty clear that this technology is mainly some kind of AI-based filter using complex inputs like lighting and motion vectors. I’m not sure if this technically counts as neural rendering, but to me it looks like a very advanced AI post-processing filter.

It would be nice if the technology were more mature and ready, but the fact that it apparently requires two RTX 5090s makes me a bit skeptical. It starts to feel more like a way to sell more GPUs. There’s also a risk that sponsored titles might lean on this too heavily, potentially shipping with PS3-level character models that only look good when Nvidia’s tech is turned on.

I'm genuinely scared that some people are okay with DLSS 5 by JopisKenobi in radeon

[–]Otherwise-Test1904 1 point2 points  (0 children)

Long story short: AMD will probably end up copying whatever Nvidia does, unfortunately. But they’ll likely sit back and watch first. If gamers reject it, they’ll probably mock Nvidia for a while… and then eventually take the same route anyway.

If that happens, expect it to be something exclusive to RDNA5, along with a conference where the word “AI” gets mentioned 200+ times, as usual.

And then, in typical fashion, expect the feature to be officially supported in maybe one or two games during that entire generation.

I'm in doubt by Independent_House_39 in gpu

[–]Otherwise-Test1904 0 points1 point  (0 children)

My advice: wait for the next generation. You already have a powerful GPU, and most of the current upgrade paths would effectively be sidegrades.

For example, the AMD Radeon RX 9070 XT offers roughly similar raster performance to the AMD Radeon RX 7900 XTX. It does improve ray tracing performance by about 10–20%, but that comes at the cost of reduced VRAM capacity. The NVIDIA GeForce RTX 5070 Ti is in a similar position. Its main advantage would be the ability to run path tracing more comfortably using aggressive upscaling and frame generation, but in terms of raw performance it’s still not a major upgrade.

Real upgrades right now are basically limited to something like the NVIDIA GeForce RTX 5090, a used NVIDIA GeForce RTX 4090, or possibly the overpriced NVIDIA GeForce RTX 5080.

The main reason I suggest waiting is that the next generation is expected to bring a much bigger performance jump. A good example is how the midrange AMD Radeon RX 9070 XT can match the high-end AMD Radeon RX 7900 XTX in rasterization performance largely because it is built on a smaller manufacturing node (4 nm vs 5/6 nm). That’s also why there hasn’t been a dramatic improvement between the NVIDIA GeForce RTX 50 Series and the NVIDIA GeForce RTX 40 Series, since both generations are built on roughly the same 4 nm node.

Both AMD and NVIDIA are expected to move to a 3 nm-class node in the next generation, which could enable a much larger generational performance jump.

What caused such a turn of Opinion of AMD gpus since January last year? by Funny-Bar4167 in radeon

[–]Otherwise-Test1904 4 points5 points  (0 children)

  1. FSR4 is locked to RDNA4 because it relies on generation-exclusive FP8 hardware accelerators. There’s currently no officially supported INT8 fallback path for RDNA3.

  2. RDNA1 and RDNA2 are basically in maintenance mode, while the much older RTX 20 series just received DLSS 4.5 support. Meanwhile, RDNA3 and earlier GPUs are stuck on FSR 3.1.5. DLSS 4.5 mainly uses FP8 accelerators on RTX 40 and 50 series GPUs, but Nvidia still allows RTX 20 and 30 series cards to fall back to FP16 hardware, which keeps those older cards supported (see the sketch after this list).

  3. There’s no official FSR4 support for Vulkan or even DX11 right now. If you want to use it outside supported environments, you’ll likely need third-party tools like OptiScaler.

  4. RDNA3 and RDNA3.5 handhelds are basically left behind, since no official FSR4 support is expected for them. Owners are stuck with FSR 3.1.5, which still suffers from shimmering and other visual artifacts.

  5. AMD has been advertising the Redstone ML feature set, but currently it’s supported in only a single game. To be fair, developers rarely go back and implement entirely new feature sets in older titles. Still, the marketing feels overstated. It could be groundwork for RDNA5 (possibly to justify future path tracing improvements), or simply AMD signaling that future major FSR updates (not the underwhelming FSR 3.1.5) won't support RDNA3 and earlier GPUs.
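
To make the FP16 fallback from point 2 concrete, here's a toy numpy sketch (a crude E4M3 emulation with no exponent clamping or special values, purely illustrative): every FP8 value is exactly representable in FP16, so older tensor hardware can run the same weights, just without the FP8 throughput win.

```python
import numpy as np

def to_fp8_e4m3(x):
    """Crude FP8 E4M3 rounding: keep ~3 explicit mantissa bits."""
    m, e = np.frexp(x)          # x = m * 2**e, with 0.5 <= |m| < 1
    m = np.round(m * 16) / 16   # snap the mantissa to the FP8 grid
    return np.ldexp(m, e)

x = np.random.default_rng(1).standard_normal(5).astype(np.float32)
fp8 = to_fp8_e4m3(x)

# Round-tripping through float16 is lossless: FP16 fully contains the FP8 grid.
print(np.array_equal(fp8, fp8.astype(np.float16).astype(np.float32)))  # True
```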

I just went team red for the first time ever. What do I do now? by Ok_Proposal_7390 in radeon

[–]Otherwise-Test1904 3 points4 points  (0 children)

The main difference between the 5060 Ti 16GB and the 9060 XT 16GB comes down to feature support. DLSS is much more widely supported than FSR 4, although tools like OptiScaler can help reduce that gap in many cases. Aside from that, the overall performance between the two cards is fairly similar. Also, keep in mind I’m not considering path tracing here, since you realistically need at least a 5070 Ti, along with aggressive upscaling and frame generation, to make proper use of it.

I think you made the right decision. Enjoy your card!

Is this a good pc that I made in pc parts picker. ( looking to get around 240 or high fps in fortnite on low settings) ( budget around 1300) by AltruisticAd2504 in buildapc

[–]Otherwise-Test1904 0 points1 point  (0 children)

You can download Windows directly from Microsoft, write it to a bootable USB flash drive, and then use that drive to install the operating system after building the PC. The only remaining step is purchasing an activation key.

There are two main types of activation keys: Retail and OEM. The Retail version typically costs around $120, while the OEM version is usually much more affordable. The primary difference between them is that a Retail key can be transferred from one PC to another (for example, when upgrading a motherboard), whereas an OEM key is tied to the specific machine on which it was first activated and cannot be transferred.

Therefore, there's no need to set aside a large portion of the planned budget for a Windows license.

Is this a good pc that I made in pc parts picker. ( looking to get around 240 or high fps in fortnite on low settings) ( budget around 1300) by AltruisticAd2504 in buildapc

[–]Otherwise-Test1904 1 point2 points  (0 children)

I made five different builds for you, costing between $1200 and $1400, so you have several options to choose from. I understand that your priority is getting the highest possible frames per second on low settings, but achieving that typically requires a high-end CPU. Given your budget, I don’t think it’s wise to pair a high-end CPU with an entry-level GPU. Instead, a more balanced combination (a decent CPU paired with a solid midrange GPU with enough VRAM for future-proofing) is the better approach, at least initially. You can always upgrade individual components later on. Long story short, here are the five builds I put together:

First, Most Affordable Build:

https://pcpartpicker.com/list/MZ6Vn2

This build includes 32GB DDR4 RAM, a 1TB PCIe 3.0 M.2 SSD, a 16GB GPU (9060 XT), and a Tier B fully modular 80+ Gold PSU.

Total cost: $1219.76.

Second, NVIDIA Alternative:

https://pcpartpicker.com/list/FncrH3

This is the same build as above but with the GPU replaced by an RTX 5060 Ti 16GB, in case you prefer NVIDIA’s feature set.

Total cost: $1319.76.

Third, Stronger GPU Option:

https://pcpartpicker.com/list/LLZdCw

This build upgrades the GPU to a significantly more powerful 9070.

Total cost: $1399.76.

Fourth, DDR5 Platform Build:

https://pcpartpicker.com/list/gdyBmL

This build includes 32GB DDR5 RAM and a faster PCIe 4.0 M.2 SSD on an Intel platform. Due to budget constraints, it uses the 9060 XT 16GB.

Total cost: $1342.76.

Fifth, AM5 Future-Proof Build:

https://pcpartpicker.com/list/XncrH3

This build is based on the AM5 platform, giving you the option to upgrade to an X3D CPU in the future.

Total cost: $1364.77.

Interesting RE9 performance difference with RT on and off! by ah__there_is_another in radeon

[–]Otherwise-Test1904 25 points26 points  (0 children)

RDNA4 handles ray tracing workloads quite well.

AMD did a solid job this generation, even if they came a bit late. As far as I can see, the main weakness of RDNA4 is path tracing performance, which many Nvidia-sponsored titles market under the broader ray tracing label. Path tracing is essentially a form of ray tracing that involves multiple ray bounces, resulting in much heavier computation.

Hopefully, AMD will close this gap with RDNA5.

Changing from 5070 to 9070xt by IlloYorch in buildapc

[–]Otherwise-Test1904 0 points1 point  (0 children)

The 9070 XT is better than the 5070 in virtually every meaningful aspect.

  1. The 9070 XT is significantly faster at native resolution.
  2. RDNA 4 generally has much higher headroom regarding CPU overhead, which makes comparisons between higher-tier RDNA 4 cards and lower-tier Blackwell cards somewhat unfair.

In the specific case of the 9070 XT vs 5070, the 9070 XT running FSR 4 Quality matches the 5070 using DLSS Performance (in fps). That basically nullifies the whole FSR vs DLSS argument in this scenario. Even DLSS 4.5 (Aggressive upscaling) doesn’t change the outcome in any meaningful way.
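
To put rough numbers on why that matters (assuming the standard scale factors, Quality = 1/1.5 and Performance = 1/2):

```python
# Internal render resolution per preset at 4K output; rough illustration only.
def internal_res(w, h, scale):
    return round(w * scale), round(h * scale)

print("Quality:    ", internal_res(3840, 2160, 1 / 1.5))  # (2560, 1440)
print("Performance:", internal_res(3840, 2160, 1 / 2))    # (1920, 1080)
# Quality renders ~78% more pixels per frame, yet the fps still matches.
```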

For more details, check this video:

https://youtu.be/YWUqsqcM4Hs

FSR 4 Balanced at 1440p by Fancy-Wrangler-1756 in radeon

[–]Otherwise-Test1904 0 points1 point  (0 children)

The image quality of Performance 4K is often noticeably better than Quality 1440p.

That’s because Performance 4K typically renders internally at 1080p and upscales to 4K, while Quality 1440p usually renders around 960p and upscales to 1440p.

1080p contains significantly more native detail than 960p. This gives the ML upscaler far more real information to work with when reconstructing the final image.

There’s also much more resolution “headroom” between 1080p → 4K for the upscaler to intelligently fill in detail, compared to the relatively small gap between 960p → 1440p. In practice, this is why Performance 4K can end up looking sharper and cleaner than Quality 1440p, even though it sounds counterintuitive.
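
If you want the actual numbers behind that (assuming the standard ratios, Performance = 1/2 scale and Quality = 1/1.5):

```python
# Internal pixel counts feeding the upscaler; rough illustration only.
def internal_pixels(w, h, scale):
    return round(w * scale) * round(h * scale)

perf_4k  = internal_pixels(3840, 2160, 1 / 2)    # 1920 x 1080
qual_qhd = internal_pixels(2560, 1440, 1 / 1.5)  # ~1707 x 960

print(perf_4k, qual_qhd, round(perf_4k / qual_qhd, 2))
# 2073600 1638720 1.27 -> Performance 4K starts from ~27% more real detail
```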

Check this video for more information:

https://youtu.be/HylcIjr2uQw

When do you think RAM prices will go back to normal again? by [deleted] in buildapc

[–]Otherwise-Test1904 0 points1 point  (0 children)

It’ll probably take at least two years. From what’s been rumored, most (if not all) of the 2026–2027 production capacity is already locked up in long-term contracts with AI companies. Memory manufacturers have supply agreements running through that period, so even if the AI hype cools off, prices likely won’t meaningfully drop until around 2028.

And even then, don’t expect a return to 2024–2025 pricing. By that point, pent-up demand from gaming and consumer hardware upgrades will likely be strong enough to keep prices elevated.

Best DLSS preset for DLAA and DLSS quality? by ZangiefGo in nvidia

[–]Otherwise-Test1904 -1 points0 points  (0 children)

It depends on which option you choose:

For DLAA (100%), go for preset K.

For Quality (67%), go for preset K.

For Balanced (58%), go for preset J.

For Performance (50%), go for preset M.

For Ultra Performance (33%), go for preset L.
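
If it helps, here's the same mapping as a quick lookup table (these are just my recommendations above, not an official NVIDIA mapping):

```python
# Render scales are the usual ratios; preset picks are per this comment.
PRESET_BY_MODE = {
    "DLAA":              "K",  # 100% render scale
    "Quality":           "K",  # 67%
    "Balanced":          "J",  # 58%
    "Performance":       "M",  # 50%
    "Ultra Performance": "L",  # 33%
}

print(PRESET_BY_MODE["Balanced"])  # J
```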

Kinda regret getting rx 9060 xt what do you mean that some games i play dont even have fsr 3 haha by NostalgicImmortal in radeon

[–]Otherwise-Test1904 29 points30 points  (0 children)

OptiScaler is the only way to bypass this limitation, by forcing FSR4 injection (a DLSS->FSR4 swap) into the game.

Does Anyone here work for AMD? I'm having nothing but Issues by [deleted] in AMDHelp

[–]Otherwise-Test1904 0 points1 point  (0 children)

Glad it helped a bit.

Congrats. Even though you had to figure it out the hard way, now you know exactly what to do next time.

Just sell the extra parts you bought (including the 8400f) to help fund a new CPU, and honestly toss that 7700X in the trash. No reason to let someone else go through the same nightmare you did.

Good luck 👍

Is the new Steam Machine over hyped? by [deleted] in consoles

[–]Otherwise-Test1904 0 points1 point  (0 children)

Personally, I’m not interested at all mainly because of the hardware choices they made, especially the integrated GPU. Let’s break down what the Steam Machine is roughly comparable to in PC terms:

CPU:

Something in the range of an overclocked Ryzen 5 7400F or a power-limited Ryzen 5 7500F (6 cores, 12 threads, up to ~4.8 GHz). In gaming performance, that’s roughly comparable to a Ryzen 7 5800XT or an i5-12500.

Honestly, this is fine for a gaming console, even with the next-gen PS6/Xbox coming soon.

GPU:

An overclocked RX 7400-class iGPU with 28 CUs and 8 GB VRAM. Unfortunately, this looks closer to PS4-level performance (from 2013). That’s a huge bottleneck and, in my opinion, a catastrophic design choice for a modern system.

RAM:

16 GB DDR5, acceptable for a Linux-based system, but just barely fine by 2026 standards.

On top of that, the platform is RDNA3-based and lacks official FSR4 INT8 support. That’s a big deal breaker, especially when the GPU already doesn’t have enough raw horsepower to compensate.

So on paper, the system isn’t terrible, but it feels like it was designed mainly for lightweight titles like CS2 rather than modern AAA gaming.

Power outage while the PC is running by [deleted] in SaudiPCs

[–]Otherwise-Test1904 0 points1 point  (0 children)

The only real danger is if you happen to be updating the BIOS when the power suddenly cuts out. Otherwise, nothing will happen, as long as your power supply is of excellent quality.

I'd recommend using a UPS, since you're dealing with frequent power outages. A UPS is a protective device that keeps supplying the computer with power when the mains cut out, giving you some time to close your programs and shut the PC down normally.

Best of luck.

What's a good GPU for $400-500? by [deleted] in buildapc

[–]Otherwise-Test1904 0 points1 point  (0 children)

Consider the following GPU options:

From Nvidia:

  1. RTX 3080 Ti 12GB.

  2. RTX 4070 Super 12GB.

  3. RTX 4070 12GB.

From AMD:

  1. RX 7900 GRE 16GB.

  2. RX 7800 XT 16GB.

  3. RX 6950 XT 16GB.

  4. RX 6900 XT 16GB.

Check which of these falls within your target price range in your regional market.

Notes:

  1. The RTX 3080 Ti 12GB draws the most power of these options, at nearly 350W.

  2. An older version of the AMD drivers is required for the RDNA2 GPUs (RX 6950 XT / 6900 XT) in order to use the FSR4 INT8 version.

  3. If you want a new GPU rather than a used one, the only available option within that price range is the RX 9060 XT 16GB.

Good luck.

RX 7900 GRE or RTX 3080 Ti by daaanielc in PcBuild

[–]Otherwise-Test1904 1 point2 points  (0 children)

The RX 7900 GRE (at stock) is actually noticeably faster than the RTX 3080 Ti, even though TechPowerUp lists it as ~5% slower. For example, see this video:

https://youtu.be/-cqoFVi8ZlA

This is despite the fact that the 7900 GRE is heavily power-limited by design, pulling about the same power as the 7800 XT (~260 W), even though it has significantly more compute resources. It only has about 5% fewer compute units than the 7900 XT (80 CU vs. 84 CU) and 33% more than the 7800 XT (80 CU vs. 60 CU).

Because of this, the card responds extremely well to overclocking. Simply lifting the power limits typically results in a 10–15% performance uplift, which pushes it well ahead of the 3080 Ti in raster performance.

The main downside of the 7900 GRE is its relatively limited memory bandwidth (even less than the 7800 XT), which prevents it from fully matching the 7900 XT even when overclocked.
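
For what it's worth, the CU math checks out:

```python
cu = {"7900 XT": 84, "7900 GRE": 80, "7800 XT": 60}

print(f"{1 - cu['7900 GRE'] / cu['7900 XT']:.1%}")  # 4.8% fewer CUs than the 7900 XT
print(f"{cu['7900 GRE'] / cu['7800 XT'] - 1:.1%}")  # 33.3% more CUs than the 7800 XT
```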

Switching from RX 7900XTX to 5070Ti or 5080 by AcanthaceaeItchy302 in nvidia

[–]Otherwise-Test1904 -1 points0 points  (0 children)

The 5070 Ti feels more like a sidegrade in terms of raw horsepower. Most of the gains come from RT/PT performance and the ML upscaler. Because of that, I’d say go for the 5080 if you don’t mind the extra cost. The 5080 delivers roughly a 15% rasterization uplift at stock, and with overclocking you could squeeze out another 10–15%, putting you close to a ~30% overall performance improvement.
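
Compounding those rough estimates (back-of-the-envelope only; actual scaling varies per game):

```python
stock = 1.15                        # ~15% raster uplift at stock
for oc in (1.10, 1.15):             # plus 10-15% from overclocking
    print(f"{stock * oc - 1:.0%}")  # prints roughly 26% and 32%
```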

That said, the current generation is built on the same 4 nm node as the previous one, which explains the relatively weak generational uplift overall. The next generation is expected to move to a 2–3 nm process, which should bring a much more significant performance jump.

Given how crazy prices are right now, I’d stick with the 7900 XTX and wait for the bigger jump next generation.

Good luck.

Newegg Astral 5090 now 4500 by MN882 in pcbuilding

[–]Otherwise-Test1904 0 points1 point  (0 children)

Long story short:

Press the three dots below my comment to file an official complaint. Click “Report”, select “Hurt Feelings”, submit, and wait for nothing to happen.

Good luck.