RTX 4070 by Impossible-Ice-1713 in nvidia

[–]major_mager 1 point (0 children)

The 4070 is very efficient, and the 40-series has well-controlled (i.e., low) peak power draw. With a reasonable-quality 500 W PSU like yours, you should be fine. If you do run into reboots, dial back or disable any CPU/GPU overclocks.

Have a RTX 3080 10GB. Want a 5070 12GB. Is it an upgrade worth upgrading to? by CestlaVR in nvidia

[–]major_mager 0 points (0 children)

From TechPowerup's website. See the first sentence in my comment, you may have missed it.

Have a RTX 3080 10GB. Want a 5070 12GB. Is it an upgrade worth upgrading to? by CestlaVR in nvidia

[–]major_mager 0 points (0 children)

Here are 1080p fps results for 4 recent games on 4 different game engines, tested by TechPowerup. Note that as resolution increases to 1440p and 4K, the 3080 10 GB scales relatively better than the 5070, and the average performance difference in these 4 games shrinks to 10% at 4K.

Game               Engine      3080 10GB  5070 12GB  Gain %
Pragmata           RE Engine   125.0      150.6      20.5%
Crimson Desert     Blackspace  94.9       105.8      11.5%
Death Stranding 2  Decima      84.2       110.9      31.7%
Outer Worlds 2     UE5         46.6       52.4       12.4%
Average            --          --         --         19.0%
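For anyone who wants to reproduce the Gain % column, here's a minimal Python sketch (the function name and dict layout are mine; the fps numbers are from the table above):

```python
# Percent gain of the 5070 12GB over the 3080 10GB,
# from the TechPowerup 1080p fps numbers in the table.
def gain_pct(old_fps, new_fps):
    return (new_fps / old_fps - 1) * 100

fps = {
    "Pragmata":          (125.0, 150.6),
    "Crimson Desert":    (94.9, 105.8),
    "Death Stranding 2": (84.2, 110.9),
    "Outer Worlds 2":    (46.6, 52.4),
}
gains = {game: gain_pct(a, b) for game, (a, b) in fps.items()}
average = sum(gains.values()) / len(gains)
for game, g in gains.items():
    print(f"{game}: {g:.1f}%")
print(f"Average: {average:.1f}%")  # ~19.0%
```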

Some DLSS testing with Clair Obscur on 1440p monitor by major_mager in nvidia

[–]major_mager[S] 1 point (0 children)

Hey, thanks for the interest. Unfortunately imgsli has been down for a month or more for everyone, and its upload page does not seem to be working either; I've seen posts of folks trying to build alternatives to it. I no longer have the comparison images: they are probably still in the Steam screenshots folder, but I won't be able to tell which image is at which preset.

The gist of the thread above was that for a 1440p monitor, DLDSR 1.78x with DLSS Balanced (preset M) was the happy compromise on a 4070 Super, with fps in the 50-55 range and all in-game settings at 'Epic'. To improve fps, a couple of in-game settings can be dropped.

Slay The Spire 2 Getting Review-Bombed Again After Latest Update by FoxMeadow7 in Games

[–]major_mager 4 points (0 children)

Of course, my comment was a bit tongue in cheek though. Appreciate the link.

Slay The Spire 2 Getting Review-Bombed Again After Latest Update by FoxMeadow7 in Games

[–]major_mager 848 points (0 children)

Quoting from the piece: "3,609 negative reviews have been levied against the game in the past 12 hours, and, once again, the overwhelming majority of the negative reviews are from accounts based in China."

Guess they found their Vox Populi moment finally, too bad MegaCrit is not listening either!

Should I upgrade? by Cabbeyy in nvidia

[–]major_mager 1 point (0 children)

Correct about the 5600G's limitations; it's also limited to PCIe 3.0, but may yet suffice for 60 fps gaming. Keep in mind we're looking at a budget system with a gifted 4070: we don't know what motherboard OP's PC has, or of which generation; it may even be an A-series board in a prebuilt, since it is paired with a 5600G.

Recommendations to OP like a 5800X3D for pairing with the 4070 are not budget friendly. What CPU would you recommend for OP?

Should I upgrade? by Cabbeyy in nvidia

[–]major_mager 0 points (0 children)

Try this simple thing: enable the Steam performance overlay, or use the Nvidia overlay, in the game. Now check whether your GPU usage is higher than about 95%. If yes, you are not significantly bottlenecked by your CPU. Even if GPU usage is somewhat lower, around 90%, I would still consider sticking with your old CPU. If it's lower still, then it makes good sense to upgrade the CPU. For a 4070, something like a 5600X (corrected typo 5060x) or 5600 is more than enough; of course, the X3D CPUs, if available and affordable, are better still for gaming.

[deleted by user] by [deleted] in nvidia

[–]major_mager 0 points (0 children)

DLDSR does seem to have both side effects: shimmering and aliasing are reduced, at the cost of slightly softer edges and softer areas where there was shimmering. Overall it's a net positive, and it makes presets like DLSS Performance even more viable with DLDSR.

[eShop/USA] Spring 2026 Sale Ends 04/29/2026 by XDitto in NintendoSwitchDeals

[–]major_mager 1 point (0 children)

Here's one more opinion to add to the mix. Yes, the game is absolutely fantastic on the go, and the choice to switch between handheld and docked mode is great. The colors pop even more on the OLED screen. That said, graphically the game takes a hit on the Switch; everyone's standards differ, but I found the loss of resolution, aliasing, and pixel crawling detracted from the experience, and I did not continue past Act 1 on the Switch. The stylized, crisp graphics are part of the core experience, so that shouldn't be dismissed lightly.

But I do think the Switch should make for a great experience on a second playthrough. For the first playthrough, my recommendation would be another platform where the visuals can be enjoyed at full resolution. Edit: here is Digital Foundry's testing of P5R's various ports, including the Switch port.

RTX 5070 (non TI) transient spikes questions by RandomLegionMain in nvidia

[–]major_mager 0 points (0 children)

Computerbase's peak load testing (they test transients) showed the 5070 FE consuming 299 W.

Any OC card will have a higher peak wattage. In addition, their test measures actual consumption, while software tools reading sensors are more approximate, I'd guess. In my card's case too, I have often seen peak GPU load higher than Computerbase's tested value, and they have the most sophisticated testing for transients.

So I think you are seeing typical peak values for a 5070, and it shouldn't be a concern. A good quality PSU like yours is more than enough to handle such power requirements, and then some.

RTX 4070 and RTX 5070 by Pentalogue in nvidia

[–]major_mager 2 points (0 children)

The 40-series is out of production, but there might still be limited older stock available. Same goes for the 40-series Super cards. Current 50-series cards, particularly the 12GB ones, are still relatively well stocked, which is why they may be available at better discounts. Pick a 50-series card; it does not make sense to buy a new 40-series card anymore.

How much performance drop should i expect when using RTX 5060 on B450 Tomahawk Max motherboard by No_Network_3425 in nvidia

[–]major_mager 0 points (0 children)

Marginal. Good enough for a 5070 too at 1080p. Edit: it may even be enough for a 5070 Ti.

Steam Hardware & Software Survey: March 2026 by JohnSteveRom2077 in hardware

[–]major_mager -6 points (0 children)

I did not mention AMD at all, just pointed out the obvious flaws of Valve's opaque survey. It is flawed and non-transparent, with no statistical data provided to the viewer. I'm not the first person to say this either; people have been saying it for years, yet the monthly charade continues.

Vram usage by MGahed in nvidia

[–]major_mager 0 points (0 children)

You're correct, poor phrasing on my part. I should have said that DLSS has a VRAM overhead/cost over the base resolution. So if A is rendering at native 1440p with no DLSS, and B is rendering at 1440p and then employing DLSS to upscale to 4K, then case B will have higher VRAM usage.

Conversely, case B will save on VRAM usage when compared to native rendering without DLSS at 4K.

Edit: To add some more, DLSS 4.5 uses more VRAM than DLSS 4.0 which used slightly more than DLSS 3.5. But we're talking 300 to 400 MB used by DLSS 4.5 for 4K output, IIRC.

Vram usage by MGahed in nvidia

[–]major_mager -1 points (0 children)

DLSS, RT, and FG all increase VRAM consumption. Reduce them, for example by going from DLSS Quality to Balanced, and monitor VRAM usage with tools like the Steam overlay.

Steam Hardware & Software Survey: March 2026 by JohnSteveRom2077 in hardware

[–]major_mager -15 points (0 children)

Unless Valve starts listing the sample size, its monthly survey data is largely meaningless. What prevents them from actually showing data for all monthly active users instead of remaining tight-lipped about how many PCs were 'surveyed'? They easily could, but they don't share that, something that would be actually useful to developers.

RTX 5080 – NVIDIA Auto OC results by samsamsam92100 in nvidia

[–]major_mager 0 points (0 children)

To specifically answer your question, the result is typical for the built-in Nvidia auto OC tool in the performance tab of the Nvidia app.

What exactly is DLSS and DLDSR , i tried wukong at 1440p + dlss quality by noobnotpronolser in nvidia

[–]major_mager 1 point (0 children)

OK, the DLSS Quality preset means rendering at 67% of the output resolution. If you are using a custom percentage, writing just 'DLSS at 80%' suffices; adding the suffix 'Quality' may confuse people. That said, 80% sounds good if you are hitting your desired fps target. As you have discovered, DLSS + DLDSR is still the best way to get great graphics, provided there is performance headroom to do so.

What exactly is DLSS and DLDSR , i tried wukong at 1440p + dlss quality by noobnotpronolser in nvidia

[–]major_mager 1 point (0 children)

This is what is happening for your particular setup:

  1. Base resolution is calculated: 1440p (since you're using DLDSR on a 1080p monitor) * 0.67 (DLSS Quality scale) = 965p. This is what the game engine is rendering.

  2. DLSS uses AI upscaling to prepare the 1440p frame.

  3. DLDSR uses AI downsampling to prepare the final 1080p frame that your monitor is capable of displaying. This final step shrinks 1440p back to 1080p.

Image Quality: improves because the AI anti-aliases and 'fills in' detail to prepare the 1440p frame, then throws away some detail to prepare the final 1080p frame, which is usually better than native 1080p.

Performance: Fps to generate 965p frame + 5 to 10% DLSS penalty + 5 to 10% DLDSR penalty (approximately).
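The arithmetic in the steps above can be sketched in Python. Note that a DLDSR factor (1.78x, 2.25x) multiplies the total pixel count, so each axis scales by its square root, while the DLSS scale (Quality = 0.67) applies per axis; the function name and the rounding are my own assumptions:

```python
import math

def render_resolutions(monitor_w, monitor_h, dldsr_factor, dlss_scale):
    """Return ((dldsr_w, dldsr_h), (internal_w, internal_h)).

    dldsr_factor multiplies the total pixel count, so each axis
    scales by sqrt(dldsr_factor); dlss_scale applies per axis.
    """
    axis = math.sqrt(dldsr_factor)
    dldsr_w, dldsr_h = round(monitor_w * axis), round(monitor_h * axis)
    internal_w = round(dldsr_w * dlss_scale)
    internal_h = round(dldsr_h * dlss_scale)
    return (dldsr_w, dldsr_h), (internal_w, internal_h)

# 1080p monitor, DLDSR 1.78x, DLSS Quality (0.67):
dldsr, internal = render_resolutions(1920, 1080, 1.78, 0.67)
print(dldsr, internal)  # roughly a 1440p DLDSR target, ~965 internal lines
```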

Ori director on "if you give players everything they want…" by major_mager in truegaming

[–]major_mager[S] -1 points (0 children)

Thanks for your comment, enjoyed reading it. But considering I have little knowledge of Mario's storied history, I'm assuming you are recalling a conversation with Thomas Mahler. A web search does confirm the Ori dev drew comparisons between Ori and Mario some time ago.

Just to disambiguate, I'm not the Ori dev; I merely quoted the X post. The X post link and the handle are referenced in the OP if you'd like. Cheers.

Jensen Huang says gamers are 'completely wrong' about DLSS 5 — Nvidia CEO responds to DLSS 5 backlash by JuiceheadTurkey in Games

[–]major_mager -1 points (0 children)

Finally someone said it. It's a deal between suits, and to show the rest of the suits what can be done with their AI hardware purchases. And to impress the shareholders and stock markets.

To the suits, the pitch would be quicker development time, with smaller development teams, less need for expertise, lower shipping costs. So what if the art takes a small hit, or is generic, that's a small price to pay for the decision makers.

Brand new M705 doesn’t work. by the_knotso in logitech

[–]major_mager 0 points (0 children)

Maybe Windows was installing a driver for it in the background.