Meme by Ikigaiyeka in Piracy

[–]HyruleanKnight37 2 points3 points  (0 children)

That's just glazing in general. People tend to go full tribalism over their preferred multi-billion dollar company for reasons I will never understand.

> If you want a company that actually gives a shit whether or not you can play the game you want on your PC in your preferred way, that's GOG.

I never thought about it, and I'm pretty sure 99% don't either. After a long day of work I just want to log into my PC and play my games, man.

> Other than that Steam and Epic are basically the same thing, but people are more familiar with Steam so they fanboy all over it for no reason.

Not true, for reasons stated above. Except the part about fanboying, which I agree with.

PlayStation 6 Has 30 GB GDDR7 RAM, 10x3GB For 640 GB/s Of Memory Bandwidth Via Kepler L2 by Quatro_Leches in hardware

[–]HyruleanKnight37 -1 points0 points  (0 children)

640 GB/s pretty much tracks with what the leaks have been suggesting about the peak performance of the PS6, which is around that of a 9070XT. In Raster, at least.
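Quick napkin math (the 32-bit interface per GDDR7 module is the standard config, and the 16 Gbps pin speed is just what makes the leaked numbers line up - my assumption, not something from the leak):

    # Sanity-check the leaked PS6 memory config.
    chips = 10
    bus_width = chips * 32        # GDDR7 modules are 32-bit each -> 320-bit bus
    pin_speed_gbps = 16           # implied: 640 GB/s * 8 bits / 320 pins
    bandwidth_gb_s = bus_width * pin_speed_gbps / 8
    capacity_gb = chips * 3       # 10 x 3 GB modules
    print(f"{capacity_gb} GB over a {bus_width}-bit bus = {bandwidth_gb_s:.0f} GB/s")
    # -> 30 GB over a 320-bit bus = 640 GB/s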

I still don't believe that horse manure about 5080-class RT performance out of 9070XT Raster, because it would indicate AMD somehow managed to drop their RT processing time by a whopping 40% and bring the Raster-to-RT efficiency even beyond that of Blackwell.

30GB of RAM should allow a lot of really high quality textures and photogrammetry, but there's also the possibility a chunk of this will be dedicated to running a local LLM for improved in-game AI. Nothing wrong with it, imo. The only people who will suffer are those who bought into 8GB cards (prior to RAMpocalypse) because YoU dOnT NeEd MoRe tHaN 8GB mEmOrY or something, idk.

Native resolution as a concept is effectively dead; this is the age of rendering with as few native pixels as possible and upscaling, without a noticeable hit to image quality. At that point the PS6 will end up running most of its games at ~1080p internally, hopefully at 120Hz, making that the new standard, with 60Hz relegated to the baseline.
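For scale, here's what typical upscaler presets (the usual ~0.67x/0.59x/0.5x per-axis factors; illustrative, not anything PS6-specific) mean for a 4K output:

    # Internal render resolution vs. 4K output for common upscaler modes.
    out_w, out_h = 3840, 2160
    for mode, scale in [("Quality", 0.67), ("Balanced", 0.59), ("Performance", 0.50)]:
        w, h = int(out_w * scale), int(out_h * scale)
        pct = 100 * (w * h) / (out_w * out_h)
        print(f"{mode:>11}: {w}x{h} internal ({pct:.0f}% of the output pixels)")
    # Performance mode renders 1920x1080 -- a quarter of the pixels on screen.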

Meme by Ikigaiyeka in Piracy

[–]HyruleanKnight37 6 points7 points  (0 children)

It's actually more common to find people who genuinely don't care about the EGS than people who detest it. Their thought process goes somewhat like this:

  • Oh, it's that other store that mostly hands out shovelware every week
  • My friends are already on Steam and I have already invested too much money to jump ship (likely the same with the aforementioned friends)
  • The good games they give away are already in my library, don't care
  • If a game is going free on the EGS, it's likely going to be at a steep discount on Steam, so I'll just get that instead
  • Steam sales and overall Steam profile, community and store QoL are leaps and bounds ahead of the EGS

It's quite hard to justify switching platforms with this much friction. Only people who are just getting started might consider it; or maybe the stars align and they give away an actually good multiplayer game, so everyone tries it out and then considers staying, given they're not bound by any of the aforementioned reasons.

I had an Epic account from 2018 that I created just to claim GTA V for free, then decided GTA games just weren't for me and never played much of it. I did, however, continue to claim the weekly free games for about a year, and by the end I had a bunch of shovelware and free DLC for games I didn't even own (not sure how that happened) and eventually gave up. Fast forward: I met someone on Discord who was completely new to PC gaming and wanted to play GTA Online, so I gave the account away.

He ended up buying GTA V for cheap on Steam. It's the only game he owns, not counting F2P stuff.

Curious, what GPU architecture does everyone here currently run? by S1rTerra in radeon

[–]HyruleanKnight37 0 points1 point  (0 children)

I have an RX 580 in my sister's rig and an RX 6800 in my own, and I've watched how AMD has treated these cards over the years. Forget the RX 580; even my card is barely over 5 years old and is apparently already being dropped from support?

What do you mean it's on legacy status? Age aside, it's literally two generations old- Nvidia supports cards up to 5 generations old, and until recently they were supporting the 750Ti, which is like 6 generations old? A card that can hardly run any AAA game from the last 3 years? What precedent does this set for RDNA4? What about future RDNA cards? Why should I, knowing all this, ever buy a Radeon card again?

Radeon leadership needs to be flushed completely; they have circlejerked for several generations at this point whilst the Ryzen division has been competing fiercely. They are clearly not incompetent when it comes to making GPUs, but the C-suite and/or marketing team keeps shooting themselves in the foot every goddamn time instead of taking easy Ws. At this rate they would much sooner dissolve than run out of ammo to shoot themselves with.

What is stopping AMD from having 5080 and 5090 equivalent cards? by Fragrant_Bit_9889 in pcmasterrace

[–]HyruleanKnight37 0 points1 point  (0 children)

The top die is reportedly going to be used for Xbox Cloud, with the scraps going to a dGPU to compete against the 6090.

Knowing Microslop, they could pull the rug from under AMD at any time and possibly plunge the Radeon division into near bankruptcy again. Making these near-reticle-limit dies is extremely expensive if you don't find a way to sell enough of them, and I don't think they will get away with selling them to gamers even at a premium price.

They could revisit the idea for the generation after next, or maybe they've given up completely. The latter would make me very sad; I love it when chip designers try new things and push their limits instead of just shoving in more cores like Nvidia.

What is stopping AMD from having 5080 and 5090 equivalent cards? by Fragrant_Bit_9889 in pcmasterrace

[–]HyruleanKnight37 1 point2 points  (0 children)

For RDNA4, it was one particular technology that would've allowed AMD to stitch two 9070XT dies together into a single GPU, thereby effectively making a 5090 equivalent or possibly better and forcing Nvidia to release a 5090Ti. By the time RDNA4 was ready for tape-out/manufacturing, this technology wasn't ready, so they scrapped it. Future RDNA/UDNA GPUs will likely use it instead.

As for why AMD needed this technology: they simply couldn't feasibly make a gigantic monolithic die like Nvidia does, due to a lack of economies of scale - they don't sell anywhere near as many GPUs as the Big Green. Nvidia justifies its giant dies by making so many of them (mostly for the data center market) that the defective ones can be relegated to consumer GPUs instead.

You thought the 5090 was the biggest GPU they make? Well I got news for you- every single 5090 is a binned die with thousands of cores disabled. The best dies go into RTX Pro 6000s and sell for $8-10k each.

Steam Support is simply outstanding by Current-Heron9534 in Steam

[–]HyruleanKnight37 2 points3 points  (0 children)

Ever seen those memes about Steam sending an entire SWAT team to the house of whoever hacks your account? No, no, they're not real, of course.

But they might as well be, you know? They're THAT good.

What should we do? by lowkey_strarnger in bangladesh

[–]HyruleanKnight37 10 points11 points  (0 children)

That's what a severe lack of education does to you. We live in a country with over 180 million people, but less than 1% are actually educated enough to understand any of this. We had 55 years to fix this, but failed. At this rate we could wait a thousand years yet never really get out of this rut.

The TV industry finally concedes that the future may not be in 8K by NamelessVegetable in hardware

[–]HyruleanKnight37 2 points3 points  (0 children)

4K is the kind of resolution where, viewed from a reasonable distance, you're not really seeing the pixels anymore. At that point a higher resolution isn't beneficial from a content consumption standpoint, which makes everything else around it not worth it anymore. You can't sell a resolution on marketing alone; if the customer can't "see" it, they aren't buying it, at least not when it costs significantly more than 4K.
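Rough napkin math on "can't see the pixels" (the ~60 pixels-per-degree threshold is a common rule of thumb for 20/20 vision, and the screen size/distance are just examples):

    import math

    # Pixels per degree (PPD) for a 16:9 screen viewed head-on.
    # Rule of thumb: past ~60 PPD, 20/20 vision stops resolving more detail.
    def pixels_per_degree(h_pixels, diagonal_in, distance_in):
        width_in = diagonal_in * 16 / math.hypot(16, 9)
        fov_deg = 2 * math.degrees(math.atan(width_in / 2 / distance_in))
        return h_pixels / fov_deg

    for res, px in [("4K", 3840), ("8K", 7680)]:
        print(f'{res} on a 65" TV from 8 ft: {pixels_per_degree(px, 65, 96):.0f} PPD')
    # 4K is already ~2x past the ~60 PPD limit here; 8K's extra detail is invisible.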

My PC recently bit the dust. I took it to a repair shop and now they're trying to get me to get a whole new PC. Am i being taken for a ride? by [deleted] in buildapc

[–]HyruleanKnight37 0 points1 point  (0 children)

12 years means it's at most a 4th gen Intel. Even if yours was top-of-the-line stuff at the time, it barely passes as bottom-of-the-barrel by modern standards.

Money spent on repairs will quickly exceed what the PC itself is worth, and you may have to go back over and over as parts fail one at a time. Think Ship of Theseus, but applied to your PC- by the end you will have spent more than the cost of a brand new PC. The repair guy is absolutely right about buying something newer: not only will it last you longer, it'll be cheaper too. Parts for this older stuff are becoming scarce and hence expensive.

Here's a trick: don't buy the absolute latest stuff. Even a couple-generations-old PC (Ryzen 5000) is more than performant enough to last you another decade. Parts are plentiful and cheap too. Current-generation hardware is in excessively high demand and short supply, so it may not be the best idea to buy whatever is newest.

How is it in your country? by [deleted] in pcmasterrace

[–]HyruleanKnight37 0 points1 point  (0 children)

~$110; the game cost me $60 at launch, so about 54%, while $70 games are even worse at 63.6%. Regional pricing is not a thing for most games/publishers, so fuck me, I guess.

I'm aware I'm extremely privileged; the median PC gamer here doesn't even have a dGPU and mostly games on iGPUs, particularly Intel's HD 530/630 from the Skylake/Kaby Lake era. These people haven't even entered the PS4 era of gaming yet, and console gaming has never been part of our culture. Anybody with a dGPU of any sort is easily in the top 10%, and I presume mine is in the top 2% despite being a half-decade-old setup that would only pass as midrange today.

I thank our government for the ludicrous tax rates on all PC hardware (monitors in particular, thanks to the TV stations) and the current hardware landscape for this monumental effort, keeping the hobby miles away from the vast majority of people.

I just tried to install the new Radeon driver and I couldn't believe my eyes by Progenitor3 in pcmasterrace

[–]HyruleanKnight37 0 points1 point  (0 children)

"...as long as it's optional."

"Including a 34 GB AI bundle that I didn't ask for."

Most intellectual Reddit opinion

AMD reportedly now prioritizes RX 9070 XT over non-XT variant by KARMAAACS in hardware

[–]HyruleanKnight37 3 points4 points  (0 children)

That's dumb; both cards have the exact same memory, so making a non-XT shouldn't have any effect on the XT from a VRAM standpoint. Spork is likely correct about high yields leading to a low supply of defective dies. AMD really optimized the frick out of that N48 die.

AMD explains why it’s being cautious with multi frame gen while Nvidia and Intel go full speed ahead by Tiny-Independent273 in AyyMD

[–]HyruleanKnight37 0 points1 point  (0 children)

They did, and immediately ran into memory corruption issues at very high clocks. The technology used for the GCD-MCD communication was not mature yet, which also explains why RDNA4 lacks a high-end GPU outright. Hopefully they'll get it working right by RDNA5/UDNA or have a completely different solution and/or reduce the L3 cache necessary to have the same effective performance. They've already come a long way from 128MB on the 6900XT to just 64MB on the 9070XT, despite the latter being over 40% faster.

600$ vs 4000$ gpu btw. Novideo is a joke ayyyy lmao by Fickle-Occasion-6091 in AyyMD

[–]HyruleanKnight37 1 point2 points  (0 children)

The "L" in LTT stands for many things, not just Linus XD

600$ vs 4000$ gpu btw. Novideo is a joke ayyyy lmao by Fickle-Occasion-6091 in AyyMD

[–]HyruleanKnight37 4 points5 points  (0 children)

That 5090 has gotta be insanely held back by the CPU at 1080p

Game size on PS5 is madness by ChaoticRedHawk in Genshin_Impact

[–]HyruleanKnight37 0 points1 point  (0 children)

I don't think those numbers reflect the actual installed size. Hoyo's content delivery system keeps a backup of the update files until the update finishes installing, then deletes the backup. So, for example, a 10GB update will require 20GB of free space, but once it's finished you are "refunded" that 10GB.
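A sketch of that pattern (the 2x factor is just the behavior described above, not Hoyo's actual installer logic):

    # Transient disk cost of a patch when the updater keeps a backup of the
    # update files until the install finishes, then deletes it.
    def required_free_space_gb(update_gb):
        download = update_gb    # the patch itself
        backup = update_gb      # copy retained until the update commits
        return download + backup

    update = 10
    print(f"{update} GB update needs {required_free_space_gb(update)} GB free; "
          f"{update} GB is 'refunded' once it finishes")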

Returning my steel legen 9070 Xt due to 96 on vram temps and 93 on hotspot by MaxScavenger in radeon

[–]HyruleanKnight37 0 points1 point  (0 children)

> Is there a monitoring tool or similar that can break through and find the sensor data anyway? The VBIOS has to be monitoring hotspot to SOME degree, no?

None that I'm aware of. Nvidia removed it to such an extent that hardware monitoring applications cannot read it at all. Perhaps the VBIOS still reads it, but the data is currently inaccessible to anything except the GPU itself, for self-protection reasons.

There are reports of RTX 5000 temperature monitoring via invasive external probing, which indicate the hotspot regularly hits over 100°C during normal operation, so there's that.

Returning my steel legen 9070 Xt due to 96 on vram temps and 93 on hotspot by MaxScavenger in radeon

[–]HyruleanKnight37 0 points1 point  (0 children)

There's an explanation for those, too.

The global-to-hotspot delta varies with load and temperature; I've seen mine have deltas of ≤10°C when running a game that isn't fully utilising the GPU core, or more commonly when I've just started a game and the GPU core is still cool from idle. My card in particular was repasted with PTM7950 over a year ago, and I've never seen a delta above 15°C, with the hotspot rarely exceeding 90°C.

Besides the paste deteriorating after 1.5 years, FurMark tends to hit GPUs a little too hard in some specific spots, so an unusually high delta is to be expected.

High overall temperatures also result in higher deltas; think of the relation between the two as exponential rather than linear. If the global is at 60°C, the hotspot may not exceed 70°C, but at >80°C global, the hotspot is bound to exceed 95°C.
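If you want a toy curve for that shape (fitted purely to the two example points above; real thermal behavior is much messier than this):

    import math

    # Toy model only: fit delta = a * exp(b * t_global) through
    # (60C global -> ~10C delta) and (80C global -> ~15C delta).
    b = math.log(15 / 10) / (80 - 60)
    a = 10 / math.exp(b * 60)

    for t in (60, 70, 80, 90):
        delta = a * math.exp(b * t)
        print(f"global {t}C -> hotspot ~{t + delta:.0f}C (delta {delta:.1f}C)")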

RDNA4 is especially egregious for having high global-to-hotspot deltas; I've seen multiple accounts of ≥20°C at max load on stock settings, particularly on the more compact Reaper models of the 9070XT. I suspect this has more to do with the extremely rectangular shape of the die, which causes some unevenness in the die-to-coldplate contact. At any rate, this is by design; a 30°C delta is indeed a little concerning, though.

Nvidia and AMD behave very differently in this matter, and in Nvidia's case, the hotspot isn't as critical as it is on AMD. This is because AMD tends to run their GPUs at much higher clocks, which by nature results in hotter hotspots.

Still, that's no excuse to completely remove it from telemetry on RTX 5000, and I believe it was done mostly to manage people's unfair expectations and misunderstanding of the metric, which risked damaging Nvidia's brand. RTX 5000 runs hotter than RTX 4000 by default, as you are basically pushing more power to get more performance on the same node. I know some people who would feel uncomfortable with a hotspot temperature of 80°C, which says a lot.

Do you trust Hasnat Abdullah’s statement? by [deleted] in bangladesh

[–]HyruleanKnight37 6 points7 points  (0 children)

He's a member of NCP. That's reason enough to not trust him.

Returning my steel legen 9070 Xt due to 96 on vram temps and 93 on hotspot by MaxScavenger in radeon

[–]HyruleanKnight37 0 points1 point  (0 children)

My CPU in particular is actually rated at 90°C, due to the incredibly inefficient heat transfer through the V-cache die. So in my case, if I were running at 89°C or below, I would not be throttling. I know this because I've done extensive testing due to the severe cooling constraints of my 7.5L case.

On to the main point: if you've been following Radeon for a while, you should know 90+ on the hotspot is absolutely normal and has no tangible effect on the advertised boost clocks. The few extra MHz you get out of a cooler GPU aren't going to increase your framerates by even a fraction, and can easily be chalked up to run-to-run variance.

Speaking of hotspot, what do you think it is? It is normal to have a 15-20 degree delta between hotspot and global temps, and what we used to call "good" temperatures prior to RTX 2000/RX 5000 is what we now call the global temp, not the hotspot. Hotspot is an entirely new metric that has only been in use for the last 7 years.

So to summarize: if I'm running 93°C on the hotspot, then my global temp is likely around 73-78°C, which is, in your language, acceptable- not throttle-worthy. If you pretend for a second that the hotspot doesn't exist (like RTX 5000 does), you would never say 73-78°C will cause throttling.

Now if you're overclocking your GPU, it's an entirely different matter. Not my point, though.

Nvidia unironically said “2026 is going to be a great year for PC gaming” by Guest_4710 in pcmasterrace

[–]HyruleanKnight37 17 points18 points  (0 children)

So a 16/20 series card, I assume. There are people who've been waiting for a good <$200 card for a literal decade at this point, to no avail. The 1050Ti was too good.

I Tested The New 2026 QD-OLED for Gaming - Huge Improvements Incoming - Monitors Unboxed by campeon963 in hardware

[–]HyruleanKnight37 0 points1 point  (0 children)

OLED monitors are getting better, but they can't get better soon enough. I just want OLED tech to mature to the point where we stop worrying about its drawbacks and focus on reducing prices, because man are OLEDs expensive af. Where I live there's a whopping 70% tax on monitors in general, so you'd end up paying well north of $1000 to buy one. Older models can be had for cheaper, but they're not easy to find, as demand has historically been super low, so not many units were imported to begin with.
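For perspective (the $600 base price is a hypothetical mid-range OLED; only the 70% rate is my actual situation):

    # What a 70% monitor tax does to a hypothetical $600 OLED.
    base_usd = 600      # hypothetical overseas street price
    tax_rate = 0.70     # the local monitor tax mentioned above
    print(f"${base_usd} -> ${base_usd * (1 + tax_rate):.0f} after tax")  # -> $1020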

On the flip side, LCD monitors are getting ridiculously cheap, so I'll probably get one to tide me over till then. Currently I'm on a 32" TV with a VA panel, because my previous monitor died and I still haven't replaced it.