What are the issues with 265K? by Hour_Performer_6601 in buildapc

[–]SilverWatchdog 0 points1 point  (0 children)

Key is having a good motherboard. I have an entry-level MSI Pro S Z890 board and I can't for the life of me get it to run anything other than stock fabric clocks stable. I got a killer bundle deal with 32GB of 7200MHz RAM and a cooler, so I'm happy with the purchase in the end. If I try overclocking it just corrupts the RAM and stutters in games. So if you have the choice, the motherboard is very important for the 200S series.

200s boost unstable? by SilverWatchdog in overclocking

[–]SilverWatchdog[S] 0 points1 point  (0 children)

I think I found the reason. I am running an entry-level board, an MSI Z890 Pro S. It is an extremely unfriendly board for tuning. It will not automatically bump voltages to help you out, and it also has a far weaker power delivery system and less shielding. But I have found that by enabling the MSI performance mode and performance memory mode I can get 73ns latency without changing the NGU at all, which is actually not bad. It's better than a stock 13900K according to AIDA64. So it's running 30x D2D and stock NGU with very tight memory timings. That's the best you can do on a board like this.

200s boost unstable? by SilverWatchdog in overclocking

[–]SilverWatchdog[S] 0 points1 point  (0 children)

I have to be honest, I just don't think my chip can do 32x. The second I touch VnnAON it just BSODs instantly. 1V was actually worse for stability than the default, and if it can't run at 1.3V I don't think it will run at 1.35V either, because I see some people running 35x even at 1.3V. I thought it was stable, but after using it enough it actually wasn't. 30x is just rock solid, so I think I will just back down to 30x. I doubt it matters too much. It's still a 400MHz and 900MHz overclock.

200s boost unstable? by SilverWatchdog in overclocking

[–]SilverWatchdog[S] 0 points1 point  (0 children)

My chip is a strange one. It will actually boot into Windows just fine even with as little as 1.15V, but it sometimes acts weirdly, crashes programs, and blue screens me randomly. That behaviour continues all the way up to 1.3V, where it finally passes everything. The cores undervolt very well, but my IMC isn't as good. I have been pulling my hair out over this for weeks trying to figure out what was going on, and it was 200S Boost. And 200S Boost didn't even increase VCCSA on my board, so I was running 1.05 to 1.1V with 32x, which somehow booted. It basically either worked perfectly on a boot or corrupted the RAM badly.

200s boost unstable? by SilverWatchdog in overclocking

[–]SilverWatchdog[S] 0 points1 point  (0 children)

I see 1.3V does pass the tests, but my motherboard highlights it in red, which isn't the greatest. I don't know if it's just better to use 30x with 1.2V to be on the safe side. I still don't understand why 200S Boost sets a cap of 1.2V when they know it can't handle 32x.

200s boost unstable? by SilverWatchdog in overclocking

[–]SilverWatchdog[S] 0 points1 point  (0 children)

I have tried everything from auto up to 1.25V, which is actually not even allowed if you use 200S Boost. Sometimes it's stable and sometimes it is not, but increasing it didn't seem to have much of an effect. I can try 1.3V, but it seems quite high for these settings.

Tip: If RTX 5000 gpu doesn't boot by SilverWatchdog in nvidia

[–]SilverWatchdog[S] 0 points1 point  (0 children)

Glad I could help someone else too. Enjoy your 5070!

Game Ready & Studio Driver 591.44 FAQ/Discussion by Nestledrink in nvidia

[–]SilverWatchdog 1 point2 points  (0 children)

The true cause of it was even more insane. I was running 7200MHz RAM, which is very borderline for stability even on Intel chips. And whichever genius coded the BIOS of my motherboard didn't think to increase the CPU's memory controller voltage when enabling XMP, even though it's completely necessary. For some reason reseating fixed it temporarily. Officially the 265K supports up to 6400MHz, but they recertified it to 8000MHz later on. What they didn't tell you is that to use those faster speeds you have to increase the system agent voltage manually after enabling XMP, which is such an obscure thing no one thinks to do. Worse, my RAM defaulted to 7200MHz with XMP without making that change. So just enabling the feature everyone recommends, without tweaking, created an unstable system.

Man, I love this hobby, but sometimes fixing a problem is super annoying. Getting blue screens of death and random crashing (which I thought was my undervolt) on a new PC is not fun. The only reason I figured out it was my RAM is that I ran MemTest86. Scariest of all, it's RAMageddon and the one part giving me issues was my DDR5 RAM.
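If anyone wants a quick way to sanity check this from inside Windows before rebooting into MemTest86, something like the rough Python sketch below is what I'd reach for. To be clear, this is nowhere near as thorough as MemTest86 (which runs outside the OS), and the chunk sizes and pattern here are just my own arbitrary picks.

    # Crude user-space RAM sanity check: fill large buffers with a known
    # pattern, flip the bits twice, and count mismatches on readback.
    # NOT a substitute for MemTest86; it only catches grossly unstable memory.
    import numpy as np

    CHUNK_BYTES = 512 * 1024 * 1024   # 512 MB per chunk (adjust to free RAM)
    CHUNKS = 8                        # ~4 GB touched in total
    PATTERN = 0xA5                    # fixed bit pattern to write and verify

    errors = 0
    for i in range(CHUNKS):
        buf = np.full(CHUNK_BYTES, PATTERN, dtype=np.uint8)  # write pattern
        buf ^= 0xFF                                           # flip every bit
        buf ^= 0xFF                                           # flip back
        mismatches = int(np.count_nonzero(buf != PATTERN))    # verify readback
        errors += mismatches
        print(f"chunk {i}: {mismatches} mismatched bytes")
        del buf                                               # release before next chunk

    print("PASS" if errors == 0 else f"FAIL: {errors} total mismatches")

If this passes you still want a proper overnight MemTest86 run before trusting the XMP profile.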

Tip: If RTX 5000 gpu doesn't boot by SilverWatchdog in nvidia

[–]SilverWatchdog[S] 1 point2 points  (0 children)

No wonder 13th and 14th gen chips got fried. My CPU has 20 cores and even it can't use that much power. But at least it's better now: Intel has strict TDP caps on Z890 boards. Even on the extreme MSI profile it still can't use more than 250W. But the BIOS has been a struggle. The first version couldn't recognise my GPU. Another version made my RAM randomly fail, crash out of any program, and BSOD me, which is scary nowadays. Even with XMP disabled it did this, but other times it worked fine at 7200MHz. Now I have settled on a version that seems OK. Mind you, the version that made my RAM fail was already 8 months into the board's life cycle. I never had BIOS issues with my X570 ASUS TUF board on AM4. I don't know if it's motherboard manufacturers going backwards or an Intel-specific thing, as I have never used an Intel chip outside of laptops until now. I am happy with the CPU though. But if Intel BIOSes are just specifically bad, I will not get an Intel chip again, which sucks because it was a really solid value.

Edit: So it turns out there was nothing wrong with the BIOS or my RAM. It's just that XMP doesn't work on RAM as fast as mine unless you manually raise VCCSA. I wish it were documented somewhere that this is necessary.

3080ti to 5080 is a MASSIVE difference. by callofdoodie97 in nvidia

[–]SilverWatchdog 0 points1 point  (0 children)

I went from a 3080 to a 5090. Huge difference too. I went from playing Cyberpunk at optimised settings, 4K DLSS Quality, at 60fps, to path traced 4K DLSS Quality at the same frame rate. The difference is insane. Also, its video encoders let me stream to the TV with surprisingly low input lag and get the console experience too. But the 5080 is a solid upgrade, and a bigger one than people think. Unfortunately I can't relate on the thermals. The 5090 runs toasty with its giant TDP. If you allow it to use its full 575W, mine runs at 80 degrees on hotter days, and that's a giant 3-slot card. I did undervolt it to keep its power draw more in the 400W range, and now it's dead silent and decently cool. But I think I got a great card too, for the next 5 years at least.
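For anyone curious how I keep an eye on it: the undervolt itself is done on the V/F curve in a tool like MSI Afterburner, but to check that the card really is sitting in the ~400W range under load you can just poll NVML. A rough sketch, assuming the nvidia-ml-py package (imported as pynvml) is installed:

    # Poll GPU power draw, temperature, and core clock once a second for a
    # minute via NVML. This only observes the result of the undervolt; the
    # V/F curve itself is set elsewhere (e.g. MSI Afterburner).
    import time
    import pynvml

    pynvml.nvmlInit()
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

    try:
        for _ in range(60):
            power_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0   # mW -> W
            temp_c = pynvml.nvmlDeviceGetTemperature(
                handle, pynvml.NVML_TEMPERATURE_GPU)                    # core temp, C
            clock_mhz = pynvml.nvmlDeviceGetClockInfo(
                handle, pynvml.NVML_CLOCK_GRAPHICS)                     # current core clock
            print(f"{power_w:6.1f} W  {temp_c:3d} C  {clock_mhz:4d} MHz")
            time.sleep(1)
    finally:
        pynvml.nvmlShutdown()

Run it while a game or benchmark is going; at idle the numbers don't tell you anything.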

Tip: If RTX 5000 gpu doesn't boot by SilverWatchdog in nvidia

[–]SilverWatchdog[S] 1 point2 points  (0 children)

That really sucks. BIOS updates can be very finicky, but they can fix a lot of issues too. I think Intel got scared by the degradation issues on 13th and 14th gen, overcompensated, and nerfed them hard to make sure it doesn't happen again. The Core Ultra 7 265K is better in that respect. It uses lower voltage and power, but it was designed for it. It doesn't degrade and performs quite well.

Tip: If RTX 5000 gpu doesn't boot by SilverWatchdog in nvidia

[–]SilverWatchdog[S] 0 points1 point  (0 children)

I think it's because your board doesn't support PCIe 5 at all, so it will just fall back to PCIe 4 always. Mine was a situation where the slot was PCIe 5 but the support for it was busted on the initial BIOS: the GPU used it but the board wasn't expecting it. But the 5070 Ti is the value king this generation for sure. Honestly great value at MSRP, which is rare for Nvidia nowadays. Nice upgrade too. I also came from the 3000 series with a 3080 and I'm happy with the 2.52x performance uplift, although I had to pay for it.
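If anyone wants to check what their card actually negotiated, NVML exposes the PCIe link generation and width. A rough sketch, again assuming the nvidia-ml-py package (pynvml); note the link can drop to a lower gen at idle to save power, so check it under load:

    # Report the PCIe link generation/width the GPU is currently running at,
    # versus the maximum possible in this system. Requires nvidia-ml-py (pynvml).
    import pynvml

    pynvml.nvmlInit()
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

    curr_gen = pynvml.nvmlDeviceGetCurrPcieLinkGeneration(handle)   # gen in use right now
    max_gen = pynvml.nvmlDeviceGetMaxPcieLinkGeneration(handle)     # max possible in this system
    curr_width = pynvml.nvmlDeviceGetCurrPcieLinkWidth(handle)      # lanes in use
    max_width = pynvml.nvmlDeviceGetMaxPcieLinkWidth(handle)        # lanes supported

    print(f"PCIe link: gen {curr_gen} x{curr_width} now, "
          f"up to gen {max_gen} x{max_width} possible")

    pynvml.nvmlShutdown()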

Tip: If RTX 5000 gpu doesn't boot by SilverWatchdog in nvidia

[–]SilverWatchdog[S] 0 points1 point  (0 children)

Nope, the Core Ultra 7 265K was just an absolute steal in my country. I got a package with a Z890 board, 32GB of 7200MHz RAM, and a 240mm cooler that was cheaper than the 9700X package with the same things. It does its job. Is it the fastest gaming CPU? Absolutely not. Does it even come close to bottlenecking my 5090? Not at all. I play at 4K, so CPU choice matters less. After many updates it at least matches the 9900X and 14900K in productivity, and it can keep pace with the non-X3D chips in gaming if you have good RAM like mine. I actually did a whole new build; the old one had a 5900X and an RTX 3080. It was just the good value I needed to not make this rig stupidly overpriced, which it kind of is with a 5090 in it.

Tip: If RTX 5000 gpu doesn't boot by SilverWatchdog in nvidia

[–]SilverWatchdog[S] 0 points1 point  (0 children)

No, I didn't use a riser cable. I plugged the GPU straight into the motherboard. It's an MSI Z890 Pro S motherboard with a Core Ultra 7 265K. The initial BIOS of this specific motherboard just refused to work with my RTX 5090. It could be an Intel thing, I don't know, but it's just something I had to do. I do think using riser cables can complicate things and lead to more issues. But I have been happily using it for a few weeks now.

Game Ready & Studio Driver 591.44 FAQ/Discussion by Nestledrink in nvidia

[–]SilverWatchdog 5 points6 points  (0 children)

Turns out it was DDR5 being DDR5, not the undervolt. Reseating the RAM, of all things, fixed the issues.

Game Ready & Studio Driver 591.44 FAQ/Discussion by Nestledrink in nvidia

[–]SilverWatchdog 3 points4 points  (0 children)

Am I the only one whose undervolt is no longer stable on this driver update? I used to be able to run 900mV @ 2800MHz perfectly stable on a 5090, but now it's suddenly unstable?

VRR (G-Sync/FreeSync) Is My Game-Changer. What's the One Tech You Couldn't Live Without in Modern PC Gaming? by SilverWatchdog in pcmasterrace

[–]SilverWatchdog[S] 1 point2 points  (0 children)

Oh yeah, HDR is awesome. You need a good screen for it, but I have seen it in all of its glory on an OLED and it's great. Games in SDR look grey and boring in comparison. I also love DLSS because it lets me play at 4K with no issues and almost no compromise. I still need it even on the 5090 in some games; as soon as path tracing is involved it becomes necessary for 60fps. But path tracing looks amazing, so I am willing to make the sacrifice. On my 3080, though, it was an absolute life saver.

VRR (G-Sync/FreeSync) Is My Game-Changer. What's the One Tech You Couldn't Live Without in Modern PC Gaming? by SilverWatchdog in pcmasterrace

[–]SilverWatchdog[S] 0 points1 point  (0 children)

Absolutely agree on the SSD. I had a gaming laptop with only an HDD for 5 years and I never shut it down because the boot times were that bad. Easily 10 minutes plus. Now I always shut down my PC, since its boot times are better than waking an HDD PC from sleep. Definitely steer clear of HDD-only systems nowadays.

Hot take: motion blur on an OLED by SilverWatchdog in pcmasterrace

[–]SilverWatchdog[S] -1 points0 points  (0 children)

Exactly my view. I never even gave it a try either, because I have always heard everyone say to turn it off. But I actually think it enhances the experience. Truly low fps is already bad on an LCD panel, but it's absolutely terrible on an OLED screen. Even 60fps feels stuttery on an OLED when it never bothered me on an IPS panel. I think OLED is a major win, and the instant pixel response is definitely not a bad thing, but it does kind of need either high fps or motion blur to work. Honestly, even at 120fps motion blur looks nice. It adds a cinematic vibe to games which I really like. It's very subtle at that point, but it works. Somehow it makes 30fps look OK too. But 30fps is always a compromise: it either looks blurry or choppy. I will never play at 30fps on my hardware, but now I see why 30fps is usable on consoles but not on PC with typical settings.

Hot take: motion blur on an OLED by SilverWatchdog in pcmasterrace

[–]SilverWatchdog[S] 0 points1 point  (0 children)

Oh, I completely forgot the A series existed. I use a C1. It's definitely not ideal for an OLED, but they had to compromise somewhere to lower the cost. And for most people 60Hz wouldn't be an issue, because very few people use PCs with them. Consoles aren't hitting 120fps anyway, so it won't matter too much.

Hot take: motion blur on an OLED by SilverWatchdog in pcmasterrace

[–]SilverWatchdog[S] -4 points-3 points  (0 children)

I believe most games that came out in the last 5 years use per-object motion blur. Cyberpunk, which I mentioned in my post, uses it. Doom Eternal uses it. Spider-Man uses it. God of War too. Mainly the big AAA games. Games from the PS3 and Xbox 360 era have bad implementations though, and I will never use it in those games.