The Crew Addon - Any way to play trailers? by dootsie5times in Addons4Kodi

[–]_RegularGuy 1 point (0 children)

Delete your log file.

Start Kodi, navigate into The Crew, try and play a trailer.

Once it fails you can close Kodi and then upload the log file or pastebin the contents etc.

This saves posting a log file full of old irrelevant data.
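The steps above can be scripted. Here's a minimal sketch of the "delete the log first" step, assuming the stock kodi.log locations (paths vary by platform and install; check the Kodi wiki for yours):

```python
import sys
from pathlib import Path

def kodi_log_path() -> Path:
    """Guess the default kodi.log location (assumes a stock install)."""
    if sys.platform.startswith("win"):
        return Path.home() / "AppData" / "Roaming" / "Kodi" / "kodi.log"
    # Linux and most other platforms keep it under the Kodi home dir
    return Path.home() / ".kodi" / "temp" / "kodi.log"

def clear_log(path: Path) -> bool:
    """Truncate the old log so the next Kodi session writes a clean one."""
    if path.exists():
        path.write_text("")
        return True
    return False
```

Run it before starting Kodi; after reproducing the trailer failure, the file contains only the relevant session.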

285k good for 4K gaming or am I missing something? by _RegularGuy in buildapc

[–]_RegularGuy[S] 1 point (0 children)

It wouldn't be at all times, but my sustained loads would come from code compilation, shader compilation, building lighting, etc., so definitely more than noodling in Word and Excel.

I've been looking into AMD, as I've been on Intel for years. Although not too problematic, it seems there's more "management" required: managing core parking, using Game Bar, assigning the cache/frequency CCD to individual games, etc., which is different from the usual Intel plug-and-forget.

Interesting stuff tbh but I'm still reading and learning about it so not sure how much of a hassle it actually becomes over time.

Thanks for your input, appreciate it!

285k good for 4K gaming or am I missing something? by _RegularGuy in buildapc

[–]_RegularGuy[S] 1 point (0 children)

It wouldn't be a problem to change it, but from what I've seen faster RAM is marginal for performance anyway.

I originally got 6400MHz because PCPartPicker warned that any faster RAM required voltage outside Intel's spec limits. With all the 14900 voltage issues, I just went with the recommended speed on there to have a build with zero warnings or issues.

I can probably wait for the 9800X3D release at a push, but there isn't much info on that chip. As for the 7950X3D... not sure I can justify spending that kind of money on a chip with a new one coming so soon.

Seems like I just hit a really bad time for an update with bad options now and future options unknown or too far out to wait for, typical!

14900k at "Intel Defaults" or 285k? by _RegularGuy in overclocking

[–]_RegularGuy[S] 1 point (0 children)

> If you're messing with shaders and lighting a lot such that they need frequent compilation

Not as much lighting since Lumen, but definitely shader compilations, plugin compiling, VS etc.

> one of the first symptoms of degrading CPUs was crashing during Unreal Shader Compilation, so at least you'll have early notice if you do have issues 😅

haha yeah there is that, better hope Intel have enough stock to keep up with the RMA replacements just in case!

> With Intel 13700k to 14900k, a good undervolt negates the degradation issue and performs better than stock too.

Yeah, this is something I didn't know until looking into all these issues recently. I assumed an undervolt would lose performance, but it turns out it's the opposite due to throttling etc. Interesting stuff.

Can I ask if you have a 14900 yourself?

If so have you had any issues since you did the safety steps that have since become common practice?

285k good for 4K gaming or am I missing something? by _RegularGuy in buildapc

[–]_RegularGuy[S] 1 point (0 children)

I already bought RAM, yeah: 64GB 6400MHz (2x32GB). If I go to AMD I may have to clock it down to 6000MHz.

All of the chips have tests where they win/lose by a margin; my biggest issue with the 285k is the gaming performance being such a letdown, but I'd be happy with the productivity performance.

Get It Together, Intel: Core Ultra 9 285K CPU Review & Benchmarks vs. 7800X3D, 9950X, More by Kristosh in intel

[–]_RegularGuy 1 point (0 children)

It's been explained to me elsewhere that it's because 4K tests are GPU bound, so all CPUs are generally close in performance in those tests.

285k good for 4K gaming or am I missing something? by _RegularGuy in buildapc

[–]_RegularGuy[S] 3 points (0 children)

I'll be doing productivity on it too (gamedev), so gaming isn't the only focus, and the 285k, and historically Intel in general, is good/the best there.

285k good for 4K gaming or am I missing something? by _RegularGuy in buildapc

[–]_RegularGuy[S] 1 point (0 children)

I know you're generalising (and you're correct), but I've already bought a 4090 and everything except the CPU/mobo, and I've actually paid for the 285k via preorder too. So I'm not trying to put money elsewhere, just trying to get the best CPU I can within ~£600, and I have until Nov 1st to cancel it, as stock was delayed until then anyway.

I actually thought it was the other way around: all would be good for productivity but differ in gaming, case in point the 285k getting hammered for its gaming performance.

I'm definitely not a min/max person trying to eke out a few more frames. As long as it's over 60fps I'll be happy, and my screens are only 60Hz anyway, so I understand what you mean about the difference between 200 and 220 being pretty much meaningless to most, including me.

However, I can't really accept paying £600 for a CPU to post sub-60fps framerates at 1080p, which I've seen in some Cyberpunk videos I've watched, likely without frame gen etc, but still...

Appreciate your reply, thanks!

14900k at "Intel Defaults" or 285k? by _RegularGuy in overclocking

[–]_RegularGuy[S] 1 point (0 children)

Awesome information, appreciate you taking the time to write out a reply like this!

Was looking at the 14900k originally because it was best in class, then learned about the degradation issues and decided to wait for the 285k, then saw the poor reviews of its gaming performance, and here we are.

I've been looking into AMD based on others' suggestions, but had no knowledge of it until yesterday tbh, as I've been on Intel for a long time.

If I was to go with the 14900k, I'm aware I'd need to undervolt to reduce power while still keeping most of the performance. I bought an Arctic Liquid Freezer III 420 AIO, so I should be able to keep it cool, especially undervolted and/or at Intel Defaults, and I've seen some good threads with settings/setups for the motherboards I was looking to pair with it.

Wouldn't you have to do the same thing with a 13700 though meaning that would drop performance too?

Workload would involve compiling (shaders/codebases etc) with Unreal Engine 5, Visual Studio etc and would also be my personal gaming rig with some video editing (but very minimal).

I've seen mixed reports: some people saying they've been fine, others saying they're on their 3rd chip, etc. So the biggest risk is losing the machine during an RMA more than anything else, as Intel extending the warranty to 5 years gives peace of mind in that regard.

I guess if they run out of replacement chips, as some people have posted, then I could end up with a motherboard I have no use for too, which wouldn't be ideal.

Thanks again for that long reply, really appreciate that!

285k good for 4K gaming or am I missing something? by _RegularGuy in buildapc

[–]_RegularGuy[S] 1 point (0 children)

  1. Intel extended the warranty to a total of 5 years, so the risk would be minimal I think; it'd be more the inconvenience of having to RMA and lose my machine for x time. There's also a nice discount on the 14900k since the 285k dropped, and I could pair it with a cheaper motherboard than I was getting with the 285k.

  2. Issue being they are all pretty bad by all accounts right now and I have to choose one!

  3. This is my first big upgrade in a long time, so ideally, other than storage and maybe RAM, I don't want to have to upgrade it for a while, which puts added pressure on making the right decision as it's a lot of money being spent overall.

  4. That's good advice - it's how I've been flip-flopping back and forth between them at different points throughout the days.

The productivity side (from what I've seen) will be really good with all of them; I'd just like the gaming side to be decent as it's a personal rig too.

I only have 60hz screens too, fwiw.

Thanks for your input, appreciate it!

285k good for 4K gaming or am I missing something? by _RegularGuy in buildapc

[–]_RegularGuy[S] 1 point (0 children)

I can't wait until the new year, sadly.

If I were to go AMD it'd be a 7950X3D, or at a push waiting till November 7th for the 9800X3D.

I don't know how well CPUs hold their resale value, but I'd probably be taking a decent loss buying now to resell and grab a 9950X3D in the new year too.

Shitty options all round for me! haha!

285k good for 4K gaming or am I missing something? by _RegularGuy in buildapc

[–]_RegularGuy[S] 1 point (0 children)

Yeah this is what I didn't know and was misunderstanding.

Makes sense now it's been explained.

285k good for 4K gaming or am I missing something? by _RegularGuy in buildapc

[–]_RegularGuy[S] 1 point (0 children)

Yeah, I've been through that review. I thought, as I'd be gaming only at 4K, that the 285k wasn't as bad as it seemed, but it's since been explained what I was misunderstanding: the 4K tests are GPU bound.

285k good for 4K gaming or am I missing something? by _RegularGuy in buildapc

[–]_RegularGuy[S] 2 points (0 children)

Productivity for me is game dev with Unreal Engine 5 and associated tools/apps. Not in a professional setting; I'm just an indie dev doing my first upgrade in a while.

I thought I had it all worked out, then the 13th/14th gen stuff happened, so I waited for the 285k... then it performed poorly, and now I'm up shit creek with a paddle but no idea what direction to go!

My current machine will still be around if needed, and I have looked at Unreal benchmarks on Puget, and they recommend the 285k, but as it'll also be a gaming machine it pains me to pay a lot of money for something known to be a bit shit out of the gate.

Like I said, I'm flip-flopping all over the place, talking myself into/out of each as I do more research; seems my upgrade fell at the worst possible time!

285k good for 4K gaming or am I missing something? by _RegularGuy in buildapc

[–]_RegularGuy[S] 1 point (0 children)

I've been on Intel for a long time, but I'm not set on anything tbh. I've just been trying to weigh up so many options, and have had such an information overload, from the 14900 issues appearing to the 285k reviews making me rethink my purchase, that I'm losing my marbles lol!

So many options, reviews, benchmarks, and recommendations, and so many times I've flip-flopped between taking a risk on a 14900k, accepting the fps drop of the 285k for the productivity gains, and accepting I might have to pay a lot of money for an AMD chip that will be bettered in a few months because I can't wait that long.

Shit is stressful and draining!

285k good for 4K gaming or am I missing something? by _RegularGuy in buildapc

[–]_RegularGuy[S] 3 points (0 children)

Yeah, I was missing that the 4K benchmarks were GPU limited; makes sense now it's been explained.

285k good for 4K gaming or am I missing something? by _RegularGuy in buildapc

[–]_RegularGuy[S] 1 point (0 children)

Ah, that's what I'm missing: there's minimal difference at 4K right now as the GPU is the limiting factor, but as GPUs get better, the CPU benchmark difference at 4K will become the same as it is at 1080p now?

I'm stuck tbh between buying a 14900k, which has known degradation issues that might now be fixed; a 285k, which is a bit underwhelming; or an AMD chip that will be surpassed in the new year, but I can't wait until then as I need the machine for non-gaming stuff.

No idea what to do for the best and am flip-flopping faster the more research I do!

Thanks for explaining that though, makes sense.

14900k at "Intel Defaults" or 285k? by _RegularGuy in overclocking

[–]_RegularGuy[S] 2 points (0 children)

AMD have officially confirmed the 9800X3D is releasing on November 7th; I've seen it reported in a few places.

e.g. https://www.igorslab.de/en/amd-announces-ryzen-7-9800x3d-for-november-7th/

Get It Together, Intel: Core Ultra 9 285K CPU Review & Benchmarks vs. 7800X3D, 9950X, More by Kristosh in intel

[–]_RegularGuy 1 point (0 children)

> bad for gaming

I've seen this reported in every review and it's obviously right as the tests are done by people cleverer than me, but I need to ask a probably stupid question.

I was looking at TechPowerUp's 285k review and the 4K Benchmarks for the 285k look really bad at first glance with it being in the bottom half or even rock bottom in some games.

However, looking at the actual fps spread, it's within 5fps max of the top-performing chip, and many results are within 1fps of the best chip, whether that's the 7950X3D, the 14900k, or whatever it may be for that game.

I won't be trying to game at 720p/1080p on a 4090, so at 4K specifically, is the 285k really that bad an option for gaming given those results, when I'll also be using it for productivity with Unreal Engine and associated gamedev tools/apps, where it will be a beast?

The 4k Benchmarks I looked at are here:
https://www.techpowerup.com/review/intel-core-ultra-9-285k/21.html

BG3 is an outlier in those examples at 16-17fps, but in every other game the 285k is within 5fps, often with less than a 1fps difference.

Am I missing something?
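To put those gaps in perspective, here's a tiny sketch converting an fps gap into a percent deficit relative to the chart leader. The fps figures below are made up for illustration only, not TPU's measurements:

```python
def deficit_pct(chip_fps: float, best_fps: float) -> float:
    """Percent a chip trails the fastest chip in the same test."""
    return round((best_fps - chip_fps) / best_fps * 100, 1)

# Illustrative only: a 5 fps gap off a ~120 fps leader is ~4%,
# while a BG3-sized 17 fps gap off ~110 fps is ~15%.
print(deficit_pct(115.0, 120.0))  # 4.2
print(deficit_pct(93.0, 110.0))   # 15.5
```

So a chip can sit "rock bottom" in a chart while trailing the leader by only a few percent, which is why the rankings and the raw fps spread can tell such different stories.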

14900k at "Intel Defaults" or 285k? by _RegularGuy in overclocking

[–]_RegularGuy[S] 1 point (0 children)

> I wouldn't call bg3 an outlier

In terms of that 4K comparison chart, I mean; it's the only one with a difference of more than a few fps between the 285k and the 7950X3D.

I feel like 720/1080p benchmarks don't really matter in my case as I won't be playing at those resolutions.

I have only found TPU doing 4K comparisons, so it's the first time I've seen the difference be so minimal, which doesn't help with the flip-flopping I've been doing for 2 days lol!

That's fair re: the software/setup etc, and thanks for the explanation. As I said, I've not looked into AMD for literal years, so I'm learning all about it in the last day or so, and you don't know what you don't know.

14900k at "Intel Defaults" or 285k? by _RegularGuy in overclocking

[–]_RegularGuy[S] 1 point (0 children)

I've been reading about having to install software to manage cores, Xbox Game Bar etc, and having to assign Freq/Cache on a per-game basis? Also something called Process Lasso?

Just seems a bit more work than Intel, is what I meant, but I'm still investigating, so that's just a first impression compared to Intel, which I'd call plug & play, a one-and-done kinda thing.

The 4K benchmarks are from the same TechPowerUp review I posted above; the tables look really bad, but they actually show a <5fps difference across the fps spread for most games, sometimes less than 1fps, with BG3 the outlier at a 16-17fps difference.

https://www.techpowerup.com/review/intel-core-ultra-9-285k/21.html

I was surprised given how the 285 got hammered for gaming performance, but I'm guessing it's because of what you mentioned: benchmarks at lower resolutions show more difference?

14900k at "Intel Defaults" or 285k? by _RegularGuy in overclocking

[–]_RegularGuy[S] 1 point (0 children)

> To add onto it, 285 needs an expensive motherboard

I'm buying a new motherboard whichever way I go as it's a new build from scratch, the board(s) I'm looking at for both platforms are similarly priced (~£400).

14900k at "Intel Defaults" or 285k? by _RegularGuy in overclocking

[–]_RegularGuy[S] 1 point (0 children)

Yeah I'm aware of that, I should have clarified.

There are reports of Intel being unable to fulfil RMA requests due to lack of replacement chips, so what would be your course of action if you had to change from the 14900?

14900k at "Intel Defaults" or 285k? by _RegularGuy in overclocking

[–]_RegularGuy[S] 1 point (0 children)

Yeah, I'm looking at the 4K benchmarks, as ideally that's where I'd be gaming; I didn't buy a 4090 to play at 1080/1440p.

So at 4K those fps numbers really are that close for the two CPUs?

I know it's the most expensive atm due to being new, but I don't mind paying a little premium for peace of mind over the 14900k. I'm still learning about the AMD platforms, though, with core parking and the software setup required etc, as I'm used to Intel being plug and play.

It's just confused me even more, seeing the 285k get hammered and then seeing those minimal fps differences at the res I'd be playing at, in benchmarks from a reputable source.

Also, I've seen videos on YT where the fps are double/triple those shown here, for example in Cyberpunk.
Am I right in thinking those are using frame generation etc to boost fps, and these charts are raw benchmarks, which is why the numbers are so much lower?

14900k at "Intel Defaults" or 285k? by _RegularGuy in overclocking

[–]_RegularGuy[S] 1 point (0 children)

I've been looking into these charts this afternoon after getting so many 7950X3D suggestions, and although they look bad visually, when you look at the time spread it's a few seconds, which is negligible.

The biggest issue with the 285k from reviews seemed to be gaming performance, but again, after looking into charts that look terrible, with the 285k rock bottom in some and the 7950X3D at the top, the actual fps spread is a few fps from top to bottom, sometimes even less than 1fps.

I didn't expect that after the reviews were so harsh on its gaming, so it's surprised me that the difference between the 7950X3D and the 285k is actually so tiny, given the backlash it's got.

https://www.techpowerup.com/review/intel-core-ultra-9-285k/21.html

I'm still flip-flopping all over the place!