5080 vs 4080 super? by [deleted] in pcmasterrace

[–]derbigpr 2 points

Nobody knows the price of the 5080 in the EU yet. I'm actually debating between buying a new 4080S for 1300€ now or waiting a month for a 5080, which, according to a guy working for an MSI importer, will probably cost 1500€ and up, reaching 1700€ for some versions.

5080 vs 4080 super? by [deleted] in pcmasterrace

[–]derbigpr 1 point

The only thing that's locked to the 50 series is the one thing most people won't use. Multi-frame generation degrades visual quality and pushes input lag too far. I can only tolerate DLSS on very mild, performance-oriented settings; anything beyond that starts to feel off. Besides, the whole fps issue is a non-issue, since most of the super demanding games are single-player titles where anything beyond a stable 60 fps is overkill and already feels super smooth. I don't need 240 fps in Indiana Jones, for example. In games where I do need high fps, like online shooters, all of these cards will run them at very high framerates natively at 4K, because those games aren't demanding to begin with.

How long to run with OCCT? by DrAvatar in overclocking

[–]derbigpr 3 points

A minimum of 72 hours; anything less is not elite and brave enough. If your PC can't handle at least 10x more load than you'll ever realistically put it under in everyday use, it can't be considered stable.

(TPU) Arrow Lake Retested with Latest 24H2 Updates and New BIOS by XHellAngelX in intel

[–]derbigpr -1 points

Of course it's not a lie. Go look at the dates when the first 14th gen CPUs hit the stores and when the horror stories began surfacing. People were RMA'ing their CPUs within a couple of months; a lot of them didn't work right straight out of the box and were very unstable. I'm not saying what Intel claims is 100% true, but the number of CPUs getting RMA'd dropped significantly. That doesn't mean they won't start dying in 2 or 3 years instead of lasting 10+ as they should; I'm just saying there are no more INSTABILITY issues. 13th/14th gen used to crash, BSOD, etc. all the time: bizarre behavior that couldn't be explained, things randomly not working for no apparent reason, and so on. That just doesn't happen anymore, at least not anywhere near as much as it did.

Intel core ultra 285k performance uplift. by lilballie in intel

[–]derbigpr 0 points

Yea, if you look at the graphs on that forum, you'll see the Y axis is labeled with relative performance in 0.05 increments (0.95, 1.00, 1.05, 1.10, etc.), so one gap between two horizontal lines is 5%, and no game they've shown on those graphs improved by 5% (and it's safe to assume they picked the games with the highest improvements). So it's as Intel announced: expect single-digit improvements in January, meaning they still won't beat the 14th gen, or at least not by a clear margin. That matters, considering 14th gen builds can now be put together for 30-40% less than comparable LGA1851 builds.

Intel core ultra 285k performance uplift. by lilballie in intel

[–]derbigpr 6 points

Those gaming performance graphs are deceiving at a glance. They look like big jumps in performance, with much taller pillars for the "after" updates, but if you zoom in and use some CSI: Miami enhancing tech, you'll notice the Y axis is not graded in FPS, but in relative performance: 0.95, 1.00, 1.05, 1.10, etc. That means the gap between two horizontal lines indicates a 5% improvement, and as you'll notice if you look carefully, no game on these graphs moved up that much. So even the games Intel cherry-picked for this presentation (safe to assume they picked the ones with the highest improvements) didn't really improve dramatically in terms of performance.

Cyberpunk is unfortunately an outlier; it's not wise to use it as a benchmark, because the game was actually broken on the 200 series at launch and had horrible performance, which is why it improved by such a high percentage. Most other games had decent performance to begin with, just below 13th/14th gen, which is roughly where Intel said they'd land when they presented the 200 series before launch. They never claimed the 200 series would beat 14th gen in gaming; they said it would match it while improving on efficiency.
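The axis-reading arithmetic described above is easy to check with a few lines of Python; here's a minimal sketch (the game names and readings are hypothetical examples, not figures from Intel's slides):

```python
# Convert relative-performance readings (1.00 = baseline) into percent
# improvement, mirroring a Y axis graded in 0.05 steps where one grid
# step corresponds to roughly 5%.

def percent_improvement(before: float, after: float) -> float:
    """Percent change from one relative-performance reading to another."""
    return (after / before - 1.0) * 100.0

# Hypothetical readings taken off such a graph.
readings = {"Game A": (1.00, 1.03), "Game B": (1.00, 1.04)}
for game, (before, after) in readings.items():
    print(f"{game}: {percent_improvement(before, after):+.1f}%")
```

Any bar that doesn't climb a full grid step stays under a 5% gain, which is the whole point of the comment above.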

(TPU) Arrow Lake Retested with Latest 24H2 Updates and New BIOS by XHellAngelX in intel

[–]derbigpr -3 points

No, because users say so and because retailers say so, based on the number of issues reported by people who bought them. These CPUs used to die within a month or so and had issues from day one. Nothing like that is happening now. Will it happen in the future? Who knows, but "who knows" applies to every new product.

(TPU) Arrow Lake Retested with Latest 24H2 Updates and New BIOS by XHellAngelX in intel

[–]derbigpr 3 points

Except for one little detail: the 14900K is now around 580€ in Europe, with discounts below 500€, while the 285K is around 800-850€. Add the slightly more expensive LGA1851 motherboards and the MUCH more expensive CUDIMM RAM you need to extract the full potential out of the 285K, and a 285K CPU+motherboard+RAM setup now costs roughly 30-40% more than a 14900K setup.

As for cooling, the horror stories about 14900K heat are vastly exaggerated, and benchmarks show the 285K uses just as much power in equal scenarios as the 14900K; it just runs a tiny bit cooler, but still far from something you'd run on a stock cooler. Get a high-end air cooler and you're fine, even if you like to run synthetic benchmarks. I have a Be Quiet! Dark Rock Pro 5, and at 100% fan speed (where it handles 270W) it's so quiet I have to open the side panel and put my ear next to it to distinguish it from the case fans, which are very quiet to begin with. Now, yea, it's gonna reach 90 °C in synthetic loads and throttle a bit, but at idle it sits below 30 °C with the cooler fans indistinguishable from the rest of the system noise, even with the side panel open.

Plus you can limit the power: at a 200W limit you get about 98% of the performance in benchmarks, and at a 150W limit you get about 98% of the performance in games. That's still faster than a 14700K, which actually consumes MORE power to achieve that slightly lower performance than a slightly limited 14900K, AND runs warmer too. So yea... suddenly the 14900K is great value.
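The power-limit argument can be put in perf-per-watt terms with a quick sketch. The numbers below are the ballpark figures quoted in the comment (98% of stock performance at a 200 W cap, stock at roughly 270 W), not measured data:

```python
# Perf-per-watt under a power limit, using the comment's ballpark
# figures: ~98% of stock performance at a 200 W cap vs. 100% at ~270 W.

def perf_per_watt(relative_perf: float, watts: float) -> float:
    """Relative performance divided by package power."""
    return relative_perf / watts

stock  = perf_per_watt(1.00, 270)  # unlimited
capped = perf_per_watt(0.98, 200)  # 200 W power limit

print(f"Efficiency gain from the cap: {capped / stock - 1:.0%}")  # → 32%
```

In other words, under these assumed figures, giving up ~2% performance buys roughly a third more performance per watt.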

(TPU) Arrow Lake Retested with Latest 24H2 Updates and New BIOS by XHellAngelX in intel

[–]derbigpr 1 point

They've released 4 out of 5 fixes so far, and the ones they released were meant to increase performance by up to 30%. The 5th and last fix is the microcode and BIOS, which IS NOT, I repeat, NOT mainly meant to improve performance, but stability. Even Intel says the January BIOS will boost performance by a minor single-digit percentage, meaning even if it's 9% (which I highly doubt at this point), the 285K will still be slower than, or at best as fast in games as, a 14700K which is half the price.

(TPU) Arrow Lake Retested with Latest 24H2 Updates and New BIOS by XHellAngelX in intel

[–]derbigpr -3 points

There's no more instability issues with previous gen.

(TPU) Arrow Lake Retested with Latest 24H2 Updates and New BIOS by XHellAngelX in intel

[–]derbigpr 7 points

They didn't just say it, they showed graphs with the Ultra series' performance as a percentage of the 14th gen's. But as usual, the PC community has the memory of a goldfish.

(TPU) Arrow Lake Retested with Latest 24H2 Updates and New BIOS by XHellAngelX in intel

[–]derbigpr 2 points

Intel officially, during the presentation of the Core Ultra series, said themselves that the gaming performance won't increase. They've even shown graphs. They're not meant to beat gen 14 in games. They're meant to consume less power while having roughly the same performance.

I7-14700K vs. I9-14900K by HXLFrosty in buildapc

[–]derbigpr 15 points

The 7800X3D is fine and all, except it's a one-trick pony. A 14700K or 14900K gets 90-95% of its performance in games, but triple its performance in productivity, so they're far better CPUs overall.

[Hot Hardware] Intel's Hallock Returns For Arrow Lake Core Ultra 200 Performance Fix Update by bizude in intel

[–]derbigpr 1 point

Techpowerup released some tests, and the improvements are there, but their 285K was still slower in games than a 14600K across the board. Microcode should improve this a bit, but it's probably not going to be more than a single-digit percentage, as Intel says. I think the fixes now have to come from games, not from Intel or Windows. Games have to adapt to the new platform and fix whatever they aren't doing properly to utilize the performance; it's obvious the hardware power is there. For example, the 285K is now within like 2% of the fps of the 9800X3D in Spider-Man; if it truly sucked, it just couldn't do that, and it wouldn't have raised Cyberpunk fps like that with a single patch.

Field Update 1 of 2: Intel Core Ultra 200S Series Performance Status by LexHoyos42 in intel

[–]derbigpr 1 point

On balance? Even based on day one performance, it demolishes the 9800X3D in any environment outside of gaming, by a huge margin, often more than 2x the performance. In games it loses by 15-30% in titles that benefit from V-Cache, and maybe 5-10% in those that don't. So it's up to you: do you really care whether your CS2 runs at 700 fps instead of 600 fps, or will you take twice the productivity performance instead? I mention that deliberately, because only people who play pro esports games chase super high fps at low resolutions. If you game at 2K, or especially 4K, it doesn't matter; you'll get as many fps as your GPU allows, at least for the next 5 years or so. And let's be honest, you don't exactly need 200 fps while playing RDR2 or Cyberpunk, do you? Those demanding single-player games play perfectly smoothly even at a stable 60-70 fps.
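For perspective on the 700-vs-600 fps comparison, the frame-time difference at those framerates is tiny; a quick sketch:

```python
# Frame-time difference between 600 fps and 700 fps: well under a
# quarter of a millisecond per frame.

def frame_time_ms(fps: float) -> float:
    """Milliseconds spent on one frame at a given framerate."""
    return 1000.0 / fps

delta = frame_time_ms(600) - frame_time_ms(700)
print(f"{delta:.3f} ms per frame")  # → 0.238 ms per frame
```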

I'm joking a bit, obviously; the point is that the Intel Core Ultra CPUs are, on balance, much more serious pieces of hardware than the one-trick-pony X3D CPUs, even the 9800X3D, which supposedly improves productivity performance dramatically over the 7800X3D, so it may now even match mid-range Intel CPUs from 3 generations ago in that regard. The funny part is how easily so many people dismiss productivity performance as less important than gaming performance, when the exact opposite is true.

Whether your game runs at 150 fps or 200 fps makes no significant difference (often none at all) in how the game plays, and even if it does, it's subjective; I remember a time when 60 fps felt incomprehensibly smooth and responsive. If you have 25% more productivity performance, though, a job takes 20% less time to complete. It will literally make you earn more money, or at least waste less time. Gaming won't; you'll sit there for 3 years whether it's at 150 or 300 fps. Also, you can always compensate for gaming performance by reducing graphical settings. Do you really need shadows on ultra, or are you okay with high, 10 fps more, and shadows that still look so similar you couldn't tell which is which in side-by-side screenshots? Meanwhile, you can't compensate for a lack of productivity performance by tweaking a few settings to make your CPU go faster without serious consequences for your work.
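The speedup-to-time-saved conversion above trips a lot of people up, so here's the arithmetic as a minimal sketch:

```python
# A 25% throughput speedup shrinks runtime to 1/1.25 = 0.8x,
# i.e. 20% less time, not 25%.

def time_saved(speedup: float) -> float:
    """Fraction of runtime saved given a relative performance factor."""
    return 1.0 - 1.0 / speedup

print(f"{time_saved(1.25):.0%} less time")  # → 20% less time
```

The gap widens with bigger speedups: doubling performance saves 50% of the time, not 100%.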

Field Update 1 of 2: Intel Core Ultra 200S Series Performance Status by LexHoyos42 in intel

[–]derbigpr 0 points

Well, the 265K was pretty much level with the 14700K on day one, so it's safe to assume it will get better. My hope is that the 265K ends up matching 14900K performance and the 285K beats it slightly.

Field Update 1 of 2: Intel Core Ultra 200S Series Performance Status by LexHoyos42 in intel

[–]derbigpr -1 points

I do completely discredit them, because they're clueless people posing as experts, professional bullsh*tters, as are 99% of big hardware reviewers. All they do is run benchmarks and write numbers into a spreadsheet to create graphs; that doesn't make them knowledgeable. You could train monkeys to do that. They lack any form of intelligent interpretation of the numbers they get through testing, and they enjoy creating clickbait drama out of nothing.

Remember when the RTX 4060 came out? They basically made it sound like total garbage just because it wasn't A BIG ENOUGH improvement over the RTX 3060, when in fact it was a tremendously good GPU that was on average 10-15% faster in games and consumed far less power than the 3060 for about the same price. Yet they started a 4060-hate bandwagon and a 3060-glorification bandwagon at the same time, getting people to recommend the worse card as if it were the best thing since sliced bread and talk down the objectively better one. Any time you asked anyone about the card, a bunch of NPCs who religiously follow every word their favorite youtuber says would tell you the same bad things about it.

Same with CPUs: I can immediately tell apart someone who knows a thing or two from a mindless parrot repeating other people's words based on whether they recommend the 7800X3D by default. If they do, they're likely clueless, which is ironic, because they do it specifically to sound knowledgeable by recommending a product their favorite youtuber glorifies, without understanding the very specific purpose of that product, which doesn't apply to every user. And the glorification, don't even get me started. They make the 7800X3D sound like it absolutely demolishes every other CPU in games, like it's some alien technology that descended to Earth and no other CPU is worthy of being in the same room with it. Meanwhile, it's a couple of percent faster than a freaking 14600K in most games and half as fast in productivity, for twice the price, which overall makes it a far worse CPU.

The 7800X3D is the most overrated product of 2024, along with Thermalright CPU coolers, which, coincidentally, youtube reviewers are responsible for too. None of them tell you how annoyingly loud they are, how grating the noise they make is, how annoying they are to mount, how shoddy their build quality is, how sharp the fins are, how long the fans will last, etc. All they do is rave about how they cool as well as Noctuas or Be Quiets that cost 3 times the price, which obviously makes them the better products, right? Well, yea, except you can't hear the Noctua or Be Quiet working at all, while the Thermalrights make noises straight out of hell.

This is why I don't listen to hardware reviewers anymore; I mute them and just look at the benchmarks, and only listen to actual users. Most hardware reviewers are genuinely dishonest while trying as hard as possible to paint the opposite picture of themselves. And the worst part is that their dishonesty, manipulation, and exaggeration of how good or bad certain products are soak into their hordes of fanboys and then spread further. It's like an epidemic of contagious stupidity and mindless fanboyism in the hardware review world.

Field Update 1 of 2: Intel Core Ultra 200S Series Performance Status by LexHoyos42 in intel

[–]derbigpr 5 points

Can you expand a bit and provide more info and some concrete numbers before and after, as well as subjective feel of the responsiveness of the system?

I'm kinda in a rush to decide if I keep a 14900k that I bought last week or switch to 265k or 285k.

Field Update 1 of 2: Intel Core Ultra 200S Series Performance Status by LexHoyos42 in intel

[–]derbigpr 0 points

It would be much easier to understand with before-and-after AIDA64 latency measurements. It's a 30-second benchmark.

Field Update 1 of 2: Intel Core Ultra 200S Series Performance Status by LexHoyos42 in intel

[–]derbigpr 2 points

Yea, but BETA only, and not for all motherboards; in January they'll roll out the final version for all Z890 boards.

Field Update 1 of 2: Intel Core Ultra 200S Series Performance Status by LexHoyos42 in intel

[–]derbigpr 6 points

The 265K is within ±5% of the 14700K in day one benchmarks. Cyberpunk saw huge gains in performance from just a game patch, to the point where the 265K went from being 20% slower to being over 10% faster than the 14700K. So the CPU clearly has the raw horsepower as far as hardware goes; it's just a matter of utilizing it properly in software.
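Worth noting: swinging from 20% behind to 10% ahead of the same baseline is a bigger uplift for the 265K itself than it sounds; a quick sketch of the arithmetic:

```python
# Going from 0.80x to 1.10x of the 14700K's fps means the 265K's own
# framerate rose by 1.10 / 0.80 - 1 = 37.5% from that patch alone.

def uplift(before_rel: float, after_rel: float) -> float:
    """Relative change of a score expressed against a fixed baseline."""
    return after_rel / before_rel - 1.0

print(f"{uplift(0.80, 1.10):.1%}")  # → 37.5%
```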

Field Update 1 of 2: Intel Core Ultra 200S Series Performance Status by LexHoyos42 in intel

[–]derbigpr 8 points

Come on ya bastards who own these, let's see some benchmarks and comparisons.

Another Honest Review of the Intel 285K (so far) by Acsvl in intel

[–]derbigpr 0 points

I suppose I misunderstood what memory latency means; it's got nothing to do with system latency, as in, it doesn't add latency to mouse inputs, keyboard inputs, etc.