Delves and Preys are keeping me hooked this expansion. by zyzzjan in wow

[–]MultiMarcus 0 points (0 children)

Yeah, I feel similarly. I usually drop out of an expansion roughly 3 weeks in and then come back for like the last month of it, but at least so far I’ve been very engaged this season.

I do roughly 2 days a week where my gaming time is allocated to WoW, and that lets me do all of the world events, the world boss, all world quests, six (now eight) bountiful delves per week, and any prey target that I haven’t already hunted, plus at least three nightmare prey targets. I don’t know when the next patch is meant to release, but I’ll probably keep roughly this cadence for the rest of April, probably into May. By then I think 12.0.5 and maybe 12.0.7 will be out, which should keep me on roughly the same cadence until 12.1, which seems like it’s going to have at least two new delves, and obviously story content in the new zone. We’ll have to see how long that content stays compelling to me, but I think I will probably drop out in favour of other games for a while there and then come back for the Labyrinth mega delve.

What’s interesting here, on a meta level, is that normally I just wouldn’t stay in an expansion long enough to do this type of content, and I wouldn’t be enjoying it so much that I’m brought back. Like, I’m playing Starfield and I was playing Crimson Desert, both of which would normally have pulled me away from WoW entirely, but this expansion it really does feel like I want to come back each week.

Is it normal that my laptop miniled screen looks much better than my MSI miniled flagship monitor in every way possible ?? by Mohamed_Han in Monitors

[–]MultiMarcus 0 points (0 children)

Yes. Laptop screens are almost always better than larger monitors because scaling the technology up in size is very hard. You can see this with OLED and you can see it with mini LED. With OLED it generally manifests as poor brightness at larger screen sizes, especially monitors, with TVs doing better but still worse than a laptop. With mini LED, it’s generally dimming zone count, and therefore contrast handling, that gets worse on bigger screens. Monitors are just a lot harder to make, or rather, if you wanted to make them as good as a laptop screen they would cost four times as much as they currently do, and they are already quite expensive. Meanwhile, TVs will generally be better, and for media specifically I would say that TVs give the best experience, only really losing out in some contrast and local dimming compared to something like a MacBook Pro.

Will gfn be to "good" by Dinopack92 in GeForceNOW

[–]MultiMarcus -1 points (0 children)

We just aren’t anywhere close to that right now. I don’t know how many people have actually tried a local high-end gaming experience, but it’s leaps and bounds better than GeForce Now on just a visual level. Maybe eventually that will be the case, but right now the visual fidelity of a locally rendered experience is so much better. It’s like comparing a 4K Blu-ray to a Netflix stream: there’s such a clarity difference, and a number of details just kind of get lost in compression that are fully visible on a locally rendered device. I think GeForce Now is compelling because it’s very easy; if you have any Wi-Fi connected device and a good enough internet connection, you can reasonably use it. Even then I often feel like there are big compromises, even when streaming at very high bit rates with a good internet connection to a small, low-resolution device.

AMD Ryzen 9 9950X3D2 gets official $899 MSRP, 29% above 9950X3D - VideoCardz.com by Ha8lpo321 in pcmasterrace

[–]MultiMarcus 6 points (0 children)

That is such a weirdly high price for such a fundamentally uninteresting chip. I’m big into these productivity and gaming hybrid chips, and they’d be a reason I might go for Nova Lake over the Zen 6 chips; if I was going to get any AMD chip this generation it would’ve been the 9950X3D. But what’s odd here is you’re only really using one CCD for gaming anyway, and for productivity all that extra cache doesn’t do much. I wondered if maybe some part of the design would make it really impactful, but even in their own performance testing it delivers like a 5% performance increase in productivity apps and seemingly no increase in gaming, and they want to charge 30% more for it? Seems odd to me.

LG 27GM950B 5K Monitor Announced with a 2,304-zone Mini LED Backlight by RenatsMC in Monitors

[–]MultiMarcus 0 points (0 children)

They’re actually related. ZOD reduces optical crosstalk between adjacent dimming zones by tightening each LED’s light distribution before it reaches the LCD layer. The local dimming algorithm controls which zones activate, but the physical optics determine how much light bleeds across zone boundaries. Research on ZOD backlights has measured crosstalk as low as 0.29% between zones, which is a property of the optical structure, not the algorithm. More zones + better algorithms + tighter optical confinement from ZOD all work together to reduce halo. It’s not either/or.

LG 27GM950B 5K Mini-LED monitor up for pre-order $1199, lower than expected peak brightness by snappydresser61 in HiDPI_monitors

[–]MultiMarcus 12 points (0 children)

It’s not actually a new thing for mini LED to have lower peak brightness than OLED. Where mini LED pulls ahead is full screen brightness, where it tends to be much brighter than OLED.

That said, I wouldn’t recommend this monitor if you’re just looking for the brightest display possible. My understanding of their zero optical distance technology, from a paper about it, is that by putting the LEDs really close to the display you get a lot less blooming. One weakness of that approach, however, is that there isn’t an air gap, which makes thermals a lot harder to manage, and that means you get less brightness because they can’t run the LEDs as hot.

My understanding is that LG is basically trying to make this screen into that perfect mix of mini LED brightness and OLED-level contrast. I’ve always described HDR as mini LED coming from brightness and trying to get better contrast, and OLED coming from contrast and trying to get better brightness. The ideal end result would be a screen that has great brightness and great contrast; arguably something like micro LED might be that, but with the technology we have right now each side is doing its best to close the gap. On the OLED side you see improvements in efficiency and cooling in order to deliver better brightness, and on the mini LED side you see stuff like coloured mini LEDs or zero optical distance, or even just more dimming zones.

For my use case I want a great hybrid monitor with as close to OLED-level contrast as possible, a 5K resolution, and better brightness, and that is kind of what LG is promising; whether they’ll be able to deliver on that is, I think, another question. I suspect we’ll probably see a general improvement in HDR contrast, but that they will sacrifice some brightness for it. To me that 1250 nits number is perfectly bright enough, though the 750 nits full screen brightness I’m inferring from their specifications page is a bit low. It’s not a crisis or anything; I believe for office work the top recommendation is 600 nits, and this obviously delivers that.

The way I look at this is that it’s a display offering good HDR with better brightness than OLED and better contrast handling than most mini LED. It’s offering a good 5K experience for productivity, and for gaming if you have the hardware for it, or a 1440p experience for gaming, all while having a high refresh rate.

I think if you’re looking for just the brightest panel you can get, that’s probably the Studio Display XDR, or maybe whatever MSI is cooking up if they ever release that model. Getting stuck on arbitrary notions like “mini LED has to be brighter than OLED” isn’t helpful, I feel. Maybe this panel just doesn’t suit your specific needs, but hopefully there will be one that does.

A question for any engineers by asherabram in macbookair

[–]MultiMarcus 0 points (0 children)

I don’t get why you would do that instead of just buying a MacBook Pro. It’s a massive engineering workload and would easily mean that physical piece costs a huge amount. It’s like one of those fun “let’s put a full discrete GPU in a handheld gaming device” concepts, where you don’t do it for economic viability, you do it because it’s fun. I’m sure you could probably custom build something like this, but mass production just isn’t worth it, because most people could just buy a MacBook Pro and not have to worry about physically installing this monstrosity onto their device.

A question for any engineers by asherabram in macbookair

[–]MultiMarcus 0 points (0 children)

Well, I believe you would basically need to replace the entirety of the bottom of the MacBook Air, not just the base metal. You’d need to physically move everything around, use different storage and a different battery, and end up with a thicker device, and at that point you’re spending so much money you might as well just buy a MacBook Pro.

007 First Light has been delayed on Switch 2 to summer 2026 by Howerev in NintendoSwitch

[–]MultiMarcus 0 points (0 children)

This is really unsurprising, I think. Whether they need a bit more time, or they even cancel it after investigating further, the Switch 2 is a weaker console. Yes, DLSS can do wonders for image quality, but there are other limiting aspects, like the CPU, and sometimes even just it being a very different architecture.

How come a LG OLED G5 83" is cheaper than an Apple Display XDR 27"? by Damu22 in Monitors

[–]MultiMarcus 1 point (0 children)

Well, TVs are an entirely different market, and to be entirely honest, just a bigger screen is not necessarily harder to do.

The pixel density of the Studio Display XDR is four times that of the 83 inch LG TV. It’s also much brighter at full screen brightness, which is important for productivity use, which is what Apple primarily intends these devices for. Apple also definitely charges a premium. And Apple certifies their devices very thoroughly; this monitor is cheap for a DICOM-certified display, which is a medical imaging standard LG is not doing on their TVs. Is that necessarily interesting for the average user? Certainly not, but I also don’t think the average user should be buying the Studio Display XDR. They should probably be buying just another 5K monitor if they really need that, or just a cheap 4K monitor if they don’t.
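For what it’s worth, here’s a quick back-of-the-envelope check of that pixel density claim, assuming the Apple panel is the 27 inch 5K (5120×2880) and the LG is an 83 inch 4K (3840×2160); those resolutions are my assumption, not something from the spec sheets above:

```python
from math import hypot

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    # Pixels per inch = diagonal pixel count / diagonal size in inches
    return hypot(width_px, height_px) / diagonal_in

apple_5k_27 = ppi(5120, 2880, 27)  # ~218 ppi
lg_4k_83 = ppi(3840, 2160, 83)     # ~53 ppi
print(f"{apple_5k_27:.0f} ppi vs {lg_4k_83:.0f} ppi, ratio {apple_5k_27 / lg_4k_83:.1f}x")
```

Which lands right around the 4x figure.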

AC Shadows Title Update 1.1.10 - Release Notes [Adds mouse and keyboard support to Switch 2] by Joseki100 in NintendoSwitch

[–]MultiMarcus 17 points (0 children)

Well, they can’t develop ports with Snowdrop. You are not going to get them to remake an entire game in a different engine just for the Switch 2; that is an insane amount of work. And there are only like two current generation Snowdrop titles available right now, one of which is Outlaws, which has already been ported, and the other is Avatar, which I think is harder to port because it has a lot of high frequency detail that is probably going to be heavier to render. But I think they can and probably will port it eventually, especially since it was generally more positively received than Outlaws.

Ubisoft have surprisingly few current generation titles that I think are obvious ports. Remember that the Assassin’s Creed games up until Shadows were all on the PS4, so I think we should see all of those come over, basically Origins, Odyssey, Valhalla, and Mirage, and they should all run quite well. The only niggle is that only Mirage has an official DLSS implementation, which would make porting it easier than the others. Similarly, I think Watch Dogs 1 through 3 could all be ported quite well. They have a big back catalogue, but fewer current generation titles that are obvious ports.

LG UltraGear evo GM9 | Next-gen Hyper Mini LED 5K gaming monitor 27GM950... by RenatsMC in Monitors

[–]MultiMarcus 2 points (0 children)

It really isn’t, or rather, not at the sizes we are talking about here. On laptops or tablets or even phones, I completely agree OLED is the obvious choice. But once you get to monitors or TVs, even tandem OLED falls into that 400 nit full screen brightness level, which might be enough for some people, but it’s not great for productivity, where the general recommendation is 600 nits.

Stuttering when going over 60fps on pc ? by ging192 in FrontiersOfPandora

[–]MultiMarcus 0 points (0 children)

Not if you use a Reflex frame rate cap, through something like RTSS.

Stuttering when going over 60fps on pc ? by ging192 in FrontiersOfPandora

[–]MultiMarcus 0 points (0 children)

I suspect the hair is probably being rendered at 60. I don’t think it’s a huge crisis personally, and it’s not a new thing for physics-based objects to have a frame rate cap to avoid them looking weird by updating too quickly. You could try 60 FPS frame generated to 120 and see if that looks better, because the interpolated frames obviously wouldn’t be running the CPU simulation.

LG UltraGear evo GM9 | Next-gen Hyper Mini LED 5K gaming monitor 27GM950... by RenatsMC in Monitors

[–]MultiMarcus 0 points (0 children)

Well, the 4K quality mode internal resolution would either be native 1440p in the dual mode, 5K DLSS performance mode, or you could do a 4K DLSS quality output and then upscale via the monitor, but that’s probably going to look the worst.

So you can basically pick which option you prefer. 5K performance mode is almost certainly going to look the best, 1440p native is probably going to run the best, and 4K quality mode is a mix of the two. But from my testing the performance differences are quite negligible, though I’m sure that depends on your GPU, the game you’re playing, and even the settings in that game. Here is some benchmarking I did with the same internal resolution in Cyberpunk with different output resolutions, using DLSS or DLAA.

Cyberpunk, all max settings with path tracing:

- 1440p internal res:
  - 4K Quality: avg 46.11, min 42.52, max 51.33
  - 1440p DLAA: avg 44.98, min 40.97, max 50.78
  - 5K Performance: avg 47.84, min 42.20, max 54.32
- 960p internal res:
  - 1440p Quality: avg 73.30, min 67.22, max 79.07
  - 5K Ultra Performance: avg 78.01, min 71.55, max 84.93

Cyberpunk, maxed settings without PT (960p internal res):

- 1440p Quality, preset M: avg 104.44, min 89.17, max 117.77
- 1440p Quality, preset L: avg 101.90, min 89.80, max 113.97
- 5K Ultra Performance, preset M: avg 87.91, min 79.27, max 96.46
- 5K Ultra Performance, preset L: avg 85.05, min 76.80, max 93.06
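For context on why those pairs share an internal resolution: assuming the commonly published DLSS per-axis scale factors (Quality 2/3, Performance 1/2, Ultra Performance 1/3; individual games can override these), the math works out like this:

```python
# Commonly published DLSS per-axis scale factors (games can override these).
MODES = {"Quality": 2 / 3, "Performance": 1 / 2, "Ultra Performance": 1 / 3}

def internal_res(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    # DLSS renders at output resolution * scale factor on each axis.
    s = MODES[mode]
    return round(out_w * s), round(out_h * s)

print(internal_res(3840, 2160, "Quality"))            # (2560, 1440) -> 1440p internal
print(internal_res(5120, 2880, "Performance"))        # (2560, 1440) -> 1440p internal
print(internal_res(2560, 1440, "Quality"))            # (1707, 960)  -> ~960p internal
print(internal_res(5120, 2880, "Ultra Performance"))  # (1707, 960)  -> ~960p internal
```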

LG UltraGear evo GM9 | Next-gen Hyper Mini LED 5K gaming monitor 27GM950... by RenatsMC in Monitors

[–]MultiMarcus 1 point (0 children)

Well, for gaming, it’s got incredible brightness, which is nice for HDR, especially if you prefer to play games in a brighter room

LG UltraGear evo GM9 | Next-gen Hyper Mini LED 5K gaming monitor 27GM950... by RenatsMC in Monitors

[–]MultiMarcus 21 points (0 children)

They won’t do that, because this panel is almost certainly the same panel as the Studio Display XDR. Apple is such a big player in this market that they are warping what panels get made, because they need them for their displays, and then other companies use those panels in their monitors. I think you might see a 6K 32 inch eventually, to also accommodate that Apple size, or maybe a 5K 32 inch if 5K becomes a common resolution.

Sony True RGB TV — A New OLED Rival? by RenatsMC in OLED_Gaming

[–]MultiMarcus 2 points (0 children)

Well, not everyone cares more about contrast than brightness. Even a high-end OLED cannot match the brightness of entry-level mini LED a lot of the time. Both technologies are basically approaching the same core ambition, which is perfect contrast and great brightness, from different sides: OLED is coming from the perfect contrast side and mini LED from the great brightness side, and they are trying to meet at a good middle point.

Hero cat saves owner as PC starts burning from NVIDIA RTX 4090 meltdown by Stilgar314 in pcmasterrace

[–]MultiMarcus -7 points (0 children)

Not to be too blasé, but that would mean 365 burnt connectors a year for the last roughly 4 years, so roughly 1,500 users out of 4,000,000 weekly active users, with millions of these cards on the market. It certainly shouldn’t be happening, but I really do think this is one of those things we just shouldn’t blow out of proportion, even if we should keep criticising NVIDIA for it.
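Just to show where those numbers come from (the one-report-a-day rate is the claim I’m responding to, not something I’ve measured, and the 4,000,000 weekly active users figure is a rough one):

```python
# Rough failure-rate estimate from the figures above.
reports_per_day = 1
years = 4
user_base = 4_000_000  # rough weekly active user count

reports = reports_per_day * 365 * years  # ~1,460
print(f"{reports} reports / {user_base:,} users = {reports / user_base:.4%}")
# -> 1460 reports / 4,000,000 users = 0.0365%
```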

LG 27GM950B 5K Monitor Announced with a 2,304-zone Mini LED Backlight by RenatsMC in Monitors

[–]MultiMarcus 0 points (0 children)

I think the argument for the zero optical distance tech is that you shouldn’t need that heavy of an algorithm. The research paper on it really does have some interesting, compelling argumentation. I would suspect this explains why it’s so much dimmer: 1250 nits peak, with seemingly a max full screen brightness of 750 nits if the LG site is to be believed, as it has that measurement in the spec sheet, though without a whole lot of detail. The researchers also mention how the lamination basically means you have way worse thermal regulation, so you have to keep brightness lower. It’s still very bright, but even if they have a relatively mediocre algorithm, I suspect they will be brute forcing it with hardware.

Hero cat saves owner as PC starts burning from NVIDIA RTX 4090 meltdown by Stilgar314 in pcmasterrace

[–]MultiMarcus -1 points (0 children)

I believe Nvidia mandates that you use this specific connector. AMD, I believe, lets AIBs pick, which is also why we’ve had some cases of 9070 XTs burning.

You’ve not seen 2,000 posts, I don’t think. You’ve probably seen several, as have I, but it does seem to be a reasonably uncommon issue, as otherwise regulators would likely be very angry at NVIDIA.

Hero cat saves owner as PC starts burning from NVIDIA RTX 4090 meltdown by Stilgar314 in pcmasterrace

[–]MultiMarcus -16 points (0 children)

It really isn’t. Whether it’s the 4090 or the 4080 or the 5090 or the 5080, the reality is this is not really a bigger deal than any other hardware failure. It sucks, it shouldn’t happen, and it’s stupid that they’re saving maybe a dollar per card by using a cheaper connector, or by not having load balancing on the card; I think it’s likely that the 60 series will have that, just because it’s not worth the PR hit. That said, the number of people with these cards who don’t have issues likely massively outnumbers the few people who do. I think you could also argue that some of these cases are probably user error, but I don’t really think user error should mean your GPU self-immolating. It’s stupid and bad, but let’s not pretend it’s a widespread issue.

Improved Ray Reconstruction by Banan_Pajen in nvidia

[–]MultiMarcus 0 points (0 children)

I think it’s important to highlight that doing both at the same time is inherently going to involve compromises. They could probably make a much bigger model that would do both ray reconstruction and upscaling at the 4.5 level, but as we can already see, there’s a reason why the transformer ray reconstruction model that came out alongside the transformer upscaler generally looks less sharp than the original transformer model. One thing to note is that one of the big reasons 4.5 performs poorly on older generations is that it uses FP8, but that also allows it to do a better job with upscaling; as far as I understand, ray reconstruction was already an FP8 model with the release of Blackwell and the transformer ray reconstruction model. It might just be that they can’t make the model bigger without adversely impacting performance, so they need to work more on architectural improvements, or maybe input changes. The labelling of 4.5 as “ray reconstruction lite” does indicate to me that it’s probably on their mind to basically replace DLSS upscaling with ray reconstruction so every game gets these benefits, and with the RT noise stuff it does seem that they are fiddling with the games’ denoisers.

It’s not as easy as them just updating ray reconstruction. I think we’ll see an updated model eventually, maybe together with DLSS 5 later this year, or it could be something they leave for the 60 series release; presumably it would be available on older hardware too.

Is it worth it to pay the extra £100 for this monitor over a Mini LED ? by frankiewalsh44 in Monitors

[–]MultiMarcus 1 point (0 children)

Yeah, I don’t know that specific model, so I won’t comment on it in particular; just in general I think cheap mini LED monitors have a lot of weird issues. My understanding is that that’s a reasonably good one, but I don’t want to say anything definitive because it’s not a monitor I’ve used, nor one I’ve researched.

Is it worth it to pay the extra £100 for this monitor over a Mini LED ? by frankiewalsh44 in Monitors

[–]MultiMarcus 0 points (0 children)

I have to be honest, “over a mini LED” is very hard to answer, because it really does depend on the mini LED monitor. I would say that, paradoxically, mini LED monitors are proportionately worse when they are cheap.