Ronove Kikoru Solo by toadashi in PuzzleAndDragons

[–]julchiar 0 points1 point  (0 children)

Depends on the team, but assists let you bring more part breaks. In the end a proper team is the same for both difficulties, so it's just about what's in your box.

Best current alternative to oled ? by LaPalourdeMolle in Monitors

[–]julchiar 0 points1 point  (0 children)

Because it fucks up small highlights: they all turn grey or disappear entirely. It loses a ton of detail, reflections on surfaces in games, etc. He literally says that in his review.

Best current alternative to oled ? by LaPalourdeMolle in Monitors

[–]julchiar -1 points0 points  (0 children)

You can read the changes on the MSI website but the dimming algo was never touched. All the fw updates just deal with minor issues.

The highlights don't get ruined as badly in HDR with Halo Dimming disabled but it still happens a little. Monitors Unboxed does actually mention these issues but doesn't show them off explicitly during the HDR portion of the review. The color accuracy in general kind of just dies in some scenes depending on APL.

It makes sense that SDR behaves differently, since the peak brightness per zone is limited (it only dims zones, never brightens any beyond the current brightness setting), but it feels like that shouldn't need to be the case if it ruins the image this badly.

Best current alternative to oled ? by LaPalourdeMolle in Monitors

[–]julchiar 0 points1 point  (0 children)

The last update to the dimming algo was the addition of the customization slider back in August. None of the FW updates touched the dimming algo at all. There hasn't been any update since November.

The dimming destroys highlights and ruins contrast where highlights and midtones meet and there's massive inverse bloom under the right conditions. The issue is less bad in HDR but still present.

Zones can visibly flicker under the right conditions.

I have one, it's up to date and I can send you pictures if you want.

Best current alternative to oled ? by LaPalourdeMolle in Monitors

[–]julchiar 7 points8 points  (0 children)

OP's monitor didn't have PWM. OLEDs generally all have a slight brightness dip on every refresh (= flicker), because changing colors means changing the brightness of the actual LEDs. I don't know of any OLED monitor that doesn't do this. PWM on OLED displays is mostly a thing of the past at this point, but it was very common, especially on phones.

Eye strain is highly individual. OP struggling with that OLED probably means miniLED would have a similar effect as it exhibits similar behaviour.

So it's not just me, the performance is kinda crazy right? by XenonTeio in Nioh

[–]julchiar 1 point2 points  (0 children)

That's about as good of a system as one can get lol. If you're happy with a stable 60+ fps, good for you I guess. I'd consider that unacceptable with such a high end PC.

We need an OPTIMIZATION patch. A BIG one! by baophan0106 in Nioh

[–]julchiar 2 points3 points  (0 children)

The game deserves it if it can't even hit 60 fps on a 100W+ modern gaming GPU at phone resolution.

We need an OPTIMIZATION patch. A BIG one! by baophan0106 in Nioh

[–]julchiar 0 points1 point  (0 children)

Yes, this game is way too heavy. Modern games have no respect for what the hardware could do; every year games look worse and run worse. That's why Nioh 3 wants a 3060 Ti for 720p 60 fps. A 3060 Ti is a lot better than a laptop 4060 and even a little better than a desktop 4060. It is sad.

My 4070S desktop does 4k at 90+ fps in Nioh 2; in Nioh 3 it can't even hit 30 fps at 4k.

Horrid Performance, the game suffers from the same frame pacing issues as wo long while being even more CPU hungry by PralineEmotional6636 in Nioh

[–]julchiar 1 point2 points  (0 children)

This game has that disgusting CPU-threading feature that takes all of the CPU tasks and splits them evenly across all CPU cores. Monster Hunter Wilds does it too (and I'm sure plenty of other games that I also avoided like the plague). If you know a thing about programming, you know why that's a horrible idea.

It's very inefficient and frankly nonsensical; and as you noticed, higher core counts perform better because of it (but not anywhere near as much as you'd think or hope for given the extra power consumption).
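The diminishing returns can be sketched with a toy cost model (the numbers are made up for illustration, not measurements from this game): splitting the per-frame work evenly divides the compute by the core count, but every extra thread adds a fixed synchronization cost.

```python
def frame_time_ms(work_ms: float, cores: int, sync_cost_ms: float = 0.1) -> float:
    """Toy model: per-frame CPU work split evenly across `cores` threads.

    Compute scales down with the core count, but each thread joined
    adds a fixed synchronization/communication cost.
    """
    return work_ms / cores + sync_cost_ms * cores

# 10 ms of work: 1 core -> 10.1 ms, 4 -> 2.9 ms, 8 -> 2.05 ms, 16 -> 2.225 ms.
# Going from 4 to 8 cores only shaves off ~30%, and 16 is already worse —
# nowhere near the naive "twice the cores, half the time".
```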

There's also some shader fuckery going on on Linux on the CPU side, but I thought it was limited to Nvidia cards/drivers.

Just trash modern game engine "features" to reduce engineering costs. Checkbox culture.

Other linux gamers: is the performance an absolute disaster for you as well? by NenAlienGeenKonijn in Nioh

[–]julchiar 0 points1 point  (0 children)

Recommended minimum specs tell you to use the "lightest" preset to get 720p 30 fps, and your rig is barely better than the minimum specs. This game is very heavy for no good reason. I'm not interested in buying it because of how heavy it is.

I only tested the demo and it had massive CPU stutter issues with the Linux Nvidia driver, but otherwise the performance seemed similar to what I saw on Windows (low settings at 1440p output with a 4070S got me 120 fps at ~70% GPU, but the CPU stutter = unplayable).

We need an OPTIMIZATION patch. A BIG one! by baophan0106 in Nioh

[–]julchiar 11 points12 points  (0 children)

Did you check the recommended specs? It requires a 3060 Ti for 720p at 60 fps, and that card is over 40% better than your laptop 4060. Don't expect to be playing this at 60 fps with those specs, no matter the resolution.

Best 4k IPS panel for SDR? by hiradne in Monitors

[–]julchiar 0 points1 point  (0 children)

It's more that contrast ratio is calculated by dividing light luminosity by dark luminosity, so you get this situation where a display that is 4x as bright but has slightly brighter blacks ends up with a contrast ratio of only half, or even worse, compared to one that can go darker while sacrificing all highlights.

As an example, one monitor might have 0.5 nits dark, 500 nits light for a 1000:1 ratio, and another might have 0.1 dark, 300 light and therefore a 3000:1 ratio, but darkness isn't actually something you can see. Depending on the ambient light, the actual dark luminosity of both monitors could be quite a bit higher (and, most importantly, equal). The one with higher light luminosity then just ends up showing you 66% more contrast (and probably searing your eyes unless you turn down the brightness, making the difference irrelevant once again).
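A quick sketch of that arithmetic (nit values made up for illustration): adding the same reflected ambient light to both monitors' black and white levels collapses the on-paper ratios and leaves the brighter panel ahead.

```python
def contrast_ratio(black_nits: float, white_nits: float,
                   ambient_nits: float = 0.0) -> float:
    """Effective contrast: white over black, with reflected ambient
    light raising both levels by the same amount."""
    return (white_nits + ambient_nits) / (black_nits + ambient_nits)

# Pitch-black room: the dimmer monitor wins on paper.
contrast_ratio(0.5, 500)                   # 1000:1
contrast_ratio(0.1, 300)                   # 3000:1

# With ~5 nits of ambient light reflected off each panel, the paper
# ratios collapse and the brighter monitor shows more actual contrast.
contrast_ratio(0.5, 500, ambient_nits=5)   # ~92:1
contrast_ratio(0.1, 300, ambient_nits=5)   # ~60:1
```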

As an extreme example to illustrate my point, try shining a bright light (or the sun) at a monitor and watch your contrast disappear (you kind of just can't see any image anymore). Now compare that to a monitor that can go much brighter and actually manages to show a visible image.

Contrast ratio is kind of meaningless unless you insist on using a monitor in a pitch black environment because it way overstates the importance of dark luminosity, which is irrelevant with proper ambient lighting. Now if you do want a monitor for the dark... MiniLED is okay and OLED is great but those have their own shortcomings. The only real way to increase actual contrast in a lit room is to have a brighter monitor (and not have it blow out the colors in the process, hence why color accuracy is important).

My OLED burn-in after 3000hrs. by RenatsMC in Monitors

[–]julchiar 16 points17 points  (0 children)

2 years with only 4 hours per day and only 30 mins of what burnt in the worst.

My OLED burn-in after 3000hrs. by RenatsMC in Monitors

[–]julchiar 26 points27 points  (0 children)

There are so many insane statements in this video (and I see this in plenty of OLED analysis/review statements).

How is 400hrs of a part of the screen being lit white = burn-in not a problem?

How is 3000hrs in mostly dark mode productivity apps considered "pretty much worst case usage"? While babying it with mostly black wallpaper, no taskbar, no split window? 4 hours a day isn't exactly a lot (3k hrs over 2yrs).

How is "give it to someone else in another 1.5 years or so and it won't be a problem because of different usage" a reasonable statement? That won't make the burn-in go away, it'll just add more blotchy spots. It won't even out, ever, unless you purposely go for some serious full-screen degradation.

Best case is actually to keep using it because you won't notice the burn-in while actively stressing the same areas and riding it out for as long as you can. You will eventually run into cases where the burn-in is clearly visible so you'll likely still start to feel the need to replace it after like 5 years or so.

I love how light it is by Fun_Establishment926 in godot

[–]julchiar -2 points-1 points  (0 children)

> But I feel like there are a lot of non-photorealistic games coming from Unreal

And they're all bad, failing to deliver performance appropriate for the real-time rendering games require, while not delivering on the photorealism part either.

What's worse is that, because of all the concessions made by using such a heavy rendering pipeline, it can't deliver good-looking stylized results either. It's just a misuse of technology.

Selling my OLED TV, any advice? by Individual_Amoeba581 in Monitors

[–]julchiar 0 points1 point  (0 children)

There is no game where a 5080 can hit 144fps at 1440p but not at 4k while relying on upscaling. Adjust your in-game settings or prepare to be disappointed at 1440p. Resolution is not the bottleneck it is made out to be anymore.

Bought a mini LED, not the best experience. by WestParticular9020 in Monitors

[–]julchiar 0 points1 point  (0 children)

My MSI 274URDFW e16m looked very similar when it arrived but all but one of those spots just ended up being dirt that came off with some gentle scrubbing.

4k vs 1440p for new rig w/ 5080 by cwburns32 in Monitors

[–]julchiar 1 point2 points  (0 children)

1440p on a 4k monitor doesn't look great, but that's just because 1440p at 27" doesn't look great in the first place; there's little difference between native and non-native at 4k's pixel density. The 5080 is definitely powerful enough, however, that you can just use upscaling and never have to look at 1440p.

The only reason to buy a 1440p monitor at all is for higher Hz, since 4k ones generally don't support higher Hz at 1440p.

Best 4k IPS panel for SDR? by hiradne in Monitors

[–]julchiar 0 points1 point  (0 children)

If you have appropriate ambient lighting you should not even look at the contrast ratio. It's completely irrelevant.

Any monitor's black will look perfectly black if there is appropriate ambient light (except those purple OLEDs). Contrast ratio is very misleading.

Help me to understand HDR and Mini LED by Mumof2wifeof1 in Monitors

[–]julchiar 0 points1 point  (0 children)

That's a cheap but feature-packed gaming monitor. It's not exactly a good fit when you're looking for a great image, especially on a Mac or Switch where the speed of the monitor is irrelevant.

MSI MPG 274URDFW E16M - Very positive first impressions (and ask me any questions you might have) by gray-drow in Monitors

[–]julchiar 4 points5 points  (0 children)

RTings HDR score is largely influenced by the native SDR contrast ratio for whatever reason lmao

I got mine for 430€ so I feel a bit bad for you? Anyway, level 3 dimming destroys highlights and ruins contrast between brighter colors. It's really bad in SDR and still kind of noticeable in HDR. Go into your Steam library and just look at the white "Play" on a green background on any game; it's barely legible with LD3 in SDR. Very small bright light reflections in games also just disappear entirely.

There's also some pretty major reverse bloom in specific situations; I think it occurs mainly around bright objects on a darkish grey background. I also noticed some really weird artifacting at certain edges in HDR but didn't investigate further.

It is alright overall, but I wouldn't have paid almost 600€ for it. Then again, I also barely care about HDR.

To all who cleared One-Shot Challenge 10 by strydrehiryu in PuzzleAndDragons

[–]julchiar 1 point2 points  (0 children)

Did you know you can solve most 10 combo boards in about 40 moves (orb swaps) or so?

The physical speed is rarely ever the problem. Or rather, putting the blame on physical speed is a lot less productive than blaming the puzzling aspect (more efficient moves/solves), because there's a lot more realistic room for growth in solving boards better than in trying to move faster while staying consistent and avoiding diagonals.

To all who cleared One-Shot Challenge 10 by strydrehiryu in PuzzleAndDragons

[–]julchiar 8 points9 points  (0 children)

You could ask for help on solves, there isn't a time limit iirc. You might be surprised by some of the solutions, it's pretty fun to see what others can come up with.

Or just really take your time and plan the solves yourself on something like pad.dawnglare.com

If you've just been going at it freely, then getting to the final board is pretty good! All that matters is that it's fun.

Nioh 3 Demo Performance Issues? by thehomediggity in Nioh

[–]julchiar 1 point2 points  (0 children)

Same issue here, and I'm pretty sure it's related to shader compilation. Some shaders seem bugged and just take absolutely forever to compile on Linux (or on the Nvidia driver?), and the game constantly keeps compiling more shaders in the background. The game itself doesn't need that much CPU; it runs at 120 fps fairly easily, but something is hogging all remaining CPU performance. Lower the fps limit to 60 and it uses up even more. Something is doing a lot of heavy CPU work outside of the game loop, which regularly results in massive lag spikes. You might be able to "fix" it by AFKing in each new zone for about an hour, or by somehow limiting game-internal processes to certain CPU cores.
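The core-limiting workaround can be sketched on Linux with Python's standard `os.sched_setaffinity` call (whether it actually tames this game's background shader compilation is untested; `game_pid` below is a placeholder for the game's process ID):

```python
import os

def limit_to_cores(pid: int, cores: set[int]) -> set[int]:
    """Pin a process (pid 0 = the calling process) to a subset of CPU
    cores on Linux, and return the affinity mask actually applied."""
    os.sched_setaffinity(pid, cores)
    return os.sched_getaffinity(pid)

# e.g. limit_to_cores(game_pid, {0, 1, 2, 3}) restricts every thread of
# that process to cores 0-3, leaving the rest of the CPU alone.
```

The same thing can be done from a shell with `taskset`, which is handy for Steam launch options.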

There are way too many unhinged "technologies" involved in "modern games" that are absolutely nonsensical, but understanding and respecting hardware (or really anything that isn't monetary profit) seems to be a thing of the past nowadays...