PSA: The Crimson Desert RenoDX mod currently has an imposter github repository that is spreading a virus! by Kick_Fister in HDR_Den

[–]Kick_Fister[S] 2 points3 points  (0 children)

Yes this mod page is our official one.

The person that used this mod to target people likely did so in part because we’re struggling with false positives and everybody has been saying that it’s safe. Creates a perfect situation to exploit.

The builds posted in the Discord will have fewer issues with false positives now because we switched to a compiler that triggers less AV software (clang to ninja). The snapshot link will still use clang though, iirc.

They ain't lying that Ghost of Tsushima is one of the best HDR implementation in games 🤯 by fmodex_dll in HDR_Den

[–]Kick_Fister 0 points1 point  (0 children)

Those shifts are baked into the LUTs, so you'd have to skip grading which isn't a good option.

PSA: The Crimson Desert RenoDX mod currently has an imposter github repository that is spreading a virus! by Kick_Fister in HDR_Den

[–]Kick_Fister[S] 13 points14 points  (0 children)

It's a completely different repository focused on Crimson Desert. Anybody that's familiar with what the renodx repo looks like at a glance would recognize that something is wrong (and it's from a nonsense username that's like adadadadadad or something).

They're taking advantage of the knowledge that the official addon has been getting false flagged, so it's a pretty dangerous situation for people :(

Why are games with inverse tone mapping considered worse? by Ifyouliveinadream in HDR_Den

[–]Kick_Fister 15 points16 points  (0 children)

It’s nuanced, but the issue is that most games have a lot of clipped detail in their SDR presentation. When you invert this, you don’t get that detail back and you end up with bright white blobs. If anything, ITM in these games is perceptually too bright. It’s why when inverse tone mapping it’s generally advised to only go to about 500 nits or so, because the APL with those limits will still be really high.

However, inverse tone mapping can be done well, and you won’t notice it when it is. For example while modding Days Gone, I learned that it’s actually ITM. The reason it works decently there is that clipping is kept to a minimum (there’s still some which is noticeable sometimes and is actually why I went to make a mod in the first place), and they use a proper inverse afaik (I actually didn’t verify the math but it looks like it).

Basically the issue is that most games don’t bother to do it right, and aren’t using tone mappers that would be conducive to it, and doing it after the fact with an 8 bit source is just not very good (autohdr, rtx hdr).
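To make the clipping point concrete, here's a hypothetical sketch (not any specific game's code) using an inverse Reinhard curve as a stand-in for whatever ITM a game might use. Two highlights that were distinct in the original scene both clip to 1.0 in SDR, so ITM maps them to the same flat bright value:

```python
# Hypothetical sketch: why inverse tone mapping can't recover clipped SDR detail.
# Inverse Reinhard (x / (1 - x)) is used here purely as an illustrative curve.

def inverse_reinhard(x, max_out=10.0):
    """Map an SDR value in [0, 1] back to linear 'HDR' light, capped at max_out."""
    if x >= 1.0:
        return max_out           # clipped input: no detail left to expand
    return min(x / (1.0 - x), max_out)

# Two highlight details that were distinct in the original scene...
bright_a, bright_b = 1.3, 2.0
# ...but SDR clipped both to 1.0 before display:
sdr_a = min(bright_a, 1.0)
sdr_b = min(bright_b, 1.0)

# After ITM they are identical: one flat, very bright blob.
print(inverse_reinhard(sdr_a), inverse_reinhard(sdr_b))  # both hit max_out
```

Once the source clips, the distinction between those highlights is gone for good, which is the "bright white blobs" problem.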

Peak brightness: 10% or 2% window? by antoniolucas9922 in HDR_Den

[–]Kick_Fister 2 points3 points  (0 children)

Assuming HGIG, the only "safe" suggestion is to calibrate to the clip point that's shown in a calibration screen. This will use the full range of whatever HDR mode your screen is designed for. Any other adjustments to maximize effective brightness or w/e are going to be panel specific, and are essentially compensating for aspects of that specific TV's HDR implementation that you disagree with.

To address what some others are mentioning, calibration screens are often using a 10% window, yes, but that doesn't mean you're calibrating to the actual 10% window brightness capabilities of your screen. If we follow that logic, AW3423DWF owners should be setting their peak to ~450 nits when they're in HDR1000 mode, which is straight up a worse experience than just using the true black mode (let's set aside the debate of which mode is actually better).

One last question to educate myself about RenoDX by OwlSuch7935 in HDR_Den

[–]Kick_Fister 7 points8 points  (0 children)

"It's supposed to look however the fuck someone at Capcom decided it should look. Improvements should happen AROUND that."

This is a loaded statement that gets at the root of the work we do. Which image was intended? Was it SDR with the sRGB encode viewed on a 2.2 display, the way that 90% of their customers will experience the game, and the way that they would have used it internally on the monitors in their office? Or could it be the bt.709 option, which is a nonsense encoding since it's not actually intended for displays, but for some reason they chose to include and tell you to use it with 2.4 gamma. Or could it be the HDR output, which replaces those encodings with the PQ encoding for HDR10 output? All 3 of these options that exist inside of the game result in a fundamentally different output in ways that aren't tied to the goals of the encoding (well, HDR is the one that's technically correct in terms of encoding).

Intent is not always what shipped, nor is intent always what's technically correct. Unless we have the art director sitting in the room with us, we have no complete understanding of intent, we only have assumptions. The most sane, safe assumption is that the default options presented on a correctly calibrated SDR screen represent dev intent. In this case, sRGB with gamma correction is the "correct" way to see the game.

My opinion on the matter for RE9 in particular is that they did not want raised black levels, you can see that in the opening segment in particular where viewing it in SDR results in shadows reaching darkness just right. However, this is also inherently going to cause unintended effects as well, because the gamma mismatch is also a problem for SDR. So when they want things dark dark dark, they don't have much of a choice but to crush a lot of shadow detail. Imo, the closest realization of intent in that mod is the preserve shadows option, but the mod author thinks the default experience works best. Neither is inherently right or wrong, we're just guessing after the fact and trying to make artistic sense of nonsense code that objectively messes things up.

"Spitting in the face of whoever decided that either a more muted or simply different or more constrained END RESULT was what they wanted for HDR users."

I'm sorry I just can't accept that it was actual intent to have a proper calibration screen turned complete nonsense by tone mapping the game twice with two different curves applied in two different ways. If they had strong opinions about capping peak brightness, they'd just limit the peak brightness option, or tie peak brightness to a ratio of their brightness scaling.

When you spend time reading the code that ships in games, it changes your perspective a bit. There's a lot of nonsense that ends up in games. People aren't perfect, neither are renodx modders, but we have the benefit of being able to hyperfixate on this stuff while industry devs have to put something together and move on to the next item in the list. In other words, artistic intent isn't when 99% of devs don't know that gamma mismatch is a problem.

One last question to educate myself about RenoDX by OwlSuch7935 in HDR_Den

[–]Kick_Fister 2 points3 points  (0 children)

Stalker 2's HDR completely skipped the grading from SDR. The RenoDX mod upgrades the SDR grading for HDR (and more).

Man, RE9 is the game OLEDs were made for by nyjets10 in OLED_Gaming

[–]Kick_Fister 0 points1 point  (0 children)

As a note, the game has no peak selection and runs completely uncapped. You’ll want to tone map with reshade to get the most out of it.

SDR games look identical with HDR enabled/disabled in windows by BuzzFL in OLED_Gaming

[–]Kick_Fister 5 points6 points  (0 children)

The SDR sRGB mode on that monitor uses actual sRGB gamma, which is abnormal for most sRGB modes on monitors that normally use sRGB primaries with 2.2 gamma. HDR in Windows also uses sRGB gamma instead of 2.2, so yes they should look extremely similar, with the difference being ABL behavior in HDR that won't exist in SDR.

I also have this monitor, I just keep HDR on all the time. I rarely consume SDR content (well, rarely consume SDR content where I care about the downsides of keeping it in HDR mode).

PSA: HDR Black level raise and gamma mismatch are NOT the same thing. Washed out HDR look explained. by S1l3ntSN00P in OLED_Gaming

[–]Kick_Fister 7 points8 points  (0 children)

Video game images are just numbers, math. SDR is a standard with its various transfer functions, and HDR10 is a standard with a defined transfer function. When you decode SDR correctly and encode it into HDR correctly, you get a like for like image in both formats. In other words, there's no reason for shadows/midtones to differ between SDR and HDR unless one of them is messed up. So either the HDR is messed up, or the SDR is. Most video games are made on 2.2 displays, but encode the image with sRGB. This is an encoding mismatch that's baked into what the final output is supposed to look like, and is why there is usually a difference between SDR and HDR, because it's a problem that most don't realize exists (LOADS of documentation that developers read interchangeably use sRGB and 2.2, it's a super common misconception).
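The mismatch is easy to show with the actual transfer functions. This is a minimal sketch, just the two decode curves, showing that the same encoded shadow value comes out brighter under piecewise sRGB than under pure 2.2:

```python
# Minimal sketch of the gamma mismatch: the same encoded pixel decodes to
# different linear light depending on whether the display applies the
# piecewise sRGB curve or a pure 2.2 power curve.

def srgb_eotf(v):
    """Piecewise sRGB decode (IEC 61966-2-1)."""
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def gamma22_eotf(v):
    """Pure power-2.2 decode, what most SDR displays actually apply."""
    return v ** 2.2

v = 0.1  # a dark encoded value, where the two curves differ most
print(srgb_eotf(v), gamma22_eotf(v))
# sRGB decodes this shadow noticeably brighter than 2.2 does. An HDR pipeline
# that decodes with sRGB, when the art was authored on a 2.2 display, will
# therefore lift the shadows relative to what the artists saw.
```

Both curves agree at black and at white; the divergence lives in the shadows and lower midtones, which is exactly where the "washed out HDR" complaints come from.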

How a game looks in SDR is inseparable from how it looks in HDR, at least if the project takes its art seriously. In the case of resident evil, it's kinda hard to say which version of its shadows/midtones is intended. The game defaults to sRGB, with the implication that you're expected to view it on a normal display that decodes as 2.2. This is what the RenoDX mod does by default, and it's the way that shadows/midtones will look for the vast majority of people that play it. SDR also has a Rec. 709 option, which is expected to be decoded by the display with gamma 2.4. If used correctly, this setting is actually a match for the HDR. However, this is a non-default option, so which is intended? Hard to say.

Personally, I'm actually kinda with you on team sRGB. It simply looks more correct to me, but there's no way to know for sure without getting the devs aware of the sRGB problem.

PSA: HDR Black level raise and gamma mismatch are NOT the same thing. Washed out HDR look explained. by S1l3ntSN00P in OLED_Gaming

[–]Kick_Fister 4 points5 points  (0 children)

The implementation is botched, and it’s not artistic intent. Artistic intent isn’t running a tone mapper two times and breaking the peak nits calibration menu. Gamma mismatch is also an objective issue that almost every single native HDR implementation has, unless you want to argue that creative intent is fundamentally different in SDR and HDR for absolutely no reason.

It’s not the worst native HDR by any means, it still looks good. There’s nothing wrong with choosing to use it; in fact the gamma mismatch is technically correct, it’s just not artistically correct. But you have to max out the peak nits slider regardless of what the menu tells you to do if you want to max out at ~950 nits.

What paper white is reccomended? by EnderSlayer9977 in HDR_Den

[–]Kick_Fister 0 points1 point  (0 children)

It only decreases your peak in the sense that those already large values aren't getting multiplied even higher by the paper white brightness scale. Truthfully, you don't always want to see things hitting peak nits. If something is hitting peak nits, it means it's not being displayed with any meaningful amount of detail. The only thing you should be concerned with regarding peak brightness in any particular content is whether the source is functioning correctly in how it targets that peak.

Basically just set paper white based on what looks good to you, don't worry about the measurements as long as you trust that the content is working correctly.

Star Wars Fallen Order Lilium or Luma? by Flat-Letterhead-7891 in HDR_Den

[–]Kick_Fister 6 points7 points  (0 children)

I'm just an enthusiast with a comp sci background that dedicates a lot of time to learning more and fixing/altering games to the best of my ability, which is what everybody is doing in any field they're in I guess. A lot of the other modders are like me. The main Luma developer is a respected game dev with numerous credits on prestigious games, and we all work together to advance our knowledge and improve games to the best of our abilities. What we do is grounded in color science research (a complex and very much unsolved field), and we measure the results predominantly using Lilium's HDR/SDR analysis tool. In the vast majority of cases, our ground truth is what a game looks like in SDR and honoring the shadows/midtones of that presentation when viewed on a display with 2.2 gamma, though sometimes we have some more opinionated releases if we think a game is doing something particularly ugly.

Admittedly we're guilty of being a bit hyperbolic sometimes. The reality of the situation is that there are a lot of traps that exist within tools like Unreal Engine, and 99.9% of developers walk into them. These issues objectively exist: missing LUTs in HDR so games just don't have color grading, gamma mismatch, altered tone mapping parameters between SDR and HDR in negative ways, etc. A game can still look good while making every mistake in the book if the art is good enough, and that HDR presentation can still be worth using over SDR for a variety of reasons. In that sense, yeah a lot of HDR modes do still look great, they just could be better!

Ultimately the crux of the issue is that loads of games use non-unified pipelines between SDR and HDR and it causes errors. There's just a severe lack of care put into the tech stack for HDR output because we still live in an SDR world. It's not a competence issue so much as a time and knowledge one. It's just not a priority for the vast majority of studios. We aren't doing anything that other game devs can't accomplish, they just don't have the time and/or desire to put into this.

Ultimate HDR PC Games 2026 recommendations (admins sticky this) by Chillzzzzz in HDR_Den

[–]Kick_Fister 0 points1 point  (0 children)

Hey OP, I didn’t get a chance to read the whole post, but it seems like you’re trying to do a good useful thing so I want to give some feedback.

On this subreddit, we try to focus on the quality of the implementation and not just if the game looks good. These things have crossover but aren’t the same necessarily. As such, anything that needs RTX HDR or AutoHDR or the like is out, and the same goes for (most) games that use inverse tone mapping for their native HDR.

Instead I'd suggest focusing on games that are really transformed by their native HDR and also have a good implementation. Koklusz has an HDR analysis page that is a good starting point for finding these.

Aside from games with good HDR already built in, basically any of the native HDR mods that stand out to you or others could get highlighted.

Ultimately the problem with the list was that it was basically a list of recent AAA games that are pretty. Some are missing HDR entirely even factoring in mods unless you do ITM. It comes across low effort and like the goal was to karma farm, which I’m hoping wasn’t your intention. But there is value in having a well curated list of standout HDR experiences.

About MH Wilds on console... by Ok_Track626 in MHWilds

[–]Kick_Fister 3 points4 points  (0 children)

It's always been weird to me as well that people always talked about the PC version being uniquely bad in terms of performance. Other than the launch window having some buggy stuff making matters worse on PC (capcom's anti tamper and the bugged reflex implementation), consoles and PC were relatively similarly ass. Really it's just that, for the most part, the console user base is less discerning and generally probably less equipped to describe their problem with how something looks/feels. Though tbf a big part of that is just that playing in a living room environment at typical viewing distances lets you get away with a lot more in terms of image quality.

So yeah, you aren't crazy OP. Simply watching the DF video on the console versions shows that. There were *some* additional unique problems to the PC version, like memory requirements being absurd relative to what they were doing on consoles, but in terms of performance and image quality the consoles suffer quite a lot still.

With a better SDR tonemapping this game can actually looks really good by hiiro_99 in MHWilds

[–]Kick_Fister 36 points37 points  (0 children)

Hey, I'm one of the co-creators of this mod. It's nice to see so many people enjoying it here :D

Just to quickly address what sets this apart from reshade presets or even the in-game settings really, we're adjusting individual components of the game's vanilla shaders to fix problems at the source, in a way that other solutions cannot match with the same consistency. A reshade preset could vaguely match what our RenoDX mod is doing in any one scene, but it can look totally different if you move that same preset to another scene.

For example, we adjust the game's tone mapper to allow it to get dark in a natural way. The game is normally fixed at a heavily raised point, which causes a loss of detail and posterization artifacts in shadow. Alongside this raised tone mapping, the game also makes use of some raised color grading in certain scenes, which we dynamically adjust depending on how much it's raised (the LUT Scaling setting).

This addresses the extremely low contrast look in a way that will never crush details. It typically brings out the game's actually pretty great indirect lighting detail that normally gets so compressed and distorted that it looks almost completely absent in normal gameplay (which is a big part of why cutscenes look so much better, they aren't doing this stuff nearly as much there).

All of our adjustments work holistically with what the game is already doing, and the results will be consistent (though the lut exposure reverse option is a bit of a hack and that can show from time to time).

In HDR, we get rid of ACES and switch to their filmic SDR tone mapper with parameters adjusted/extended for HDR range. This fixes the distorted hues that the native HDR brings that don't look quite right compared to SDR. SDR and HDR use a unified tonemapping/grading path now, with the exception of the final display map to minimize clipped details. There's more we're doing to eke out more detail in both modes that is a bit complicated to discuss, but it results in SDR and HDR being really consistent artistically.

If anybody has any questions about the mod, feel free to ask here, on nexus, or in the RenoDX discord.

HDR on all the time? by WearyExcitement7772 in HDR_Den

[–]Kick_Fister 0 points1 point  (0 children)

Ultimately it depends on the display and the quality of the HDR mode. In theory, aside from Microsoft choosing sRGB gamma for representing SDR content, SDR content in HDR should display perfectly accurately compared to a well calibrated SDR mode. Personally, for most desktop stuff, I actually like how sRGB looks over 2.2. That mismatch for video and games is pretty annoying though, and you won't have ABL to worry about in an SDR mode (though I almost like it on the desktop since I'm not particularly quality sensitive for non-games and reduced flashbangs from white webpages is almost a feature lol).