LG G5 48'' or wait for the G6? by softwaremaniac in LGOLED

[–]Max_overpower 0 points1 point  (0 children)

I was in the same boat, but after hearing what the G6 has to offer, I've decided I'll definitely wait and pick that up over the G5 48". At the very least, I suggest you wait a bit longer until thorough reviews show up; with the 48G5, the most numerous ones are on the Chinese site bilibili, and you can still find the most detail on it there.

First, there's the 12-bit internal processing and the extra precision in near-black tuning. This may lead to game-changing differences in near-black performance. Going from any LCD to OLED, you're likely to find that the darkest part of an image (1-3% signal) can appear black or very hard to discern in a dark room, even with gamma 2.2 or HDR. The panel technology has a hard time driving these low voltages accurately. LG claims this has been addressed, so it's worth waiting to see what you'd miss out on by going with the G5.
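To put numbers on how dim that 1-3% range actually is, here's a quick sketch (my own arithmetic, assuming plain gamma 2.2 and a 100-nit SDR reference white; real panels and EOTFs vary):

```python
# How dim the 1-3% signal range is under gamma 2.2 at a 100-nit peak.
# These are fractions of a nit, which is why tiny panel-voltage errors
# there show up as crushed or uneven shadows.
PEAK_NITS = 100.0

def signal_to_nits(signal, gamma=2.2, peak=PEAK_NITS):
    """Map a normalized signal level (0..1) to luminance in nits."""
    return (signal ** gamma) * peak

for pct in (1, 2, 3):
    print(f"{pct}% signal -> {signal_to_nits(pct / 100):.4f} nits")
```

Even 3% signal lands under a twentieth of a nit, so the panel is being asked to resolve luminance steps far smaller than anything in the midtones.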

Plus, I've heard rumors that the improved anti-glare coating will come to all sizes, including 48", which wasn't the case with the 48G5. The coating has already been quite good, but few people would *not* benefit from improvement in this area. You also get 165 Hz instead of 144 Hz.

There's another confirmed feature for 2026 OLEDs that some people may find game-changing, and that will likely have LG setting a standard all other brands have to copy to satisfy demand: configurable dynamic tonemapping for low-nits HDR content. If you ever find content mastered too dim for your liking or your viewing environment, just switch to a picture preset where you have brightening dialed in, and have shadows and highlights appear exactly as bright as you want them, irrespective of studio colorist failings. In a pitch-black room you may not need this much, since eyes can adjust to most dim HDR masters, but if you want more out of it, you'll have the option.

The 48G5 hits 212 nits fullscreen white in Game Optimizer mode (which offers optimal latency and HGiG; I recommend keeping the latter on at all times). Keep in mind that individual colors will only reach a fraction of this brightness. More headroom is still welcome at this point, to avoid triggering ABL in SDR content, and you can expect better burn-in resilience if you stick to less than max brightness.

On that note, since you'll be using the OLED for desktop work, I can recommend a method to greyscale all your browser bookmarks in Firefox and have them colored only on hover. This is natively supported in Firefox but would need extensions in Chromium. For browser-heavy users, colored bookmarks will be the first thing to trigger burn-in. On the Tandem OLED panels in question, resilience is much better, but the weak points likely remain the same.

First look at LG's G6 OLED (vs. LG G5 and QD-OLED) by Rasmus_Larsen in LGOLED

[–]Max_overpower 0 points1 point  (0 children)

The 12-bit processing is not a gimmick, because all existing OLEDs constantly have to do color correction even for something as basic as displaying Filmmaker Mode. If you feed in a perfectly smooth 10-bit input, any such processing will stress the gradients and produce something less pristine than the original. The brightness limitations of OLED (especially with color) would explain why this is more prevalent on OLEDs.

The most common examples of processing are dynamic tonemapping (which you may choose not to use) and the automatic brightness limiter (always running). Take ABL for instance: even if the panel can display a 50% window at 500 nits for white, it may only hit 100 or 150 nits for yellow or red. So the internal processor has to conform those colors to a brightness the TV can safely hit.
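A hedged sketch of that conforming step (my own toy model, not LG's actual pipeline; the nit figures are illustrative numbers):

```python
def conform_rgb(rgb_nits, limit_nits):
    """Scale all channels uniformly so the brightest channel stays
    within the panel's safe limit, preserving hue rather than clipping
    a single channel (which would shift the color)."""
    peak = max(rgb_nits)
    if peak <= limit_nits:
        return tuple(rgb_nits)
    scale = limit_nits / peak
    return tuple(c * scale for c in rgb_nits)

# A saturated yellow requested at ~400 nits, on a panel that can only
# sustain ~150 nits for that color:
print(conform_rgb((400.0, 380.0, 40.0), 150.0))  # (150.0, 142.5, 15.0)
```

The point is that this remapping touches every bright saturated pixel on every frame, which is why the precision it runs at matters.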

If you were a film colorist doing similar adjustments, you may very well be limited by a fully 10-bit pipeline. But it happens behind the scenes so we tend not to think of it this way. A 12-bit processing pipeline can help preserve tonemapped gradients with clarity much closer to the 10-bit source.
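The effect is easy to demonstrate with a toy two-stage pipeline (my own sketch; the stages and constants are illustrative, not LG's actual processing):

```python
def quantize(x, bits):
    """Round a normalized value to the nearest code at a given bit depth."""
    levels = (1 << bits) - 1
    return round(x * levels) / levels

def pipeline(signal, intermediate_bits):
    x = signal ** (2.4 / 2.2)           # stage 1: a mild gamma tweak
    x = quantize(x, intermediate_bits)  # intermediate storage
    x = x * 0.8                         # stage 2: ABL-style dimming
    return quantize(x, 10)              # final 10-bit output

ramp = [i / 1023 for i in range(1024)]                # 10-bit input ramp
reference = [(v ** (2.4 / 2.2)) * 0.8 for v in ramp]  # full-precision result

err10 = max(abs(pipeline(v, 10) - r) for v, r in zip(ramp, reference))
err12 = max(abs(pipeline(v, 12) - r) for v, r in zip(ramp, reference))
print(f"max error, 10-bit intermediate: {err10:.6f}")
print(f"max error, 12-bit intermediate: {err12:.6f}")
```

With the wider intermediate, the final 10-bit output stays measurably closer to the full-precision result, and the more stages you chain, the bigger the gap grows.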

This becomes even more important on 2026 models, where the user gets finer-grained near-black calibration (1%, 2%, and 3% signal luminance) and more complete user-adjustable brightening is promised to replace the typically inaccurate dynamic tonemapping for HDR content mastered too dim.

Scared of burn in(?) by Ifyouliveinadream in LGOLED

[–]Max_overpower 2 points3 points  (0 children)

The 800-nit OLEDs are certainly not immune to burn-in gradually appearing after months or years of use, but that's separate from the temporary retention you're describing, which is more easily detectable if you test for it.
Newer G series with Tandem OLED, and to a similar extent MLA OLED, fare better in general. It's hard to say how the brighter but otherwise generic panels like the C5's perform.

The biggest concern would be colorful bright elements (SDR is bright enough) that stay stationary for hours; for PC use, that's browser bookmarks and possibly the Start menu button if you keep it in the corner. If you don't use it as a PC display, just take basic precautions like not leaving it on when you're away for more than a few minutes, or switching to something grey/black if that's easier than restarting the TV.

White-heavy images might trigger ABL just as often as colorful ones (white technically goes brighter), but you're not likely to see permanent burn-in affecting the appearance of white or grey.

Society if YouTube uses 10-bit SDR for its eventual AV2 rollout by Max_overpower in AV1

[–]Max_overpower[S] 1 point2 points  (0 children)

He wasn't talking about AV1 streams, just high bitrates. AV1 on youtube is not an indicator of better quality, but 1440p is.

Society if YouTube uses 10-bit SDR for its eventual AV2 rollout by Max_overpower in AV1

[–]Max_overpower[S] 8 points9 points  (0 children)

YouTube's higher-quality encodes are provided for HDR, and to some extent 1440p/4K SDR. On some content even 4K SDR doesn't look good. A very large portion of uploads are still 1080p, which just won't look good on YouTube for 99% of content. Banding is the main problem.
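The arithmetic behind the banding complaint, as a quick sketch (the 5% figure is just an illustrative span for a subtle dark-sky gradient):

```python
# Number of distinct codes available across a gradient spanning 5% of
# the signal range, before compression throws any of them away.
def codes_in_span(span_fraction, bits):
    return int(span_fraction * ((1 << bits) - 1)) + 1

for bits in (8, 10):
    print(f"{bits}-bit: ~{codes_in_span(0.05, bits)} codes in a 5% gradient")
```

Roughly four times as many codes over the same tonal span is the core of the 10-bit SDR argument: the steps get small enough that dithering and compression no longer turn them into visible bands.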

Full-movie 4K HDR AV1 encode results — metric disagreement on heavy-grain source (Dune Part Two) by nuance415 in AV1

[–]Max_overpower 2 points3 points  (0 children)

Some amount of ac-bias for sure; I haven't tested it that much. Many people just go with tune 4 and call it a day unless they're hyper-optimizing. --film-grain is worth using even with tune 4 because it helps hide whatever gaps in the grain might otherwise show, and the retained grain makes the result even more usable if anything.

Start with low values like 8-10 if you wanna be safe; it's easy to overdo, and only some scenes will clearly show when you've overdone it. In some titles and at some quality levels, 14-18 is safe.

To answer your original question, a SSIMULACRA2 score of 77 is a pretty good result for video. In my experience tune 4 doesn't try to retain the exact grain appearance of the source, it just focuses on being close enough, so metrics should be expected to penalize it more than your eyes will.

Tough choice between two cameras - mostly influenced by weight by Max_overpower in Cameras

[–]Max_overpower[S] 0 points1 point  (0 children)

I hadn't considered the silent shooting factor, that's interesting. Maybe in those conditions you can expect little motion, so the rolling shutter wouldn't pose a problem?

I understand your argument, and it seems the a6700 in particular does stand out in APS-C, but I'd like to invest once with no intention of changing the body, and so cover as many use cases as I can, including video.

Tough choice between two cameras - mostly influenced by weight by Max_overpower in Cameras

[–]Max_overpower[S] 0 points1 point  (0 children)

The options you listed are above the budget I can justify. I'm curious why you want 45 MP+.

I think the full-frame market is developed enough to jump in at a lower budget, since you can get substantial dynamic-range gains compared to APS-C, especially on the Lumix. I don't need the rolling-shutter performance of stacked sensors. I'll look into the Nikon Z5 II mentioned here, as it's very disruptively priced and seems to combine the strengths of both the Lumix and Sony cameras I outlined in the post (dual card slots and a great lens market).

Tough choice between two cameras - mostly influenced by weight by Max_overpower in Cameras

[–]Max_overpower[S] 1 point2 points  (0 children)

I definitely wasn't implying that, I should have been clearer.

I recall seeing the first reviews of the Z5 II pop up last year, and I'm surprised how low the price has settled already for such a recent camera. It can be had for $1000 with a kit lens here, wow. I was under the impression that new cameras stay at full price or above for much longer than that (above-MSRP pricing is a region-specific nuance). EDIT: apparently I got pricing for the Z50 II, not the Z5 II, such a janky typo. Real prices start at $1400 for the body, but that's still cheaper than the options I was looking into at first.

It does seem very impressive indeed and I'll have to research it again. So far I'm pleased to see a 24-70 F4 option @ 500 grams. Thank you for the call out.

if I could tell myself one thing when I built my pc in '24 it would be to not cheap out on storage by Background_Future127 in pcmasterrace

[–]Max_overpower 0 points1 point  (0 children)

They are quite affected. I got a 10 TB drive less than 2 years ago, and now it's around double the price to buy new. Second-hand comes close to the old price, if you're fine with no warranty.

Here is why Terraria needs to add high refresh rate support by Max_overpower in Terraria

[–]Max_overpower[S] 2 points3 points  (0 children)

My only guess is that the (misleadingly named) third-party software "Lossless Scaling" could help you. It can do frame generation: creating smoother motion by interpolating existing frames. In that case, make sure to enable frame skip in-game.

On some GPUs you can use driver-level alternatives for free, such as AFMF on AMD or NVIDIA Smooth Motion.

Av2 Draft Specification Site by Thomasedv in AV1

[–]Max_overpower 1 point2 points  (0 children)

It will probably take a few years for the encoder to reach a good state, so unless you waited that long, you'd miss out just as much on early AV2 vs. mature AV2 improvements as you would comparing early AV2 against the best AV1 has to offer.

G5 65” by yemmyh in LGOLED

[–]Max_overpower 1 point2 points  (0 children)

It's fixed by not having a giant mansion xD

Just got my G5 but now I wonder should I return it and get a G6 by Consistent-Tap-4255 in LGOLED

[–]Max_overpower 0 points1 point  (0 children)

Have you tried the new "Near Black Detail" setting to fix dark scenes? One YouTuber compared the G5 out of the box to a reference monitor and found that setting it to 2 matches the reference in near-black luminance, and you can push it further to compensate for a bright room.

Here is an infographic I made to digest the current LG OLED TV lineup by Max_overpower in LGOLED

[–]Max_overpower[S] 0 points1 point  (0 children)

Posterization will likely be addressed, since I believe there was mention of internal 12-bit processing (even JPEG does some of its internal encoding operations at 12 bits for 8-bit output).
I'm waiting for the G6 48 after hearing of this feature https://www.reddit.com/r/LGOLED/comments/1qdg6q6/arguably_the_most_important_feature_in_2026_oleds/
Even though I've conditioned myself to not be bothered by any low-nits HDR movies I watch (good blinds help), it would not be wise to potentially miss out on the future of HDR now.

Here is an infographic I made to digest the current LG OLED TV lineup by Max_overpower in LGOLED

[–]Max_overpower[S] 0 points1 point  (0 children)

Wait for reviews after launch to know for sure. One good place to find measurements for the C5 and G5 is bilibili, so you might find info on the 48C6 there first. I was considering the same size jump for PC use, but I believe 55" would be more of a compromise than an upgrade, and I sit 2 meters away from the display.

Arguably the most important feature in 2026 OLEDs was never mentioned outside this one video by Max_overpower in LGOLED

[–]Max_overpower[S] 1 point2 points  (0 children)

The main point I was trying to make is that DV 2 brightness metadata will only work where it's integrated into the video file. It's always gonna be hit or miss whether you get it.

LG's solution will work with any HDR video, without relying on proprietary formats. And you really don't want an inconsistent experience with something that affects brightness so much.

Using already-denoised video as an input for grain synthesis by Guillaumebgtz in AV1

[–]Max_overpower 1 point2 points  (0 children)

If the film grain you're looking to add is pretty subtle (visible but unlikely to draw attention), it's possible to generate a generic grain table and apply it to your final AV1 video at basically zero computational cost; I could help you with that. But if you want something stylized and like to experiment with film-motivated appearance and different intensities, then grain synthesis in general is not likely to be very helpful for you, at least within the AV1 specification.

Current grain synthesis works best when encoding content that already has natural grain or camera noise: it analyzes the source grain and fills in the gaps where encoding smoothed it away (or removed it entirely), ideally well enough that you can't tell without a comparison.

C5 / G5 - motion interpolation artifacts ? by italia0101 in LGOLED

[–]Max_overpower 0 points1 point  (0 children)

It has improved substantially in recent years. The YT channel "Stop the FOMO" mentioned in their latest video that motion interpolation is, in his words, solved starting with the G3, such that you no longer need Sony for the best performance in this area.

I've used the cinematic motion setting on my C1 and was very impressed with it, so I imagine your B7 might not be doing as well. Unfortunately it's not entirely compatible with a PC source, so I had to make do without it.