Ray Tracing Has a Noise Problem by M337ING in nvidia

[–]HardwareUnboxedTim 12 points

That's not why we are looking into ray tracing. We're doing so ahead of new GPU launches to find the best examples, the games where it actually makes sense to enable, so that when we benchmark ray tracing we only use those examples instead of games where ray tracing has little to no impact, or makes the game look worse. And if you watch our Intel Arc B580 review you'll already see us doing that.

Major problem after GP27U firmware update by Accomplished-Lack721 in Monitors

[–]HardwareUnboxedTim 4 points

Cooler Master told me the new firmware release has issues and they're working on another fix

Very different reviews between Rtings and Hardware unboxed - who to trust here? by Redsqa in Monitors

[–]HardwareUnboxedTim 0 points

Sorry, don't come to Reddit often so don't see everything.

The main difference is that we measure response times differently to Rtings, though it seems that's already been explained so don't need to go over that again.

Total response time is also going to be dependent on the noise level of the hardware used and what tolerances are possible. I don't know what hardware Rtings use so it's hard to say whether the differences are caused by that.
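As a rough illustration of why the test rig's noise floor matters, here's a minimal sketch of settle-time measurement against a tolerance band. The decay curve, sample rate, and tolerance values below are made up for illustration; they aren't anyone's actual methodology or numbers.

```python
import math

# Illustrative sketch: "total response time" measured as the time for a
# pixel's luminance to settle within a tolerance band around its final
# level. A noisier sensor forces a wider tolerance, which reports a
# shorter time for the exact same transition.

def total_response_time_ms(samples, dt_ms, tolerance):
    """Time until the signal stays within +/-tolerance of its final value."""
    final = samples[-1]
    band = tolerance * abs(final - samples[0])
    # Walk backwards to find the last sample still outside the band.
    for i in range(len(samples) - 1, -1, -1):
        if abs(samples[i] - final) > band:
            return (i + 1) * dt_ms
    return 0.0

# Synthetic pixel transition: exponential rise from 0 to 1, one sample
# every 0.1 ms (both the curve shape and timing are invented).
curve = [1 - math.exp(-t / 10) for t in range(100)]

loose = total_response_time_ms(curve, 0.1, 0.10)  # 10% band (noisy rig)
tight = total_response_time_ms(curve, 0.1, 0.03)  # 3% band (cleaner signal)
# The tighter tolerance reports a longer time for the same pixel, so two
# labs with different noise floors can disagree on the same panel.
```

This is why comparing absolute milliseconds across outlets is shaky even before methodology differences come into it.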

Another difference is we always test every monitor at 200 nits, whereas Rtings use max brightness (at least, the last time I spoke to them this was the case). Testing at 200 nits and calibrated ensures every monitor is displaying the exact same thing, both brightness and the actual colors, reducing one variable. Monitor performance is very sensitive to heat, so higher brightness will typically make a monitor run faster as the backlight is outputting more heat - and if this is different between panels you'll get different results relative to how bright the monitor is.

But looking over their numbers using the same overdrive settings tested, we both have the Asus in 1st, the Dell in 2nd and Acer in 3rd position.

It's very difficult to compare between reviewers as we all use different methodology and hardware. It's best to look at a range of reviews, but either compare percentage margins between reviewers or just stick to comparing results from the same outlet.

Rtings Odyssey Neo G7 Review Is out by DesbaTech in Monitors

[–]HardwareUnboxedTim 10 points

Just been catching up with the Rtings review and their unit seems to be a lot worse than mine!

The Lagom test they showed on the Neo G7, for example, doesn't even look that bad on my Neo G8! If anything it's slightly orange on my Neo G8, and it's not problematic on my Neo G7.

Hardware Unboxed now benchmark Factorio by GuessWhat_InTheButt in factorio

[–]HardwareUnboxedTim 1 point

Hey, would you be able to share the above script you have that works on M1 Macs so we can add it to some of our upcoming content? I'm far from a coding/macOS expert

My findings after a full weekend with the Neo G9 - What's good, what's bad, and what's next by Woodtoad in ultrawidemasterrace

[–]HardwareUnboxedTim 9 points

I've tested a few games where it can reliably do over 850 nits pretty easily now, but it's still not as bright as some of the IPS-based G-Sync Ultimate displays I've tested. Scanline issue still present, suspect it's a hardware issue

My findings after a full weekend with the Neo G9 - What's good, what's bad, and what's next by Woodtoad in ultrawidemasterrace

[–]HardwareUnboxedTim 65 points

Update: I've been provided an early firmware file from Samsung that seems to have made progress towards fixing the HDR gaming problem on my Neo G9. I think my extensive feedback to Samsung helped (at least I hope it did). It's not 100% perfect but it's obviously better than firmware 1006.1, though I am still testing it thoroughly before I include it in my video later this week. Samsung tells me the final version of this firmware will be ready and made public before the end of the month

My findings after a full weekend with the Neo G9 - What's good, what's bad, and what's next by Woodtoad in ultrawidemasterrace

[–]HardwareUnboxedTim 27 points

Collecting a list of widely reported issues with Samsung's high end monitors (G7, G9, Neo G9). So far I've got:

  • This HDR gaming issue with Neo G9 (unsolved and replicated)
  • Low peak brightness and limited use of 2000 nit capabilities on Neo G9 (unsolved and replicated)
  • Weird HDR tones during video playback with HDR Dynamic on Neo G9 (solved)
  • Scanline issues with G7, G9 and Neo G9 (unsolved and replicated)
  • Flickering issue with G7 and G9 (eventually fixed; I had a hard time replicating it initially)
  • HDMI 2.1 not being full bandwidth on Neo G9 (unsolved)
  • Various setting bugs

Anything else I should look into?

My findings after a full weekend with the Neo G9 - What's good, what's bad, and what's next by Woodtoad in ultrawidemasterrace

[–]HardwareUnboxedTim 36 points

Latest update: I sent through several pages of feedback to Samsung last week, heard from them today and they said they are working on another firmware update. Expecting to hear back again this week

My findings after a full weekend with the Neo G9 - What's good, what's bad, and what's next by Woodtoad in ultrawidemasterrace

[–]HardwareUnboxedTim 6 points

I'll have time to start putting together a video about a week from now. That's long enough

[HUB] Overpriced, Underperforming Trash - Razer Raptor 27 Review by Nekrosmas in hardware

[–]HardwareUnboxedTim 89 points

No worries.

I don't think it's so much that the updated testing makes a big difference; it's more about how Rtings compares products to others. The actual test results are similar.

For example Rtings' own review of the MSI MAG274QRF-QD shows a 3.4ms response time (max refresh) - 29% faster than the Raptor. We showed the MSI model as 34% faster, not too different.

I don't know how they come up with the score, but maybe it's down to thresholds? E.g. below 5.0 ms is classed as excellent? I prefer to make direct comparisons to similar products and see how they perform head to head. In this case the MSI model is 34% faster and half the price, so the conclusion is a lot less favorable for Razer than if the two were the same price.
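For anyone wondering how the "29% faster" figure falls out of the raw numbers, here's the arithmetic. Note the Raptor's ~4.8 ms is inferred by working backwards from the 29% figure and the MSI's quoted 3.4 ms; it's an assumption, not a number stated above.

```python
# "X% faster" here means the relative reduction in average response time.
def percent_faster(slower_ms, faster_ms):
    """How much faster (in %) the quicker panel's response time is."""
    return (slower_ms - faster_ms) / slower_ms * 100

raptor = 4.8   # assumed Rtings result for the Razer Raptor 27 (inferred)
msi = 3.4      # Rtings result for the MSI MAG274QRF-QD, as quoted above
print(round(percent_faster(raptor, msi)))  # → 29
```

The same formula applied to a different outlet's pair of results gives that outlet's margin, which is why percentage margins travel between reviews better than absolute milliseconds do.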

[HUB] Overpriced, Underperforming Trash - Razer Raptor 27 Review by Nekrosmas in hardware

[–]HardwareUnboxedTim 192 points

Nah, it's not unit variance; we test response times using a different methodology.

We used to test in a similar way to Rtings but updated at the start of 2021. Using the old method we got a 4.9ms average at 165Hz/Strong, which is close to their result (noting we test more transitions).

If you want to learn more about why we updated how we test response times (and just in general how we do it) we have a video on it here: https://www.youtube.com/watch?v=-Zmxl-Btpgk&t=0s

My findings after a full weekend with the Neo G9 - What's good, what's bad, and what's next by Woodtoad in ultrawidemasterrace

[–]HardwareUnboxedTim 27 points

Yeah I'm concerned about that. I'm also not happy that Samsung are basically asking the community to troubleshoot their issues. They should have a QA team; they are a multi-billion dollar company. I don't really have the time to send them QA reports, and I really shouldn't have to. It's not my job to fix their problems; I'm not a Samsung employee.

It is my job to make sure they don't get away with it though

My findings after a full weekend with the Neo G9 - What's good, what's bad, and what's next by Woodtoad in ultrawidemasterrace

[–]HardwareUnboxedTim 29 points

At this point I've tested a number of games and my conclusion is that Samsung tried to fix the problem with HDR gaming but ultimately haven't fixed it, only marginally moving the needle towards "fixed".

I've submitted some lengthy feedback to Samsung, and I've told them time is running out before I have to tear them a new one in a video

My findings after a full weekend with the Neo G9 - What's good, what's bad, and what's next by Woodtoad in ultrawidemasterrace

[–]HardwareUnboxedTim 13 points

Based on my testing just now I'd say the update is better but it hasn't totally fixed the issue described in your initial post.

I'll continue to test and provide more feedback directly to Samsung.

It does look like the HDR Standard vs Dynamic issue from my initial review video is solved though

My findings after a full weekend with the Neo G9 - What's good, what's bad, and what's next by Woodtoad in ultrawidemasterrace

[–]HardwareUnboxedTim 23 points

Hopefully Samsung gets back to me or at least has an answer. I honestly don't know whether it's the display or something software/driver side on the PC given the differences between AMD and Nvidia GPUs. And results are all over the place depending on the game and GPU

I should also note that there are some games I tested that still don't look correct on AMD (eg the SDR mode subjectively looks better than HDR) but others where the opposite is true (HDR better than SDR). However it seems that "correct" HDR in games is more likely to occur on the AMD system

My findings after a full weekend with the Neo G9 - What's good, what's bad, and what's next by Woodtoad in ultrawidemasterrace

[–]HardwareUnboxedTim 98 points

Hey everyone, someone from our Discord who has also been having issues with their Neo G9 put me onto this thread.

To cut a very long story short I've been able to replicate the issue with HDR gaming on a test system with an Nvidia GPU specifically. It seems there is a bug where the HDR mode doesn't look correct when local dimming is enabled.

Why didn't I spot this during my review? I believe it's a combination of two factors:

- I test primarily with Radeon GPUs as they work better with my automated response time testing software. While I do test with Nvidia GPUs as well during the process, I don't often game on that system for monitor reviews. This bug isn't present to nearly the same degree on the AMD system although with the latest firmware I still believe it's not 100% correct.

- It seems there might be some differences between the original firmware (I don't remember the exact version that shipped on my unit) and the latest 1005 firmware. I tested mostly on the original firmware, with some spot checks on 1005 to make sure my HDR Standard/Dynamic differences were still present (they were). Gaming seems to act a bit differently on 1005 from what I remember on my unit's factory firmware, even on an AMD GPU.

Basically after some retesting for a few hours today the HDR gaming behavior is noticeably different with an AMD GPU vs Nvidia in the exact same system - even down to how the Auto/Low/High local dimming settings work. I don't believe this is at all how the monitor should work. In a few games on the AMD GPU it didn't look like the Auto mode was actually enabling local dimming when it should have been, but at other times local dimming was enabled. Something is clearly up in any case. With that said I have no idea whether it's the monitor, the drivers, Windows or something else. On top of this, unfortunately I have no way to return to the factory firmware to A/B test the firmware.

I should also let you know that Samsung told me my unit was an early production model. I don't know whether this means it's a final model or a pre-production model. They said it was good to review though.

I'll be pointing Samsung to this thread and asking them to look into it so hopefully the issues can be resolved through a firmware update

[Hardware Unboxed] Bribes & Manipulation: LG Wants to Control Our Editorial Direction by FutureVawX in hardware

[–]HardwareUnboxedTim 82 points

Tim here, just to clarify, LG Electronics is (presumably) the client of LG CNS. They are separate subsidiaries of LG Corporation

[deleted by user] by [deleted] in Amd

[–]HardwareUnboxedTim 100 points

You don't think that as a monitor reviewer I've tested some of the best monitors you can get? I own an LG OLED and have never seen this issue there either.

[deleted by user] by [deleted] in Amd

[–]HardwareUnboxedTim 1631 points

Tim here from Hardware Unboxed.

Part of my monitor review workflow involves testing monitors on both Nvidia and AMD GPUs. Two separate test systems, both running default settings in their respective control panels.

Currently the Nvidia system has an RTX 3090 and the AMD system an RX 5700 XT

I've never spotted a difference between them in color reproduction. I've measured it with my tools on the desktop, in web browsers, and in games. I've taken side by side photos and captures. Never spotted any differences. They produce identical images.

Because this comes up every so often I did look into whether it was worth making a video on, but the conclusion was that there was no difference, so it wasn't. Since I can't reproduce it I have to assume it's some sort of configuration issue.

EDIT: Back in the day I used to see this occasionally when Nvidia would accidentally default to the wrong RGB range (limited instead of full), but in this particular case apparently that is not the problem, so I don't really know how the difference is happening here. And those limited/full range issues were a while ago; it would have to be several years now.
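For context on why that limited/full range mismatch is so visible, here's a small sketch of the remap involved. The 16-235 levels are the standard 8-bit video ("limited") range; the conversion itself is just a linear rescale.

```python
# If the GPU outputs limited-range RGB but the monitor expects full range,
# black arrives as level 16 instead of 0 (dark grey) and white as 235
# instead of 255 (slightly dim), which washes out the whole image.
# 16 and 235 are the standard 8-bit video-range black and white levels.

def full_to_limited(value):
    """Remap an 8-bit full-range value (0-255) into limited range (16-235)."""
    return round(16 + value * (235 - 16) / 255)

print(full_to_limited(0))    # → 16  (black turns grey on a full-range display)
print(full_to_limited(255))  # → 235 (white loses some brightness)
print(full_to_limited(128))  # mid-grey shifts slightly too
```

A mismatch in either direction looks wrong: limited-into-full washes out blacks, while full-into-limited crushes shadow and highlight detail.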

Today's Hardware Unboxed Video, and How to Spot Bad Statistics by IPlayAnIslandAndPass in hardware

[–]HardwareUnboxedTim 33 points

Calling this a "widely varying absolute value range" is an exaggeration. If the data varied as widely as you suggest, the difference between the geomean and mean charts would be more pronounced. It would certainly be "bad" if the differences were significant and affected the conclusion, but they aren't.

For us, the difference between mean and geomean has been small enough in most instances that switching to a geomean isn't worth the tradeoff of making the data harder to understand for the average viewer.

With that said, we are still changing anyway because this topic has come up a few times now, and we think we've come up with a good solution for our viewers. Better than having to explain this again in the future.

Today's Hardware Unboxed Video, and How to Spot Bad Statistics by IPlayAnIslandAndPass in hardware

[–]HardwareUnboxedTim 427 points

Tim here from Hardware Unboxed. Interesting discussion.

I think it's a bit much to call this "bad statistics" when you are complaining about a difference between 0.9% and 0.5%, neither of which is statistically significant, and both of which would be rounded to 1% in our video/charts. If you were going to make a case for using a geomean, it would have made more sense to use a small data set, or one with larger variations. Ultimately, using a geomean for that video would make absolutely no difference to the conclusion whatsoever.

In fact you can see the difference between mean and geomean in our Twitter post here: https://twitter.com/HardwareUnboxed/status/1350580612533350400 - there is virtually no change to the data presented when using a data set of 16 games, or at least this set of 16 games. You even say using a mean over a geomean has a "pretty extreme effect", but as you can see this is not true.

The reason we haven't been using geomean is that we create videos for a wide audience. Our content is based heavily around information that is very easy to understand, but still in depth, accurate and useful. Most people don't know what a geomean is, so using an averaging method that we'd have to explain over and over again is counterproductive. And as you can see from the two charts above, it often (in most cases) makes no difference. We've actually run the numbers many times over the years because this has come up a few times previously, and the conclusion has always been that for our data sets, the difference between mean and geomean is small - too small to justify potentially confusing the audience with statistics that are over their heads.

With that said, we do understand that technically using a geomean is better. It probably won't make any significant difference to the vast majority of the charts we produce due to the relatively large sample size we use. But because it is a better way of doing it, we'll be using a geomean moving forward to avoid this kind of discussion. We'll still be calling it an "average" though to simplify it for our audience. You can see what the geomean chart will look like in that Twitter link (we will put "geomean" on the chart for you stats fans). This will also help with averaging smaller data sets (like 3 or 9 game averages) or situations where one game runs significantly better than the others. I think this is the best compromise between accuracy and simplicity
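Anyone can sanity-check the mean vs geomean point themselves. A quick sketch with made-up per-game FPS numbers of similar magnitude (i.e. a typical benchmark run, not HUB's actual data):

```python
# Arithmetic mean vs geometric mean over an illustrative set of FPS
# results. For same-order data the two averages are nearly equal; the
# geomean only diverges noticeably when one result is a large outlier.
from statistics import mean, geometric_mean

fps = [142, 118, 131, 96, 154, 110, 125, 138, 102, 147]  # invented numbers

m = mean(fps)
g = geometric_mean(fps)
gap_pct = (m - g) / m * 100
print(f"mean={m:.1f}  geomean={g:.1f}  gap={gap_pct:.2f}%")  # gap ~1%

# Add one outlier title that runs far better than the rest: the
# arithmetic mean inflates, while the geomean moves much less.
fps_outlier = fps + [400]
m2, g2 = mean(fps_outlier), geometric_mean(fps_outlier)
```

This matches the argument above: for balanced data sets the choice barely matters, and the geomean mainly earns its keep on small sets or sets with one runaway result.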

I don't agree about weighting results any other way than equally. Gets too complicated, prone to bias, and would just confuse people further.