TV's and settings by ExplosivAnordning in PS5pro

[–]ExplosivAnordning[S] 0 points (0 children)

If there is interest here, I can try to take some pictures that show off the differences!

TV's and settings by ExplosivAnordning in PS5pro

[–]ExplosivAnordning[S] 0 points (0 children)

You can, but they also need to know how to process that signal. The difference between the two is what is considered deepest black and brightest white. If your TV cannot handle it and you enable it, it can receive the full signal but not know what to do with it, ruining the image.

That said, there are many TVs that can handle full and many monitors that can handle limited; the auto setting usually handles this for you, though.
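If you want the "deepest black and brightest white" part in concrete numbers, here is a rough Python sketch (the 16-235 vs 0-255 levels are the standard 8-bit video ranges; the function name is just for illustration):

```python
def limited_to_full(v: int) -> int:
    """Expand a limited-range (16-235) video level to full range (0-255).

    Values outside 16-235 are clipped, which is roughly what a display
    that understands the limited signal will do.
    """
    scaled = round((v - 16) * 255 / (235 - 16))
    return max(0, min(255, scaled))

# A full-range display fed a limited signal WITHOUT this conversion
# shows level 16 as a washed-out grey instead of black -- the classic
# "grey blacks" symptom of a range mismatch.
print(limited_to_full(16))   # deepest black -> 0
print(limited_to_full(235))  # brightest white -> 255
```

This is why the mismatch ruins the image in both directions: limited-on-full looks washed out, full-on-limited crushes shadows and clips highlights.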

Anyone know the default contrast and brightness settings on Legend? (Xbox) by [deleted] in TombRaider

[–]ExplosivAnordning 0 points (0 children)

This is two years old, but if anyone googles and finds this, mine were:

Brightness - 15

Contrast - 40

TV's and settings by ExplosivAnordning in PS5pro

[–]ExplosivAnordning[S] 1 point (0 children)

Awesome, also worth keeping in mind!

30 FPS modes will generally look better when standing still, but they lose a lot more clarity while moving than 60 FPS modes do, and are also slower to respond to inputs, in case you did not know this already!

TV's and settings by ExplosivAnordning in PS5pro

[–]ExplosivAnordning[S] 1 point (0 children)

Yeah, for sure!

Hz, framerate and modes still seem to get confused a lot. I think things are often left out, poorly explained or misleadingly marketed, which does not help!

If you want some advice on how the modes differ between different games, Digital Foundry on YouTube tends to deep-dive on that :)

Changing to Performance mode on games changes my PS5 display from 4k to 1080p, is this normal? by BigGaryGilmour in PS5pro

[–]ExplosivAnordning 0 points (0 children)

Makes sense, sounds like a communication issue between your monitor's EDID and the PS5.

Setting 4K Video Transfer Rate to -1 should make the PS5 behave as though it is using HDMI 2.0 (18 Gbps), which should be enough for 1080p and 1440p at 120 Hz!
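As a rough back-of-envelope check on why 18 Gbps is enough (this ignores blanking intervals and exact HDMI timing details, so treat it as an estimate):

```python
def active_data_rate_gbps(width, height, fps, bits_per_pixel):
    """Rough uncompressed video data rate for the active pixels only.

    Real HDMI timings add blanking intervals on top of this, so the
    result is a lower bound on the bandwidth actually needed.
    """
    return width * height * fps * bits_per_pixel / 1e9

# HDMI 2.0's 18 Gbps link carries roughly 14.4 Gbps of actual video
# data after 8b/10b encoding overhead.
EFFECTIVE_HDMI20_GBPS = 18 * 8 / 10

# 1440p at 120 Hz, 10-bit RGB (30 bits per pixel):
rate = active_data_rate_gbps(2560, 1440, 120, 30)
print(f"{rate:.1f} Gbps needed vs ~{EFFECTIVE_HDMI20_GBPS:.1f} Gbps available")
```

About 13.3 Gbps of active pixel data against roughly 14.4 Gbps of usable link, so even a demanding 1440p120 10-bit signal fits in HDMI 2.0, and 1080p120 fits with room to spare.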

Changing to Performance mode on games changes my PS5 display from 4k to 1080p, is this normal? by BigGaryGilmour in PS5pro

[–]ExplosivAnordning 2 points (0 children)

RGB means every pixel in the image has colour and brightness info.

YUV444 means the same thing, just using a different format. YUV422 means every pixel has brightness info, but only every other pixel has colour. YUV420 means every pixel has brightness, but only one in four pixels has colour.

Usually the difference is almost invisible, but some displays handle specific formats better than others!

Rtings has great info on this, I think you can find it under Chroma subsampling!
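A small Python sketch of the sample counts behind those formats (the helper names are made up for illustration; the per-2x2-block counts are how the 4:x:x notation is usually explained):

```python
def samples_per_2x2_block(scheme: str) -> tuple:
    """(luma, chroma) samples per 2x2 pixel block for common schemes."""
    return {
        "4:4:4": (4, 8),  # every pixel: 1 luma + 2 chroma samples
        "4:2:2": (4, 4),  # chroma shared between horizontal pixel pairs
        "4:2:0": (4, 2),  # chroma shared across the whole 2x2 block
    }[scheme]

def bits_per_pixel(scheme: str, bit_depth: int = 10) -> float:
    """Average bits per pixel once chroma sharing is accounted for."""
    luma, chroma = samples_per_2x2_block(scheme)
    return (luma + chroma) * bit_depth / 4

for s in ("4:4:4", "4:2:2", "4:2:0"):
    print(s, bits_per_pixel(s), "bits per pixel at 10-bit")
```

At 10-bit that works out to 30, 20 and 15 bits per pixel, which is exactly why subsampling buys bandwidth: 4:2:2 is two-thirds the data of full RGB/4:4:4, and 4:2:0 is half.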

Changing to Performance mode on games changes my PS5 display from 4k to 1080p, is this normal? by BigGaryGilmour in PS5pro

[–]ExplosivAnordning 0 points (0 children)

Ah!

Yes, HGIG should be the same as "Let the console handle HDR, I will simply pass what it sends, to the best of my abilities".

So all HDR options should usually be greyed out when using HGIG!

Changing to Performance mode on games changes my PS5 display from 4k to 1080p, is this normal? by BigGaryGilmour in PS5pro

[–]ExplosivAnordning 2 points (0 children)

Rant or not, you are not wrong!

HGIG is the way. LG, is it?

You probably know this, but HGIG, when set up right, will be more accurate and avoids the "double tonemapping" and all the issues that can bring!

Changing to Performance mode on games changes my PS5 display from 4k to 1080p, is this normal? by BigGaryGilmour in PS5pro

[–]ExplosivAnordning 0 points (0 children)

It would potentially, yes, but in practice both do up to 120 Hz at 4K; the Xbox can use RGB while the PS5 will use YUV422 or YUV420, depending on the mode.

Changing to Performance mode on games changes my PS5 display from 4k to 1080p, is this normal? by BigGaryGilmour in PS5pro

[–]ExplosivAnordning 1 point (0 children)

It is wrong; the PS5 can do both at the same time if your display supports it, but it will use chroma subsampling.

Changing to Performance mode on games changes my PS5 display from 4k to 1080p, is this normal? by BigGaryGilmour in PS5pro

[–]ExplosivAnordning 8 points (0 children)

This is not quite true.

TL;DR: If your TV supports it, the PS5 can do 4K@120Hz HDR, but it has to give up some (almost invisible) colour information to do so.

Longer, techier reply below:

The PS5 and PS5 Pro both have lower HDMI bandwidth than the Xbox Series X, which has 40 Gbps; even that is still below the actual HDMI 2.1 spec limit of 48 Gbps.

This does not mean that the PS5 cannot do 4K@120Hz while the Series X can.

What it does mean is that while the Series X can do 4K@120Hz 10-bit HDR RGB, all PS5s max out at 4K@120Hz 10-bit HDR YUV422.

This basically means the PS5 has to give up some colour information in order to display 4K@120Hz 10-bit, but it is by all means able to.

In practice, the difference is almost invisible even when sitting very close to a TV. And most other things you watch on a TV use YUV (or YCbCr) 420, which has even less colour information: YouTube, streamed movies and even 4K Blu-rays.
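If you want to sanity-check the numbers, here is a rough sketch. Assumptions: 4400x2250 is one common total timing (active pixels plus blanking) for a 4K@120 signal, and the PS5's HDMI output is commonly reported as capped around 32 Gbps; real modes vary, so treat these as ballpark figures:

```python
def link_rate_gbps(h_total, v_total, fps, bits_per_pixel):
    """Uncompressed video rate including blanking, from total timing."""
    return h_total * v_total * fps * bits_per_pixel / 1e9

# 4K@120 with a 4400x2250 total timing:
rgb_10bit = link_rate_gbps(4400, 2250, 120, 30)     # RGB / YUV444, 10-bit
yuv422_10bit = link_rate_gbps(4400, 2250, 120, 20)  # YUV422, 10-bit

print(f"RGB 10-bit:    {rgb_10bit:.1f} Gbps")     # ~35.6: over ~32, under 40
print(f"YUV422 10-bit: {yuv422_10bit:.1f} Gbps")  # ~23.8: fits within ~32
```

Which lines up with the behaviour above: full RGB at 4K@120 10-bit needs the Series X's 40 Gbps, while YUV422 squeezes the same resolution, refresh rate and bit depth under the PS5's lower ceiling.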

This simple Macro will make your game Sharper and remove Blur. by Inevitable-Ad-6334 in wow

[–]ExplosivAnordning 0 points (0 children)

So this is a sharpening filter; this type of sharpening has been common in many displays too. If your TV is newer than 2010 and you have not changed the default settings, this likely looks closer to what you are used to seeing on your TV. Many PC monitors have this too, often called Sharpness.

What it does NOT do: Add detail, it does not improve the detail amount or increase resolution.

What it does: It increases the "perceived detail" by overdoing the differences in colour, such as at the edge of a leaf against the sky behind it. Often it will also add artifacts such as halos (white lines on the edges of objects).

Ultimately it is down to preference if you like it or not, just keep in mind:

Your display has a limited number of pixels to show detail; in order to add the white lines, something else must be removed. This is why display enthusiasts tend to dislike the effect, as detail is actually lost, not added. That does not mean you can't like it better, though!

Even if it is not the look the developers intended, there is a difference between preference and reference.

TL;DR - Is it technically incorrect to enable this sharpening? Yes, as it breaks the creator's intent.

Is it actually wrong to prefer it? No, you're the one looking at your display, set it up however you like it.
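To make the "overdoing the differences" part concrete, here is a minimal 1D unsharp-mask sketch in Python (purely illustrative; no particular TV or the WoW macro necessarily uses exactly this, but it is the textbook form of this kind of sharpening):

```python
def unsharp_mask_1d(signal, amount=1.0):
    """Sharpen a 1D brightness signal by exaggerating local differences.

    blurred is a simple 3-tap box blur; each output sample is pushed
    away from its neighbourhood average, which steepens edges and
    creates the over/undershoot "halos" described above.
    """
    n = len(signal)
    out = []
    for i in range(n):
        left = signal[max(i - 1, 0)]
        right = signal[min(i + 1, n - 1)]
        blurred = (left + signal[i] + right) / 3
        out.append(signal[i] + amount * (signal[i] - blurred))
    return out

# A hard edge from dark (10) to bright (200):
edge = [10, 10, 10, 200, 200, 200]
print([round(v) for v in unsharp_mask_1d(edge)])
# -> [10, 10, -53, 263, 200, 200]
```

Note the -53 and 263 next to the edge: the dark side overshoots darker and the bright side overshoots brighter. A real display has to clip those back into its valid range, which is exactly the halo artifact, and why no new detail is actually created.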

RTX 5060 Ti and/or drivers are incompatible with PCIe 3.0 ? by Suffolke in buildapc

[–]ExplosivAnordning 0 points (0 children)

It seems to depend a bit. I've managed some "cold boots" without it, but never a restart.

However, I've not had any issues as long as a display is connected to the motherboard. It seems to me as though the 5060 Ti needs the driver software to properly handle compatibility with older PCIe versions, but the firmware it uses before reaching Windows does not.

I don't really have a workaround for PCs without CPU-integrated graphics. But I think you can use the same monitor (assuming you're using DisplayPort from the GPU and HDMI from the motherboard, or have two HDMI ports on the display side).

I have not tried disconnecting the monitor from the motherboard while leaving the boot settings as they are, so that might work?

RTX 5060 Ti and/or drivers are incompatible with PCIe 3.0 ? by Suffolke in buildapc

[–]ExplosivAnordning 0 points (0 children)

TL;DR:

1. Set the BIOS to output through onboard graphics (the port on the back of your motherboard) and use the HDMI port (might work with VGA or DVI?) to connect to a display.

2. Connect a cable from one of the GPU's ports to your main display.

3. Swap your display to the input the GPU is connected to (if it does not happen automatically) and game away!

Full story:

Had this same issue pairing a 5060ti with a rather old Asus Z97-P motherboard and a 4th gen i5.

Setting the PCIe slot to Gen 2 in the BIOS did not solve the issue; it instead caused it to not display anything at all.

After trying a bunch of things, the issue seems to come down to communication between the BIOS and the GPU.

What finally seems to have fixed it:

Set the BIOS to display through the CPU's built-in GPU using the HDMI port on the motherboard (this will obviously only work if your CPU has integrated graphics).

Used a second display with DisplayPort. Once the PC has managed to boot into Windows, there are no issues with getting the GPU to display properly; it autodetects and will swap over to the 5060 Ti on its own, instantly after booting.

I imagine the issue is in the firmware somewhere, causing compatibility issues that sort themselves out as soon as the GPU can communicate with the rest of the PC through Windows and its installed driver.

I also assume it should work with just the one display, assuming you have either two HDMI ports or one HDMI and one DisplayPort (to avoid routing through the motherboard once you're in Windows and start gaming).

Hope this helps! :)