Old 26th December 2016, 11:57   #41743  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 942
Quote:
Originally Posted by e-t172 View Post
A Blu-ray player will typically output YCbCr (directly from the decoded video stream) over HDMI. The TV does the YCbCr → RGB conversion.

Internally, a PC GPU works in RGB only. A PC will typically output RGB over HDMI. The TV just passes it through (hopefully) untouched.

If you enable YCbCr output in your GPU driver settings, the GPU will output YCbCr over HDMI. But internally, it still works in RGB. So what's going to happen is that the GPU will internally convert RGB to YCbCr, then send it over HDMI, then the TV will convert YCbCr back to RGB. The double conversion is pointless, and is likely to degrade quality (especially if there's chroma subsampling going on, or the PC and the TV disagree about which matrix to use).

Putting it all together, here's what happens when using madVR:

GPU driver configured to output RGB: LAV decoder output (YCbCr) → madVR (converts from YCbCr to RGB) → HDMI output (RGB) → TV (RGB)

GPU driver configured to output YCbCr: LAV decoder output (YCbCr) → madVR (converts from YCbCr to RGB) → HDMI output (converts from RGB to YCbCr) → TV (converts from YCbCr to RGB)

Hopefully you can see now that the second configuration doesn't make a ton of sense!
The bolded part above is true in theory (especially with PC monitors) but not correct most of the time in practice with TV/projectors.

If the display has color/hue controls, it has to convert the RGB input signal to YCbCr to apply them, then convert back to RGB to display.

So if you send RGB, what usually happens with a TV/projector is RGB > YCbCr (for the color decoder controls) > RGB.

This means sending RGB results in one extra conversion [EDIT: on the TV side, not overall, see posts below].

The best way to check whether your display converts to YCbCr before displaying in RGB is to try changing color/hue. If those controls are available/active, then the display converts to YCbCr internally even when you send RGB, before converting back to RGB.

This doesn't mean that RGB isn't better for madVR (unless it causes levels issues), and it's definitely what Madshi recommends for madVR; it just doesn't necessarily create the "cleanest" path from a conversion point of view.
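To see why the matrix disagreement e-t172 mentions actually matters, here's a quick sketch (plain Python, helper names are just illustrative, not from any real library) that encodes a pure red with BT.709 luma coefficients and then decodes it with BT.601 coefficients, the classic PC/TV mismatch:

```python
# Sketch of why an RGB -> YCbCr -> RGB round trip is only safe when
# both ends agree on the matrix. Helper names are illustrative.

# Luma coefficients (Kr, Kb) for the two common SD/HD matrices.
BT709 = (0.2126, 0.0722)
BT601 = (0.2990, 0.1140)

def rgb_to_ycbcr(r, g, b, coeffs):
    kr, kb = coeffs
    y = kr * r + (1 - kr - kb) * g + kb * b
    cb = (b - y) / (2 * (1 - kb))
    cr = (r - y) / (2 * (1 - kr))
    return y, cb, cr

def ycbcr_to_rgb(y, cb, cr, coeffs):
    kr, kb = coeffs
    r = y + 2 * (1 - kr) * cr
    b = y + 2 * (1 - kb) * cb
    g = (y - kr * r - kb * b) / (1 - kr - kb)
    return r, g, b

red = (1.0, 0.0, 0.0)

# Same matrix on both ends: the round trip is exact
# (ignoring quantization and subsampling).
roundtrip = ycbcr_to_rgb(*rgb_to_ycbcr(*red, BT709), BT709)
assert all(abs(a - b) < 1e-9 for a, b in zip(roundtrip, red))

# Mismatched matrices: pure red comes back visibly wrong,
# roughly (0.914, -0.105, 0.010) instead of (1, 0, 0).
r, g, b = ycbcr_to_rgb(*rgb_to_ycbcr(*red, BT709), BT601)
print(round(r, 3), round(g, 3), round(b, 3))
```

Real pipelines add 8-bit quantization and 4:2:0 subsampling on top of this, which is why the extra RGB > YCbCr > RGB legs are never entirely free.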


Quote:
Originally Posted by Oguignant View Post
Merry Christmas too!!!

X-tended Dynamic Range PRO increases the quality of HDR and even non-HDR content by revitalizing every scene with the widest range of brightness possible. It's a matter of taste; I prefer not to use any image processing on the TV, as it always makes the picture look artificial.

As for Color Space, you can leave it on Auto if it works correctly. If you force BT.2020 or DCI when viewing BT.709-encoded content, you will see oversaturated colors.

The new video standard for Ultra High Definition content is BT.2020 (colors closer to reality).
The BT.2020 color standard lets you display about 3/4 of the visible color spectrum. The video standard for HD content (BT.709) only reproduces about 30% of the spectrum covered by the new 4K/UHD BT.2020 standard.

DCI is the one that will be implemented the soonest, and it represents the basic color requirement of the HDR spec. To meet the minimum HDR requirements, a TV must be able to display over 90% of the DCI color space.

The main difference between DCI and bt-709 (the current standard color space) is that DCI can display many more tones of green, though there is also a slight expansion to the number of red tones. The number of blue tones was unchanged. Altogether, it covers just over half of the visual spectrum, and will provide a pretty significant increase in picture quality over bt-709, which covers only about 35% of the visual spectrum.

Hope this can help you!
This is incorrect too. DCI isn't, and never will be, used for consumer sources. It's used for cinema sources, and grading monitors today are still calibrated to DCI, but BT2020 is used as a container to distribute consumer content, so for consumers it's either rec-709 (Blu-ray/HDTV) or BT2020 (UHD Blu-ray/UHDTV).

Content at the moment doesn't go further than DCI primaries, but it will progressively get closer to BT2020. Still, from a calibration point of view, DCI is irrelevant. You should never use a DCI mode with consumer content. If you do, the colors won't track accurately: even if the content was mastered on DCI monitors, it was encoded using BT2020 saturations.
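For a rough sense of how much headroom BT2020 leaves beyond DCI-P3 and rec-709, here's a small sketch computing the chromaticity triangle areas in the CIE 1931 xy diagram (shoelace formula), using the published primaries. Bear in mind xy area ratios are only a crude proxy; the perceptually weighted (u'v') figures usually quoted come out differently:

```python
# Rough gamut-size comparison via triangle area in the CIE 1931 xy
# diagram (shoelace formula). xy area is a crude, non-perceptual
# proxy; u'v'-based coverage figures will differ.

PRIMARIES = {
    "BT.709":  [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "DCI-P3":  [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
    "BT.2020": [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
}

def area(tri):
    # Shoelace formula for the area of a triangle from its vertices.
    (x1, y1), (x2, y2), (x3, y3) = tri
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

areas = {name: area(tri) for name, tri in PRIMARIES.items()}
for name, a in areas.items():
    print(f"{name}: xy area {a:.4f}, "
          f"{100 * a / areas['BT.2020']:.0f}% of BT.2020")
```

By this crude measure, P3 fills roughly 70% of the BT2020 triangle and rec-709 roughly half, which is why current P3-limited content still leaves a lot of the BT2020 container unused.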

For example, all current UHD Blu-ray titles report DCI primaries in the metadata, but that only describes the limitations of the grading monitor. It doesn't mean that the content was mastered using DCI primaries: all UHD Blu-ray titles are encoded using BT2020 primaries. The content doesn't reach the full gamut yet (just as, from a luminance point of view, content doesn't reach the theoretical max of 10,000 nits, but tops out around 1,000-2,500 nits at most, even when graded at 4,000 nits).
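The luminance point can be made concrete with the SMPTE ST 2084 (PQ) inverse EOTF, which maps absolute nits onto the 0-1 signal range. A quick sketch (the function name is just illustrative) shows that a 1,000-nit grade already sits at roughly three quarters of the PQ scale, so the jump from 1,000 to 10,000 nits only uses the top quarter of the code range:

```python
# SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance in nits -> [0, 1]
# signal value. Constants are taken from the ST 2084 spec.

M1 = 2610 / 16384           # 0.1593017578125
M2 = 2523 / 4096 * 128      # 78.84375
C1 = 3424 / 4096            # 0.8359375
C2 = 2413 / 4096 * 32       # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_encode(nits):
    # PQ is defined for 0 to 10,000 nits absolute luminance.
    y = nits / 10000.0
    yp = y ** M1
    return ((C1 + C2 * yp) / (1 + C3 * yp)) ** M2

for nits in (100, 1000, 4000, 10000):
    print(f"{nits:>5} nits -> PQ signal {pq_encode(nits):.3f}")
```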
__________________
Win11 Pro x64 b23H2
Ryzen 5950X@4.5GHz 32GB@3600 Zotac 3090 24GB 551.33
madVR/LAV/jRiver/MyMovies/CMC
Denon X8500HA>HD Fury VRRoom>TCL 55C805K

Last edited by Manni; 26th December 2016 at 12:42.