22nd March 2018, 18:47  #49734
Warner306
Quote:
Originally Posted by brazen1

When using YCbCr 4:2:2 10/12-bit @23Hz there is NO banding.
When using YCbCr 4:4:4 10/12-bit @23Hz there IS banding.
When using RGB 4:4:4 12-bit @23Hz there IS banding.
When using RGB 4:4:4 8-bit @23Hz there is NO banding.

So, is it better to lose 10-bit and use RGB 4:4:4 8-bit @23Hz
or
use YCbCr 4:2:2 10/12-bit @23Hz and gain back 10-bit once the 12-bit dithers down, or when outputting 10-bit? I don't mind recalibrating, since YCbCr is limited range only, not full.

Lastly, Allied is a 2160p HDR title. Testing an SDR 1080p title, nothing I try eliminates or even reduces the banding. A good example is 47 Meters Down (2017) at the scene around 42:00.
Bit depths were covered yesterday.

Remember: 10-bit RGB > 8-bit RGB > 10-bit YCbCr 4:2:2 > 10-bit YCbCr 4:2:0. That ranking is a quote from madshi.
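
To make it concrete why a well-dithered 8-bit output can show no banding at all, here is a minimal NumPy sketch (my own illustration, not madshi's actual dithering algorithm) that quantizes a shallow gradient to 8-bit with and without dithering, then measures how well the local average of each result tracks the original:

Code:
import numpy as np

# A shallow horizontal gradient, the kind of content where banding shows up.
width = 1024
gradient = np.linspace(0.40, 0.45, width)  # normalized 0..1 signal

# Naive 8-bit quantization: rounds into ~14 flat steps -> visible bands.
banded = np.round(gradient * 255).astype(np.uint8)

# Dithered 8-bit quantization: add +/- half a code of noise before rounding,
# turning the structured quantization error into unstructured noise.
rng = np.random.default_rng(0)
dithered = np.clip(np.round(gradient * 255 + rng.uniform(-0.5, 0.5, width)),
                   0, 255).astype(np.uint8)

# Compare how closely a local average of each output follows the gradient.
def local_mean(x, k=32):
    return np.convolve(x.astype(float), np.ones(k) / k, mode="valid")

target = local_mean(gradient * 255)
print("local error, no dither:", np.abs(local_mean(banded) - target).mean())
print("local error, dithered: ", np.abs(local_mean(dithered) - target).mean())

The dithered output tracks the source several times more accurately, which is why madVR's dithered 8-bit RGB can look band-free despite the lower bit depth.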

Given the problem your display has with 12-bit HDR, 8-bit and 10-bit are not equal; on your display, 8-bit is actually better, so your choice is easy. And going from RGB in madVR to YCbCr 4:2:2 would destroy the chroma upscaling done by madVR, because the GPU would subsample the chroma right back down to half resolution.
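
To illustrate (my own toy numbers, not madVR internals): madVR upscales chroma to full 4:4:4 resolution, and a 4:2:2 output then keeps only one chroma sample per pair of pixels, so half of that work is simply discarded:

Code:
import numpy as np

# Hypothetical row of Cb samples after madVR's 4:4:4 chroma upscaling.
cb_444 = np.array([100, 102, 105, 109, 114, 120, 127, 135], dtype=np.int16)

# 4:2:2 keeps one chroma sample per pixel pair. Simple decimation shown
# here; some hardware averages pairs instead, but horizontal chroma
# resolution is halved either way.
cb_422 = cb_444[::2]                     # [100, 105, 114, 127]

# The display has to interpolate the missing samples back and cannot
# recover what was thrown away.
cb_rebuilt = np.repeat(cb_422, 2)        # nearest-neighbour reconstruction
print("4:4:4 chroma :", cb_444)
print("after 4:2:2  :", cb_rebuilt)
print("lost detail  :", cb_444 - cb_rebuilt)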

Given that 4:2:2 corrects the HDR issue, maybe your display has trouble with the higher bandwidth of a 12-bit 4:4:4 HDR signal, since RGB vs. YCbCr seems not to matter. It sounds like a poorly implemented HDR mode in the display; if the display itself is creating banding, that's an epic failure. For some reason, Windows OS HDR does a better job for one user: the output from Windows would still be 12-bit HDR, yet it doesn't have banding. That sounds like poor tone mapping or gamut mapping by the display.
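
For a sense of the bandwidth difference, here is a back-of-the-envelope calculation of the active-pixel data rates at 2160p23 (ignoring HDMI blanking and line-coding overhead, so real link rates are higher, but the ratios hold):

Code:
# Rough active-pixel data rates for 3840x2160 @ 23.976 Hz.
width, height, fps = 3840, 2160, 24000 / 1001

def gbps(bits_per_sample, samples_per_pixel):
    return width * height * fps * bits_per_sample * samples_per_pixel / 1e9

print(f"RGB / YCbCr 4:4:4 12-bit: {gbps(12, 3):.2f} Gbit/s")  # 3 samples/px
print(f"YCbCr 4:2:2 12-bit:       {gbps(12, 2):.2f} Gbit/s")  # Y + Cb or Cr
print(f"RGB 4:4:4 8-bit:          {gbps(8, 3):.2f} Gbit/s")

Note that 12-bit 4:2:2 and 8-bit RGB carry exactly the same payload (24 bits per pixel), which is consistent with the theory that the display chokes above a certain data rate.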

Does the 1080p title only band at 10-bit output? It is not uncommon for 1080p Blu-rays to have some banding in the source itself.
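
If you want to verify the banding is baked into the encode, one quick check (a sketch, assuming a lossless screenshot of the decoded frame saved as screenshot.png) is to count the distinct code values across the problem gradient:

Code:
import numpy as np
from PIL import Image

# Hypothetical lossless capture of the decoded frame near the 42:00 mark,
# taken with dithering and other processing disabled.
frame = np.asarray(Image.open("screenshot.png").convert("L"))

strip = frame[500:520, :]   # a strip across the gradient; rows depend on the scene
levels = np.unique(strip)
print("distinct 8-bit levels:", len(levels))
print("values:", levels)

If only a handful of values step across a wide, smooth area, the banding is in the source, and no output format or bit depth will fix it; only a debanding filter can.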
