22nd March 2018, 18:17   #49734
Manni
Registered User
 
Join Date: Jul 2014
Posts: 942
Quote:
Originally Posted by brazen1
I have a couple more things to add since there's keen interest at the moment; I have no intention of dragging this out ad nauseam.

Early drivers offered 8, 10, and 12bit. The 10bit option was removed in recent drivers AFAIK. So now it's just 8bit or 12bit to select.

When using RGB 4:4:4 12bit @23Hz (matching the refresh rate of old faithful, the Allied 2160p HDR test), understand this is a one-shot deal, since it's going to revert to 8bit after a reboot with newer drivers:

Manni has pointed out that his display accepts 12bit and dithers to 10bit correctly with no banding, and that it retains the setting after a reboot. If I'm not mistaken, Manni is also using a custom 3DLUT and not passing HDR through.

It should also be noted that no amount of madVR processing to reduce banding artifacts has any effect at all, nor does the dithering algorithm used.

When using YCbCr 4:2:2 12bit @23Hz there is NO banding.
When using YCbCr 4:4:4 12bit @23Hz there IS banding.
When using RGB 4:4:4 12bit @23Hz there IS banding.
When using RGB 4:4:4 8bit @23Hz there is NO banding.

So, is it better to lose 10bit using RGB 4:4:4 8bit @23Hz
or
use YCbCr 4:2:2 12bit @23Hz and gain back 10bit once the 12bit dithers down? I don't mind recalibrating, since YCbCr is limited range only and there's no RGB.
Nope, this is not what is happening here.

I use MadVR in HDR passthrough mode, so MadVR sends 10bits, the GPU sends 12 (padded or "interpolated"), and the display does no additional dithering because it's a native 12bit projector (12bit path from inputs to panels). So 12bits is actually the best option for me, as my cables are all able to handle the full HDMI 2.0 bandwidth and my display handles 12bits natively.
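To illustrate what "padded" could mean in practice, here's a minimal sketch of one common scheme (my assumption, not necessarily what the NVIDIA driver actually does internally): the 10bit value is shifted up two bits and the top bits are replicated into the new LSBs, so no new information is created and no extra dithering is needed.

Code:
# Minimal sketch (assumption): pad a 10-bit code value into a 12-bit
# container by left-shifting and replicating the two MSBs into the new
# LSBs, so black stays black and full-scale white stays full scale.
def pad_10_to_12(v10: int) -> int:
    """Expand a 10-bit value (0..1023) to 12 bits (0..4095)."""
    if not 0 <= v10 <= 1023:
        raise ValueError("expected a 10-bit code value")
    return (v10 << 2) | (v10 >> 8)

assert pad_10_to_12(0) == 0        # black -> black
assert pad_10_to_12(1023) == 4095  # full white -> full white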

I do not use a 3DLUT for HDR; I pass it through to the display and use custom ST2084 gamma curves to display it properly (the native HDR mode of my model is very poor, so I use the Vertex to disable it and handle the HDR conversion with my own curves). The metadata goes to the Vertex (and is displayed on its OSD for monitoring/testing when necessary), but it's not sent on, in order to prevent the JVC from switching to its crappy HDR mode automatically. The PJ is calibrated to HDR BT2020 when I display 4K23p HDR content. The native HDR mode on more recent JVCs is better than on my model, but still not as good as a few well-designed custom curves (until MadVR's HDR to SDR conversion works better with projectors).
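For reference, the ST2084 (PQ) EOTF that such custom curves are built around maps a normalized code value to absolute luminance. The constants below are the standard SMPTE ST 2084 ones; the display-specific roll-off / tone mapping that the custom curves add on top (which depends on the projector's actual peak light output) is not shown here.

Code:
# SMPTE ST 2084 (PQ) EOTF: normalized code value (0..1) -> luminance in nits.
# Standard constants; no display-specific tone mapping is applied here.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_eotf(code: float) -> float:
    e = code ** (1 / M2)
    return 10000.0 * (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1 / M1)

print(pq_eotf(0.0))  # 0.0 nits (black)
print(pq_eotf(1.0))  # ~10000 nits (reference peak)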

In my case, the best mode for HDR (or SDR) content at 23p is 4K23p RGB 4:4:4 12bits (MadVR dithering to 10bits). For 60p content, it's 4K60p RGB 4:4:4 8bits (MadVR dithering to 8bits). For others, it might be different.

I wouldn't use 4:2:2 unless I had to (poor cables, non-optimal display). I'd rather have MadVR chroma upscaling goodness all the way, but that means cables able to handle 445MHz in 4K23p@12bits and 600MHz in 4K60p@8bits.
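To show roughly where those clock figures come from, here's a back-of-the-envelope sketch assuming the standard CTA-861 2160p rasters (5500x2250 total at 24p, 4400x2250 at 60p) and that the TMDS clock for RGB/4:4:4 scales with bit depth:

Code:
# Rough TMDS clock estimate for RGB/4:4:4 over HDMI, standard CTA-861 timings.
def tmds_clock_mhz(h_total: int, v_total: int, fps: float, bits: int) -> float:
    pixel_clock_hz = h_total * v_total * fps
    return pixel_clock_hz * bits / 8 / 1e6   # RGB clock scales with bit depth

print(tmds_clock_mhz(5500, 2250, 24, 12))  # ~445.5 MHz: 4K24 RGB 12bit
print(tmds_clock_mhz(4400, 2250, 60, 8))   # 594.0 MHz: 4K60 RGB 8bit
# Both sit at or just under HDMI 2.0's 600 MHz TMDS limit, hence the
# emphasis on cables that can handle the full bandwidth.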
__________________
Win11 Pro x64 b23H2
Ryzen 5950X@4.5GHz 32GB@3600 Zotac 3090 24GB 551.33
madVR/LAV/jRiver/MyMovies/CMC
Denon X8500HA>HD Fury VRRoom>TCL 55C805K

Last edited by Manni; 22nd March 2018 at 18:50.