View Single Post
Old 21st March 2018, 19:31   #49680  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,127
Quote:
Originally Posted by brazen1 View Post
Thanks Warner for the further details...
I'm sorry guys. I just can't get my head wrapped around all of this. Here's what I'm struggling to understand:

Installed new driver. RGB 4:4:4 and set it to my native 2160p 8bit 60Hz. Then I switched to 2160p 12bit 23Hz and 24Hz. Then set back to 2160p 8bit 60Hz. Next I played a 2160p 23Hz HDR 10bit title, no FSE. Looked in NCP during playback and it is showing 8bit at 23Hz, as if it ignored my previous command to play 23Hz at 12bit. My display does not show detailed info, so I check info from my Denon AVR. It shows RGB 4:4:4 8bit. To me, I don't think this is correct and why I ask you guys. So, during playback I select 12bit in the NCP. I go back to info from the AVR and it shows RGB 4:4:4 12bit now. I know the title is 10bit, so the AVR info means nothing, I guess? True? Neither does the bit depth setting in NCP? True? And madVR does not report anything beyond what the GPU is sending it? True? So how do I know if my display is outputting 8bit or taking advantage of the higher 10bit depth of an HDR title? Sorry I am so naïve!

To make understanding more difficult, after a reboot that 12bit setting no longer appears in NCP or on my AVR, even though I manually changed it during playback before I rebooted. It's back to 8bit as if I never set it.
I'm not technical enough to answer all of your questions, but I can start. The first scenario, where your AVR is reporting 8-bit even though you selected 12-bit in the NCP, sounds like a driver error. That would be confirmed by the fact that you were able to correct it during playback by changing the bit depth in the NCP. Did this change stick?

Second, you are not taking full advantage of the 10 bits of the source. It could be output at 8 bits with dithering without most users noticing much of a difference. The color space is not clipped; it is all about smoothing gradients, and high-quality dithering makes various output bit depths look smooth. But, of course, you want 10-bit output if your display can support it. Just remember, madVR processes everything at a very high internal bit depth (16 bits), higher than the highest output bit depth (10 bits), so dithering down from that internal precision to any lower output bit depth does not introduce visible errors.
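To show what I mean about dithering, here is a minimal sketch in Python. It is not madVR's actual algorithm (madVR offers ordered dithering and error diffusion); it just illustrates how quantizing a high bit depth signal down to 8 bits with simple TPDF random dithering turns quantization error into fine noise instead of banding:

[code]
import numpy as np

def dither_down(pixels_16bit: np.ndarray, out_bits: int = 8) -> np.ndarray:
    """Quantize 16-bit values to out_bits using TPDF random dithering."""
    shift = 16 - out_bits
    step = 1 << shift                      # size of one output quantization step
    # Triangular (TPDF) noise spanning +/- one output step, centred on zero
    noise = (np.random.random(pixels_16bit.shape) -
             np.random.random(pixels_16bit.shape)) * step
    dithered = pixels_16bit.astype(np.float64) + noise
    quantized = np.clip(np.round(dithered / step), 0, (1 << out_bits) - 1)
    return quantized.astype(np.uint16)

# A 16-bit grayscale ramp: after dithering, the 8-bit result averages back
# to the original smooth gradient instead of showing hard banded steps.
ramp = np.linspace(0, 65535, 3840, dtype=np.uint16)
print(dither_down(ramp, 8)[:10])
[/code]

The same idea applies whether the target is 8 or 10 bits; the lower the output depth, the more the dither noise has to do, which is why 10-bit output is still preferable when the display supports it.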

As far as the GPU output is concerned, I don't know exactly what Nvidia sends to the display. I thought it passed 10-bit through untouched, but it might actually be upconverted to 12 bits. That is beyond my technical acumen.
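For what it's worth, widening 10-bit values into a 12-bit container doesn't lose anything either way. This is only a hypothetical illustration (I can't confirm what the Nvidia driver actually does internally), but a common lossless expansion looks like this:

[code]
def expand_10_to_12(value_10bit: int) -> int:
    """Widen a 10-bit code to 12 bits by replicating the top bits."""
    assert 0 <= value_10bit < 1024
    return (value_10bit << 2) | (value_10bit >> 8)

# Every distinct 10-bit code maps to a distinct 12-bit code,
# and full range (0..1023) maps to full range (0..4095).
codes = [expand_10_to_12(v) for v in range(1024)]
assert len(set(codes)) == 1024 and codes[0] == 0 and codes[-1] == 4095
[/code]

So even if the driver is sending 12 bits over HDMI, the 10-bit source data would arrive at the display intact.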
Warner306 is offline   Reply With Quote