Thanks Warner for the further details...
I'm sorry, guys. I just can't wrap my head around all of this. Here's what I'm struggling to understand:
Installed the new driver. Set RGB 4:4:4 at my native 2160p 8bit 60Hz. Then I switched to 2160p 12bit at 23Hz and 24Hz, then set it back to 2160p 8bit 60Hz. Next I played a 2160p 23Hz HDR 10bit title (no FSE).

During playback, NCP shows 8bit at 23Hz, as if it ignored the 12bit setting I'd made for 23Hz. My display doesn't show detailed info, so I check the info screen on my Denon AVR: it shows RGB 4:4:4 8bit. That doesn't look right to me, which is why I'm asking you guys. So, during playback I select 12bit in NCP, go back to the AVR info, and now it shows RGB 4:4:4 12bit.

I know the title is 10bit, so the AVR info means nothing, I guess? True? Neither does the bit depth setting in NCP? True? And madVR doesn't report anything beyond what the GPU is sending it? True? So how do I know whether my display is getting 8bit or actually taking advantage of the 10bit depth of an HDR title? Sorry I'm so naïve!
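Side note, in case it helps anyone answer: from what I understand, Windows itself can be asked what it thinks the active output depth is, separately from whatever NCP or the AVR OSD show. Below is just a minimal sketch (untested on my setup) using the documented Win32 display-config calls QueryDisplayConfig and DisplayConfigGetDeviceInfo. As far as I know, bitsPerColorChannel is only meaningful when Windows' advanced color/HDR path is active, and this only shows what the OS believes, not what the TV actually receives over HDMI.

Code:
// Minimal sketch: ask Windows what it reports for each active display path,
// including bits per color channel when advanced color (HDR) is enabled.
// Needs Windows 10 1709+ SDK headers; link against user32.lib.
#include <windows.h>
#include <cstdio>
#include <vector>

int main() {
    UINT32 pathCount = 0, modeCount = 0;
    if (GetDisplayConfigBufferSizes(QDC_ONLY_ACTIVE_PATHS,
                                    &pathCount, &modeCount) != ERROR_SUCCESS)
        return 1;

    std::vector<DISPLAYCONFIG_PATH_INFO> paths(pathCount);
    std::vector<DISPLAYCONFIG_MODE_INFO> modes(modeCount);
    if (QueryDisplayConfig(QDC_ONLY_ACTIVE_PATHS, &pathCount, paths.data(),
                           &modeCount, modes.data(), nullptr) != ERROR_SUCCESS)
        return 1;

    for (UINT32 i = 0; i < pathCount; ++i) {
        // Query the advanced-color info block for this path's target (the display).
        DISPLAYCONFIG_GET_ADVANCED_COLOR_INFO info = {};
        info.header.type = DISPLAYCONFIG_DEVICE_INFO_GET_ADVANCED_COLOR_INFO;
        info.header.size = sizeof(info);
        info.header.adapterId = paths[i].targetInfo.adapterId;
        info.header.id = paths[i].targetInfo.id;
        if (DisplayConfigGetDeviceInfo(&info.header) == ERROR_SUCCESS) {
            printf("Display %u: HDR enabled=%d, bits per channel=%u\n",
                   i, (int)info.advancedColorEnabled, info.bitsPerColorChannel);
        }
    }
    return 0;
}

Even so, my understanding is that the AVR info screen is the closest thing to ground truth here, since it reports the actual HDMI signal passing through it.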
To make things even more confusing: after a reboot, that 12bit setting no longer appears in NCP or on my AVR, even though I changed it manually during playback before rebooting. It's back to 8bit as if I never set it.