22nd March 2018, 15:41 | #49767
Warner306
Registered User
Join Date: Dec 2014
Posts: 1,127
Quote:
Originally Posted by BatKnight
I believe that when playing a 2160p 10-bit HDR 23.976fps video with madVR set to output 10 bit, everything is working the best way possible.
What I don't understand is why, when sending the HDR metadata (which results in NV HDR), I get banding, but when not sending the HDR metadata and manually enabling the OS HDR, I get a perfect image with no banding. Why is OS HDR behaving differently than NV HDR with the same settings?
I think the problem here is the metadata. When the OS HDR toggle is enabled, it is my understanding that Windows sends the color gamut and transfer function to the display, but not the HDR metadata. The gamut and transfer function are enough for the display to enter its HDR mode. Tone and gamut mapping are done at the PC level. In fact, in the next update, Windows is releasing a calibration tool to change how HDR looks on your display. This is why HDR is always on.
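
To illustrate the app-side view of that path, here is a minimal sketch (my own illustration, not anything from madVR; it assumes an existing IDXGISwapChain3* on an HDR-capable output): tagging the swap chain with the BT.2020 + PQ color space is all it takes, with no metadata call anywhere.

```cpp
// Sketch (C++/DXGI), illustrative only: with the OS HDR path, signaling the
// gamut (BT.2020) and transfer function (SMPTE ST 2084 / PQ) is sufficient;
// no mastering metadata is forwarded, and tone/gamut mapping stays on the PC.
#include <dxgi1_4.h>

HRESULT EnterHdrColorSpace(IDXGISwapChain3* swapChain) // swapChain assumed to exist
{
    const DXGI_COLOR_SPACE_TYPE hdr10 = DXGI_COLOR_SPACE_RGB_FULL_G2084_NONE_P2020;

    UINT support = 0;
    HRESULT hr = swapChain->CheckColorSpaceSupport(hdr10, &support);
    if (FAILED(hr) || !(support & DXGI_SWAP_CHAIN_COLOR_SPACE_SUPPORT_FLAG_PRESENT))
        return E_FAIL;

    // Gamut + transfer function only; the display enters HDR mode without
    // receiving any per-title HDR10 metadata.
    return swapChain->SetColorSpace1(hdr10);
}
```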

When the Nvidia private API is used, the metadata is passed to the display untouched. The display uses its own processing to complete the tone and gamut mapping, not Windows. This would imply your display has issues processing a 10-bit HDR input; its tone and gamut mapping is not of the highest quality. This would explain why banding does not show for Manni on his display.
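
For contrast, here is a rough sketch of the metadata pass-through path using the HDR interface exposed in nvapi.h (whether madVR calls exactly this is my assumption, and the field units follow my reading of the header, so treat the details as illustrative). The ST 2086 mastering values go to the display as-is, and the display's own processing then does the tone and gamut mapping.

```cpp
// Sketch (C++/NVAPI), illustrative only: pass HDR10 static metadata to the
// display untouched via NvAPI_Disp_HdrColorControl. displayId and all
// metadata values below are placeholders.
#include <nvapi.h>

bool EnableNvHdr(NvU32 displayId)
{
    if (NvAPI_Initialize() != NVAPI_OK)
        return false;

    NV_HDR_COLOR_DATA hdr = {};
    hdr.version = NV_HDR_COLOR_DATA_VER;
    hdr.cmd     = NV_HDR_CMD_SET;
    hdr.hdrMode = NV_HDR_MODE_UHDA; // HDR10
    hdr.static_metadata_descriptor_id = NV_STATIC_METADATA_TYPE_1;

    // SMPTE ST 2086 mastering display data (DCI-P3 primaries here),
    // chromaticities in units of 0.00002 (value = coordinate * 50000).
    hdr.mastering_display_data.displayPrimary_x0 = 34000; // red   (0.680, 0.320)
    hdr.mastering_display_data.displayPrimary_y0 = 16000;
    hdr.mastering_display_data.displayPrimary_x1 = 13250; // green (0.265, 0.690)
    hdr.mastering_display_data.displayPrimary_y1 = 34500;
    hdr.mastering_display_data.displayPrimary_x2 = 7500;  // blue  (0.150, 0.060)
    hdr.mastering_display_data.displayPrimary_y2 = 3000;
    hdr.mastering_display_data.displayWhitePoint_x = 15635; // D65 (0.3127, 0.3290)
    hdr.mastering_display_data.displayWhitePoint_y = 16450;
    hdr.mastering_display_data.max_display_mastering_luminance = 1000; // cd/m2
    hdr.mastering_display_data.min_display_mastering_luminance = 50;   // 0.0001 cd/m2
    hdr.mastering_display_data.max_content_light_level = 1000;         // MaxCLL, cd/m2
    hdr.mastering_display_data.max_frame_average_light_level = 400;    // MaxFALL, cd/m2

    return NvAPI_Disp_HdrColorControl(displayId, &hdr) == NVAPI_OK;
}
```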

I could be wrong, but maybe a display is not always the best place to handle HDR processing? If true, I should delete my posts from yesterday.

Last edited by Warner306; 22nd March 2018 at 15:44.