Thread: Avisynth+
17th April 2018, 05:56   #4039
foxyshadis
Quote:
Originally Posted by `Orum
...but how can a decoder possibly differentiate between a 10-bit HDR clip and a 10-bit clip from the video standards? It seems to me like the standards conflict; e.g. an HDR clip would assume 1023 != 1020, but if decoded as a "standards compliant" clip the white point would be clipped to 1020.

Edit: Granted, none of this matters if you're only using 10-bit to reduce quant error, but it's interesting nonetheless.
The decoder only produces those 940 or 1023 code values; it has no say in what's done afterward. The metadata tells the YUV->RGB converter which formula to use and where to saturate. If the converter sees YUV 990,512,512 (assuming BT.709), it'll still output 255,255,255 or 1023,1023,1023. We call that a blown-out highlight.
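A minimal sketch of that saturate step, assuming a limited-range 10-bit BT.709 source (this is plain Python for illustration, not Avisynth code; the matrix coefficients are the standard BT.709 ones):

Code:
# Limited-range 10-bit BT.709 YUV -> full-range 10-bit RGB, with the
# final clip() being the "saturate" step that blows out super-whites.

def yuv10_to_rgb10_bt709_limited(y, u, v):
    # Normalize limited-range code values: luma 64..940 -> 0..1,
    # chroma 64..960 (centered on 512) -> -0.5..+0.5.
    yn = (y - 64) / (940 - 64)
    un = (u - 512) / (960 - 64)
    vn = (v - 512) / (960 - 64)

    # BT.709 YCbCr -> R'G'B'
    r = yn + 1.5748 * vn
    g = yn - 0.1873 * un - 0.4681 * vn
    b = yn + 1.8556 * un

    # Saturate to 0..1, then rescale to full-range 10-bit.
    clip = lambda x: min(max(x, 0.0), 1.0)
    return tuple(round(clip(c) * 1023) for c in (r, g, b))

print(yuv10_to_rgb10_bt709_limited(940, 512, 512))  # (1023, 1023, 1023) -- reference white
print(yuv10_to_rgb10_bt709_limited(990, 512, 512))  # (1023, 1023, 1023) -- clipped to the same white
Both inputs land on the same RGB white, which is exactly why an SDR converter can't preserve anything above its white point.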

HDR is different because now 990,512,512 can actually mean something useful; it's typically converted to a floating-point value that's scaled against the display's actual white point before being turned into raw RGB for display. A value of 900 may come out as only 600 if the display can produce blinding enough whites. That's why it has to be signaled: there's no way to infer the intended white point from raw pixel values other than by assuming one. Some HDR schemes still saturate at limited range and simply rescale everything within it; others place the Rec.709 white point at the end of the TV range, and everything outside it is special. The former is more common.
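As a hedged illustration of why the display's capabilities matter, here is a sketch using the PQ (SMPTE ST 2084) transfer function as the example; the 10-bit code value 900 and the 600-nit panel are illustrative assumptions, not anything specific to the discussion above:

Code:
# PQ (SMPTE ST 2084) EOTF: code value -> absolute luminance in cd/m^2 (nits).
def pq_eotf(code, bit_depth=10, full_range=True):
    m1 = 2610 / 16384
    m2 = 2523 / 4096 * 128
    c1 = 3424 / 4096
    c2 = 2413 / 4096 * 32
    c3 = 2392 / 4096 * 32

    max_code = (1 << bit_depth) - 1
    e = code / max_code if full_range else (code - 64) / (940 - 64)
    e = min(max(e, 0.0), 1.0)

    p = e ** (1 / m2)
    return 10000.0 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

nits = pq_eotf(900)              # absolute luminance the pixel asks for (~3200 nits)
display_peak = 600.0             # assume a 600-nit panel
shown = min(nits, display_peak)  # naive clip; real displays tone-map more gracefully
print(f"code 900 -> {nits:.0f} nits requested, {shown:.0f} nits shown")
The same code value asks for the same absolute luminance everywhere; what actually reaches the screen depends entirely on the signaled transfer function and the display's peak, which is the part SDR never had to care about.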
