Old 3rd December 2019, 11:08   #1  |  Link
TomArrow
Registered User
 
Join Date: Dec 2017
Posts: 90
How to analyze an HDR video for peak brightness level to set the metadata?

I want to encode an HDR video I made. I'd like to include the global peak brightness metric in the metadata, and maybe the average brightness too. How can I find the brightest pixel in an HDR video (PQ curve) and then calculate its brightness in nits from that value? And how would I go about calculating the average brightness?
Old 4th December 2019, 02:28   #2  |  Link
Blue_MiSfit
Derek Prestegard IRL
 
Blue_MiSfit's Avatar
 
Join Date: Nov 2003
Location: Los Angeles
Posts: 5,988
How did you master your HDR video?

Usually mastering tools output this info or can report on it.
Old 4th December 2019, 12:57   #3  |  Link
TomArrow
Registered User
 
Join Date: Dec 2017
Posts: 90
Well, I had a composition in After Effects in a 32-bit floating-point linear color space that had some clipped highlights. It basically came from a source with high dynamic range already; I just adjusted the exposure to look good in SDR and then decided to keep the superbright details as HDR instead of tone-mapping (curves or such) or clipping them. Then I exported to the Rec. 2100 PQ color space. I checked the result and the highlights weren't clipped, so it worked just fine. But of course that didn't tell me what metadata I need.
Old 5th December 2019, 09:07   #4  |  Link
Blue_MiSfit
Derek Prestegard IRL
 
Blue_MiSfit's Avatar
 
Join Date: Nov 2003
Location: Los Angeles
Posts: 5,988
I'm scratching my head a bit as to why you mastered content this way, but we can sidestep that.

I'll assume you want to encode your content into HDR10 using static metadata.

The key thing to remember here - the HDR10 metadata consists of a few things:

1 - Signaling the transfer function, the color difference matrix (matrix coefficients), and the primaries in the HEVC VUI (you'll likely use SMPTE ST 2084, BT.2020 non-constant luminance, and BT.2020, respectively, since you say you used Rec. 2100 / PQ to export from After Effects)

2 - Signaling the mastering display characteristics, both in terms of the CIE xy chromaticity coordinates of the primaries / white point and the min / max luminance of the display. This is also known as SMPTE ST 2086.

Note that there's nothing here about the maximum light level of the content. There are two additional pieces of metadata you can signal: Max Content Light Level (MaxCLL), which indicates the "hottest" pixel in the whole sequence, and Max Frame Average Light Level (MaxFALL), which indicates the highest frame-average light level. However, these are optional fields and do not need to be signaled. In fact, they have little to no effect in most cases.

So, the only real question is the ST 2086 metadata.
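
For item 1, if you end up encoding with x265, that part is just a few VUI options added to your usual command line; something like this (double-check the exact values against the x265 documentation):

Code:
--output-depth 10 --colorprim bt2020 --transfer smpte2084 --colormatrix bt2020nc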

What type of HDR display did you master the content on? It seems like maybe you didn't master on an HDR display?

Last edited by Blue_MiSfit; 7th December 2019 at 02:30.
Old 9th December 2019, 23:12   #5  |  Link
benwaggoner
Moderator
 
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,750
Quote:
Originally Posted by TomArrow View Post
I want to encode an HDR video I made. I'd like to include the global peak brightness metric in the metadata, and maybe the average brightness too. How can I find the brightest pixel in an HDR video (PQ curve) and then calculate its brightness in nits from that value? And how would I go about calculating the average brightness?
Colorfront Transkoder can do this very trivially.

Sent from my SM-T837V using Tapatalk
__________________
Ben Waggoner
Principal Video Specialist, Amazon Prime Video

My Compression Book
Old 27th December 2019, 09:26   #6  |  Link
TomArrow
Registered User
 
Join Date: Dec 2017
Posts: 90
Quote:
Originally Posted by benwaggoner View Post
Colorfront Transkoder can do this very trivially.

Sent from my SM-T837V using Tapatalk
Thanks, but that looks like an expensive software suite; I just need a trivial analysis.
Old 27th December 2019, 09:31   #7  |  Link
TomArrow
Registered User
 
Join Date: Dec 2017
Posts: 90
Quote:
Originally Posted by Blue_MiSfit View Post
I'm scratching my head a bit as to why you mastered content this way, but we can sidestep that.

I'll assume you want to encode your content into HDR10 using static metadata.

The key thing to remember here - the HDR10 metadata consists of a few things:

1 - Signaling the transfer function, the color difference matrix (matrix coefficients), and the primaries in the HEVC VUI (you'll likely use SMPTE ST 2084, BT.2020 non-constant luminance, and BT.2020, respectively, since you say you used Rec. 2100 / PQ to export from After Effects)

2 - Signaling the mastering display characteristics, both in terms of the CIE xy chromaticity coordinates of the primaries / white point and the min / max luminance of the display. This is also known as SMPTE ST 2086.

Note that there's nothing here about the maximum light level of the content. There are two additional pieces of metadata you can signal: Max Content Light Level (MaxCLL), which indicates the "hottest" pixel in the whole sequence, and Max Frame Average Light Level (MaxFALL), which indicates the highest frame-average light level. However, these are optional fields and do not need to be signaled. In fact, they have little to no effect in most cases.

So, the only real question is the ST 2086 metadata.

What type of HDR display did you master the content on? It seems like maybe you didn't master on an HDR display?
Well, I did it that way because that's the only way I know how to do it. And it's a nice, comfortable, and logical workflow.

Hmm, that's interesting; I thought MaxCLL and MaxFALL were the required ones. Honestly, I'd prefer to just use those. I'm mastering on an SDR display. Entering that SDR display's data will hardly lead to anything reasonable, I think; I do want to display the superbright values, not have them cut off.

I'll have to do some digging to understand this mastering display thing then, I think.

Edit: Is there something like "ideal" mastering display data that I can enter? I realize it might compromise the quality a little bit, but the only other option is for me to simply "steal" the data of some other display, which seems pointless.

For the max luminance of the display I figure I could just use my maximum content light level (MaxCLL), but what about the primaries? What do they do?

Last edited by TomArrow; 27th December 2019 at 09:36.
Old 28th December 2019, 02:23   #8  |  Link
Blue_MiSfit
Derek Prestegard IRL
 
Blue_MiSfit's Avatar
 
Join Date: Nov 2003
Location: Los Angeles
Posts: 5,988
I'm not sure what to enter when you master HDR content on an SDR display. That's generally not done since you don't really know what you're looking at.

You could just try using some arbitrary values - it's typical in Hollywood to master on a 1000 nit display using the P3 D65 color space. You could express this in x265 using this:

https://x265.readthedocs.io/en/defau...master-display

The above example is for a display with a minimum light level of .0001 nits - in practice most content is not mastered that low. .002 is more common.

This may or may not look good - especially if the content looks good on your display in SDR without any LUT applied to map it.

You could also just skip the master-display param - the result would not be HDR10 compliant but may play correctly on some players.
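
If you do include it, spelled out, a master-display string for that P3 D65, 1000 nit max, 0.0001 nit min case would look like this (chromaticity coordinates in 0.00002 units, luminance in 0.0001 cd/m2 units; the max-cll values here are just placeholders for your measured MaxCLL,MaxFALL):

Code:
--master-display "G(13250,34500)B(7500,3000)R(34000,16000)WP(15635,16450)L(10000000,1)" --max-cll "1000,400"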
Old 29th December 2019, 04:45   #9  |  Link
TomArrow
Registered User
 
Join Date: Dec 2017
Posts: 90
Quote:
Originally Posted by Blue_MiSfit View Post
I'm not sure what to enter when you master HDR content on an SDR display. That's generally not done since you don't really know what you're looking at.

You could just try using some arbitrary values - it's typical in Hollywood to master on a 1000 nit display using the P3 D65 color space. You could express this in x265 using this:

https://x265.readthedocs.io/en/defau...master-display

The above example is for a display with a minimum light level of .0001 nits - in practice most content is not mastered that low. .002 is more common.

This may or may not look good - especially if the content looks good on your display in SDR without any LUT applied to map it.

You could also just skip the master-display param - the result would not be HDR10 compliant but may play correctly on some players.
Thanks, looks good! I think I'll try using the primaries of my own display (closer to sRGB) and the luminance values of that display you suggested, or something like that.
Old 29th December 2019, 11:34   #10  |  Link
TomArrow
Registered User
 
Join Date: Dec 2017
Posts: 90
I ended up taking the code of the Average filter and committing horrible crimes to it:
https://github.com/TomArrow/MaxCLLFindAVS

The result is an AVS+ filter that kinda does what I needed: analyzing MaxFALL and MaxCLL of an HDR PQ clip. Read the README for usage, if anyone needs it.
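
The math underneath is just the ST 2084 (PQ) EOTF applied per channel to get nits; roughly this, shown here in Python for illustration (assuming full-range values normalized to 0..1; limited-range data would need to be scaled first):

Code:
# ST 2084 (PQ) EOTF constants
m1 = 2610.0 / 16384.0           # 0.1593017578125
m2 = 2523.0 / 4096.0 * 128.0    # 78.84375
c1 = 3424.0 / 4096.0            # 0.8359375
c2 = 2413.0 / 4096.0 * 32.0     # 18.8515625
c3 = 2392.0 / 4096.0 * 32.0     # 18.6875

def pq_to_nits(e):
    """Normalized PQ code value (e.g. 16-bit sample / 65535.0) -> luminance in cd/m2."""
    p = e ** (1.0 / m2)
    return 10000.0 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1.0 / m1)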

The code is a horrifying mess, but I'm lazy so I can't say if and when I will clean it up. Feel free to do pull requests though.

Edit: The plugin is slow as hell at the moment. I figure it needs to be vectorized, optimized, quadrupolized and whatnot.

Edit 2: Sped it up a little with some caching. Sometimes the textfile doesn't get saved, not sure why.

Last edited by TomArrow; 29th December 2019 at 12:52.
Old 29th December 2019, 14:06   #11  |  Link
kolak
Registered User
 
Join Date: Nov 2004
Location: Poland
Posts: 2,843
Do you use any low-pass filtering?
I think some pro tools use gentle filtering to avoid single pixels being taken into account and dictating the end result (especially when those can be overshoots from compression).

There is also this:
https://github.com/HDRWCG/HDRStaticMetadata
Old 29th December 2019, 18:09   #12  |  Link
TomArrow
Registered User
 
Join Date: Dec 2017
Posts: 90
No, I haven't. I'm not sure whether I agree with the reasoning, because that would also make it impossible to have intentional single bright pixels.

I also haven't done the weighting of the color channels that the repository you linked does. I naively assumed that I could simply average every single 16-bit value (that is not the alpha channel). I should probably read the specification on how to calculate this stuff. I'll clarify this in the README to avoid confusion.

Thanks for linking that repo. I'm happier with mine in the end because creating TIFFs eats up a lot of space that I don't necessarily have.
Old 29th December 2019, 20:01   #13  |  Link
kolak
Registered User
 
Join Date: Nov 2004
Location: Poland
Posts: 2,843
I'm not 100% sure whether it's needed, but I know some tools use it. It also depends on what you use this value for.

I'm not sure about your formula, but there should be a proper, clearly defined formula from SMPTE etc. It should not be based on an assumption.

Look here:
https://spaces.hightail.com/space/nEaXy

there is ZIP with many documents about HDR.
Look in Study Group On High-Dynamic-Range-HDR-Ecosystem.pdf, page 43.

Looks like it's a bit more complex than you may think.

Creating TIFFs is a pain in the xx, for sure.

Last edited by kolak; 29th December 2019 at 20:14.
Old 29th December 2019, 21:33   #14  |  Link
TomArrow
Registered User
 
Join Date: Dec 2017
Posts: 90
Yeah it's fair enough. I suppose anyone can run a blur on the video before passing it into the calculation, heh. Though it's probably fair to point out that blurring in PQ space might give improper results.

Okay, so according to the Study Group document, page 43, I'm doing MaxCLL right. MaxFALL, though, it defines differently than I do. It's not necessarily hard to implement it the way they do, but I don't quite understand their logic. They take the brightest channel of each pixel and average that value across the screen, whereas I take the average of every channel of every pixel across the screen. But the value is supposed to be the average brightness of the frame. How does taking the brightest channel of each pixel and averaging that give me the average brightness of the image? That's just the average brightness of the brightest channels. I'm confused. :P

Are you sure that document is from an official source? It just doesn't seem quite right.

However, it should be trivial to implement if that is indeed how it's supposed to be done. Just a small change, really.
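
If I'm reading it right, the whole thing boils down to something like this (rough Python sketch, names are mine; pq_to_nits() is the ST 2084 EOTF from my earlier post):

Code:
def measure_cll_fall(frames):
    """frames: iterable of frames, each an iterable of (r, g, b) normalized PQ code values.
    Returns (MaxCLL, MaxFALL) in nits per the Study Group document's description (page 43)."""
    max_cll = 0.0
    max_fall = 0.0
    for frame in frames:
        frame_sum = 0.0
        pixel_count = 0
        for (r, g, b) in frame:
            # light level of a pixel = its brightest component, converted to nits
            pixel_nits = pq_to_nits(max(r, g, b))
            max_cll = max(max_cll, pixel_nits)
            frame_sum += pixel_nits
            pixel_count += 1
        # MaxFALL = highest per-frame average of those per-pixel maxima
        max_fall = max(max_fall, frame_sum / pixel_count)
    return max_cll, max_fall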
Old 30th December 2019, 17:43   #15  |  Link
TomArrow
Registered User
 
Join Date: Dec 2017
Posts: 90
Okay, I've updated it to use the official MaxFALL algorithm, so it should give the proper values now.

It's possible there are still some bugs in it of course, as with any software, but judging purely by my logic, it should now give proper readings conforming to the standard, if the SMPTE recommendation is correct.

The old algorithm still exists as optional maxFallAlgorithm=1, but by default it will use the official algorithm.

I tried it on a PQ image I had at hand and the new algorithm gives slightly higher MaxFALL readings, which is to be expected. The difference might be greater or smaller depending on the source material's saturation.
Old 1st January 2020, 18:57   #16  |  Link
Selur
Registered User
 
Selur's Avatar
 
Join Date: Oct 2001
Location: Germany
Posts: 7,259
@TomArrow: https://github.com/HDRWCG/HDRStaticMetadata might be interesting to compare your calculations,...
__________________
Hybrid here in the forum, homepage
Old 1st January 2020, 21:49   #17  |  Link
TomArrow
Registered User
 
Join Date: Dec 2017
Posts: 90
Quote:
Originally Posted by Selur View Post
@TomArrow: https://github.com/HDRWCG/HDRStaticMetadata might be interesting to compare your calculations,...
Could you do it with some test file(s) and then compare results with me using the same files? I don't know how to compile that code.
Old 2nd January 2020, 21:02   #18  |  Link
Selur
Registered User
 
Selur's Avatar
 
Join Date: Oct 2001
Location: Germany
Posts: 7,259
Sorry, I just stumbled over it, thought of this thread, and figured it might be interesting.
__________________
Hybrid here in the forum, homepage
Old 2nd January 2020, 21:39   #19  |  Link
TomArrow
Registered User
 
Join Date: Dec 2017
Posts: 90
Ah, I see. kolak linked it above too. I looked at the code and, from what I can tell, my algorithm is identical to it; however, it's always possible I made some mistake implementing the algorithm, so checking it against another implementation can never hurt. But as I said, I have no idea how to compile it. It only has Linux build instructions and needs several dependencies; I'd need a way to do it in Visual Studio (I'm a C++ noob).
Old 3rd January 2020, 21:21   #20  |  Link
benwaggoner
Moderator
 
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,750
Quote:
Originally Posted by kolak View Post
Do you use any low-pass filtering?
I think some pro tools use gentle filtering to avoid single pixels being taken into account and dictating the end result (especially when those can be overshoots from compression).
The metadata is derived from the uncompressed RGB source, not the compressed output. So even if there were overshoots from compression, that shouldn't impact the metadata at all. The same goes for downscaling and chroma subsampling: even though scaling is a low-pass filter, the metadata is still supposed to be that of the full-resolution uncompressed source.
__________________
Ben Waggoner
Principal Video Specialist, Amazon Prime Video

My Compression Book

Last edited by benwaggoner; 3rd January 2020 at 21:23. Reason: Added detail