Old 19th December 2017, 16:16   #1  |  Link
Atak_Snajpera
RipBot264 author
 
Join Date: May 2006
Location: Poland
Posts: 7,810
Emulating MadVR's HDR to SDR tonemapping using FFMPEG

Sample: http://www.4ktv.de/sony-camp-2016-hdr-version-download/

MadVR - frame 3596 [screenshot]


FFMPEG
Code:
ffmpeg.exe -ss 00:01:00.12 -i Sony_4K_HDR_Camp.mp4 -frames 1 -vf zscale=transfer=linear,tonemap=clip,zscale=transfer=bt709,format=yuv420p c:\users\dave\desktop\ffmpeg.png


Does anybody know how to tweak the default values (param)? I really don't know where those values should go in the command line.
https://ffmpeg.org/ffmpeg-filters.html#tonemap
Old 19th December 2017, 17:25   #2  |  Link
richardpl
Registered User
 
Join Date: Jan 2012
Posts: 271
It is rocket science:

Code:
ffmpeg.exe -ss 00:01:00.12 -i Sony_4K_HDR_Camp.mp4 -frames 1 -vf zscale=transfer=linear,tonemap=tonemap=clip:param=1.0:desat=2:peak=0,zscale=transfer=bt709,format=yuv420p c:\users\dave\desktop\ffmpeg.png
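Same pattern for the other algorithms; just swap the name and the values. These numbers are only an illustration, not tuned for your clip, and it writes over the same test PNG:

Code:
:: illustrative values only; mobius picked as an example, param/desat/peak not tuned
ffmpeg.exe -ss 00:01:00.12 -i Sony_4K_HDR_Camp.mp4 -frames 1 -vf zscale=transfer=linear,tonemap=tonemap=mobius:param=0.3:desat=2:peak=0,zscale=transfer=bt709,format=yuv420p c:\users\dave\desktop\ffmpeg.png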
Old 19th December 2017, 18:42   #3  |  Link
Atak_Snajpera
RipBot264 author
 
Join Date: May 2006
Location: Poland
Posts: 7,810
Thank you!
Old 20th December 2017, 16:06   #4  |  Link
Atak_Snajpera
RipBot264 author
 
Join Date: May 2006
Location: Poland
Posts: 7,810
I have another question: why am I getting this error when I try to work from an .avs script?

SCRIPT
Code:
LoadPlugin("C:\Users\Dave\Documents\Delphi_Projects\RipBot264\_Compiled\Tools\AviSynth plugins\ffms\ffms_latest\x64\ffms2.dll")
video=FFVideoSource("E:\_Video_Samples\mp4\Sony_4K_HDR_Camp.mp4",cachefile = "C:\Temp\RipBot264temp\job1\Sony_4K_HDR_Camp.mp4.ffindex")
return video
Code:
"ffmpeg.exe"  -i "C:\Temp\RipBot264temp\job1\job1.avs" -vf zscale=transfer=linear,tonemap=tonemap=hable:param=1.0:desat=0:peak=10,zscale=transfer=bt709,format=yuv420p C:\Temp\test.y4m


When I use the file directly instead of the script, it works fine:
Code:
"ffmpeg.exe"  -i "E:\_Video_Samples\mp4\Sony_4K_HDR_Camp.mp4" -vf zscale=transfer=linear,tonemap=tonemap=hable:param=1.0:desat=0:peak=10,zscale=transfer=bt709,format=yuv420p C:\Temp\test.y4m
Old 20th December 2017, 16:53   #5  |  Link
dipje
Registered User
 
Join Date: Oct 2014
Posts: 268
Out of curiosity, did you try the other tonemapping algorithms? From the description, 'mobius' seems promising.

About your .avs script failing in the filter chain: I'm _a bit_ stumped... it's yuv420p10le in both cases.

What I'm guessing here (really guessing):
The tonemap filter only accepts float values, so it has to go from yuv420p10le to some floating-point format.
I'm guessing (since yuv444p32f or anything like it is not in my pix_fmts list) that the only filter capable of producing float video is the zscale filter.
And the zscale filter refuses to work if you don't tell it what color matrices and such it should use for the YUV / colorspace conversion.

In your original .mp4 that information is available from the stream (you can see it listed after the yuv420p10le), but the .avs reader in ffmpeg doesn't report any colorspace/color-primaries information, so zscale refuses to work.
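You could check my guess with ffprobe; something like this should show what colour information ffmpeg actually sees for each input (paths copied from your posts, adjust as needed):

Code:
ffprobe -hide_banner -select_streams v:0 -show_entries stream=pix_fmt,color_range,color_space,color_transfer,color_primaries "E:\_Video_Samples\mp4\Sony_4K_HDR_Camp.mp4"
ffprobe -hide_banner -select_streams v:0 -show_entries stream=pix_fmt,color_range,color_space,color_transfer,color_primaries "C:\Temp\RipBot264temp\job1\job1.avs"

If the .avs one reports those fields as unknown, that would match my guess.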
Old 20th December 2017, 21:15   #6  |  Link
Atak_Snajpera
RipBot264 author
 
Join Date: May 2006
Location: Poland
Posts: 7,810
I tried this ...
Code:
ffmpeg.exe -i C:\Temp\RipBot264temp\job1\job1.avs -color_range 1 -color_primaries bt2020 -color_trc smpte2084 -colorspace bt2020nc -vf zscale=transfer=linear,tonemap=tonemap=hable:param=1.0:desat=0:peak=10,zscale=transfer=bt709,format=yuv420p -strict -1 C:\Temp\test.y4m
... and I'm still getting the same error.


Last edited by Atak_Snajpera; 20th December 2017 at 21:18.
Old 21st December 2017, 10:01   #7  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,346
The error is from zimg, not FFmpeg, so it's likely some missing information somewhere.
You can probably override the input information by providing the -color* options before the -i option; anything after -i refers to the output.
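I.e. something along these lines; untested, just to show the placement, and the values are the ones you already used, only moved in front of -i:

Code:
ffmpeg.exe -color_range 1 -color_primaries bt2020 -color_trc smpte2084 -colorspace bt2020nc -i C:\Temp\RipBot264temp\job1\job1.avs -vf zscale=transfer=linear,tonemap=tonemap=hable:param=1.0:desat=0:peak=10,zscale=transfer=bt709,format=yuv420p -strict -1 C:\Temp\test.y4m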
__________________
LAV Filters - open source ffmpeg based media splitter and decoders

Last edited by nevcairiel; 21st December 2017 at 10:03.
Old 21st December 2017, 10:06   #8  |  Link
dipje
Registered User
 
Join Date: Oct 2014
Posts: 268
Nah, that won't work. The -color_range options etc. are for container metadata or something in ffmpeg; the filters never see them. You have to tell the zscale filter itself (remember, I'm still guessing). zscale has rangein/primariesin/matrixin/transferin parameters for that.

BTW, I recompiled my ffmpeg to get tonemap and did a few tests with mobius/hable; so far they're not quite the answer I was hoping for.
Old 21st December 2017, 10:27   #9  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,346
I would generally recommend avoiding AviSynth because of all these issues, however. Even more so with HDR: all the extra HDR metadata is also lost through AviSynth, and I don't think there is even a way to specify it manually (although I don't know whether zscale uses that metadata; if not, it may in the future).
If FFmpeg is allowed to see the input file directly, then it'll all fall into place magically.
__________________
LAV Filters - open source ffmpeg based media splitter and decoders
Old 21st December 2017, 11:03   #10  |  Link
dipje
Registered User
 
Join Date: Oct 2014
Posts: 268
Code:
-vf zscale=tin=smpte2084:min=bt2020nc:pin=bt2020:rin=tv:t=smpte2084:m=bt2020nc:p=bt2020:r=tv,zscale=t=linear,tonemap=tonemap=clip,zscale=t=bt709,format=yuv420p

This works! (Man, that took some time to figure out :S)

With the first 'zscale' I'm basically doing a no-op: I'm telling zscale what kind of input data it is and asking it to convert to 'the same'. But now that information is known in the ffmpeg filter chain, so the next zscale=t=linear works (and converts to gbrpf32le, apparently).

If I do it in a single zscale filter instance, I hit issues: the zscale implementation inside ffmpeg has no way to convert to RGB there, and the linear transfer can only be applied to RGB (a 'matrix=rgb' seems to be missing but needed for this).

I've tested this on a .y4m file, which (just like the AviSynth reader) carries no information about the YUV stream's colorimetry beyond the planar configuration.
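Dropping that chain into the earlier .avs command would look like this; I only tested with a .y4m input, so treat it as an untested sketch for the .avs case:

Code:
ffmpeg.exe -i "C:\Temp\RipBot264temp\job1\job1.avs" -vf zscale=tin=smpte2084:min=bt2020nc:pin=bt2020:rin=tv:t=smpte2084:m=bt2020nc:p=bt2020:r=tv,zscale=t=linear,tonemap=tonemap=clip,zscale=t=bt709,format=yuv420p C:\Temp\test.y4m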

edit: Actually, the mobius and hable tonemapping options (even reinhard) don't look that bad now. With all three of them you clearly see a bit more detail / local contrast in the water around the one-minute mark, for example. I still have the feeling it's a bit too desaturated, though.

Last edited by dipje; 21st December 2017 at 11:18.
Old 21st December 2017, 11:28   #11  |  Link
dipje
Registered User
 
Join Date: Oct 2014
Posts: 268
Think I got the desaturation a bit figured out. You / we never change the color primaries and/or the colormatrix. That means the primaries are still in bt2020. We tonemap the brightness but not much else.

There might be a simpler method, but what I do:
zscale=t=linear,format=gbrpf32le,zscale=p=bt709,tonemap=tonemap=hable,zscale=t=bt709:m=bt709:r=tv

(after the whole 'setting the correct input format' thing).

The first zscale=t=linear turns the YUV data into linear-space RGB (linear space is only a thing in RGB). But the color primaries are still bt2020. (Note: not the color matrix; the color matrix is a YUV-only thing, while the color primaries are basically your colorspace/gamut.)
Then comes format=gbrpf32le to turn it all into floating-point RGB. Only _then_ do we tell zscale to change the primaries to bt709. According to the zimg documentation, changing primaries needs to be done in linear light anyway, and this way we make sure the data is linear first. Because we're going from bt2020 to bt709, very saturated colors start to clip; that's why we convert to floating point first, so the data isn't lost, it's just out of gamut.
Then we do the tonemapping, and because the data now falls outside the 0.0-1.0 range, the filter can do its thing and start desaturating colors where it thinks it's needed.

After the tonemapping, we use zscale to go back to 'the usual' SDR config: bt709 transfer, YUV with the bt709 color matrix (the primaries are already bt709), and limited-range output. You may also want to set zscale's dither option, because we're now asking it to go down to 8 bit. Zscale learns from the filter chain that 'yuv420p' is requested after it and does the conversion itself (I believe this is how it works, i.e. not using the ffmpeg scaler / swscale for this). If you put a 'showinfo' filter between the last zscale and the final 'format=yuv420p', you'll see that the data is already yuv420p after zscale (so I'm guessing zscale is doing the conversion); see the sketch below.
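For example, something like this (reading the .mp4 directly here so the input tagging isn't needed; one frame is plenty, and -f null just throws the output away while showinfo prints the format it receives):

Code:
ffmpeg.exe -i "E:\_Video_Samples\mp4\Sony_4K_HDR_Camp.mp4" -vf zscale=t=linear,format=gbrpf32le,zscale=p=bt709,tonemap=tonemap=hable,zscale=t=bt709:m=bt709:r=tv,showinfo,format=yuv420p -frames:v 1 -f null -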

edit:
In the 'zscale=t=linear' filter you might want to add the npl parameter, so it becomes something like 'zscale=t=linear:npl=100'.
I have the feeling it maps the smpte2084 curve and peak data to something more suited to a 100-nit screen. This pushes a lot of values out of range (float < 0.0 or > 1.0, especially the latter). But that is exactly where the tonemapping filter comes in: deciding what to do with values that are outside the display range.
The 'peak=' parameter of the tonemap filter doesn't seem to have much effect on my side (at least with the hable algorithm), while with 'npl' I can control the final output brightness. The higher the npl value, the darker the final output video (because it's meant for displays that can do higher brightness).
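Just to be clear which knob lives where (npl is a zscale option on the linearising step, peak is a tonemap option; the numbers are only the examples used above):

Code:
zscale=t=linear:npl=100,format=gbrpf32le,zscale=p=bt709,tonemap=tonemap=hable:peak=10,zscale=t=bt709:m=bt709:r=tv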

If I use 'hable', don't set any peak information, and read directly from the Sony .mp4 file, it goes bonkers. I need to set npl=100000 in the first zscale step to get a normal-looking video there.

If I use a .y4m as input and don't set npl, it looks OK, so there are quite some differences in how things are handled. The bt2020nc is for 'non-constant luminance', right? Is there extra metadata in the stream that keeps indicating the maximum luminance for a scene?

Last edited by dipje; 21st December 2017 at 11:47.
Old 8th December 2021, 06:13   #12  |  Link
damian101
Registered User
 
Join Date: Feb 2021
Location: Germany
Posts: 18
Quote:
Originally Posted by dipje
Think I got the desaturation a bit figured out. You / we never change the color primaries and/or the colormatrix. That means the primaries are still in bt2020. We tonemap the brightness but not much else.

There might be a simpler method, but what I do:
zscale=t=linear,format=gbrpf32le,zscale=p=bt709,tonemap=tonemap=hable,zscale=t=bt709:m=bt709:r=tv

(after the whole 'setting the correct input format' thing).

[...]
Don't do this. Primarily because tonemap's desat will clip out-of-gamut pixels to black, which produces ugly artifacts. I also can't quite follow your reasoning; even without desat, I can only think of negative side effects from converting the primaries before tonemapping.

If you want to prevent desaturation, just set desat to 0. That is something I don't recommend, by the way: you always want at least some desaturation for clipped pixels, otherwise some pixels clip to white far too late, or even never when one of the three primary colors is completely missing, regardless of brightness. This can create unrealistic brightness differences, which is far more distracting than desaturation, especially without a reference.

A smaller desat value means more desaturation. Don't go below 1, though; at desat values of 0.2 and lower I encountered very visible desaturation on unclipped pixels as well. The default of 2.0 is fine. I normally go a bit lower to catch some edge cases (I did some encodes with 1.3 and 1.2; with a higher npl, lower desat values tend to be a better idea), and I rarely mind the desaturation in movies, which might be because I tend to use a very high npl (400-700) with mobius, so only extremely bright pixels get desaturated at all.

If you want to prevent desaturation at the cost of less accurate brightness differences, I recommend a higher desat value, 4 for example. I wouldn't go higher than that; there are way too many cases where brighter light sources clip later than less bright ones because of color differences.
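To make that concrete, this is roughly the kind of chain I mean, with the primaries conversion left until after the tonemapping; the numbers (mobius, high npl, mild desat) are just an example from my own encodes, not a recommendation for every source, and the input colorimetry still has to be tagged as discussed earlier in the thread:

Code:
zscale=t=linear:npl=500,format=gbrpf32le,tonemap=tonemap=mobius:desat=1.3,zscale=p=bt709:t=bt709:m=bt709:r=tv,format=yuv420p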
Old 21st December 2017, 13:29   #13  |  Link
Atak_Snajpera
RipBot264 author
 
Join Date: May 2006
Location: Poland
Posts: 7,810
dipje, you are my hero! Thank you very much for your help! I would never have figured that out on my own.

I'm posting an example command line for those who want to convert HDR to SDR and pipe the result to an encoder.

Code:
ffmpeg.exe -loglevel panic -i script.avs -vf zscale=tin=smpte2084:min=bt2020nc:pin=bt2020:rin=tv:t=smpte2084:m=bt2020nc:p=bt2020:r=tv,zscale=t=linear:npl=100,format=gbrpf32le,zscale=p=bt709,tonemap=tonemap=hable:desat=0,zscale=t=bt709:m=bt709:r=tv,format=yuv420p -strict -1 -f yuv4mpegpipe - | x264_x64.exe --stdin y4m --output "C:\video.264" -

Last edited by Atak_Snajpera; 21st December 2017 at 17:03.
Old 9th February 2018, 20:26   #14  |  Link
tyee
Registered User
 
Join Date: Oct 2001
Posts: 416
Quote:
Originally Posted by Atak_Snajpera
dipje, you are my hero! Thank you very much for your help! I would never have figured that out on my own.

I'm posting an example command line for those who want to convert HDR to SDR and pipe the result to an encoder.

Code:
ffmpeg.exe -loglevel panic -i script.avs -vf zscale=tin=smpte2084:min=bt2020nc:pin=bt2020:rin=tv:t=smpte2084:m=bt2020nc:p=bt2020:r=tv,zscale=t=linear:npl=100,format=gbrpf32le,zscale=p=bt709,tonemap=tonemap=hable:desat=0,zscale=t=bt709:m=bt709:r=tv,format=yuv420p -strict -1 -f yuv4mpegpipe - | x264_x64.exe --stdin y4m --output "C:\video.264" -
@Atak_Snajpera - I tried your command line and get this error: x264 [error]: could not open input file `-'. Do you know why? I'm using 32-bit ffmpeg and x264 because I have 32-bit AviSynth installed.
Old 10th October 2018, 09:31   #15  |  Link
SpasV
Guest
 
Posts: n/a
Quote:
Originally Posted by Atak_Snajpera
dipje, you are my hero! Thank you very much for your help! I would never have figured that out on my own.

I'm posting an example command line for those who want to convert HDR to SDR and pipe the result to an encoder.

Code:
ffmpeg.exe -loglevel panic -i script.avs -vf zscale=tin=smpte2084:min=bt2020nc:pin=bt2020:rin=tv:t=smpte2084:m=bt2020nc:p=bt2020:r=tv,zscale=t=linear:npl=100,format=gbrpf32le,zscale=p=bt709,tonemap=tonemap=hable:desat=0,zscale=t=bt709:m=bt709:r=tv,format=yuv420p -strict -1 -f yuv4mpegpipe - | x264_x64.exe --stdin y4m --output "C:\video.264" -
Thank you for the post.
The pipe -f yuv4mpegpipe - | doesn't work, but -f rawvideo - | x265 - does.

The CL generates 3840x2160 rawvideo. If 1920x1080 is needed instead, add one more resizing filter:

format=yuv420p,zscale=s=1920x1080

Thank you for the help with this topic.

There is no need to use script.avs to introduce a source filter; ffmpeg has its own.

Last edited by SpasV; 10th October 2018 at 09:35.
Old 10th October 2018, 10:06   #16  |  Link
sneaker_ger
Registered User
 
Join Date: Dec 2002
Posts: 5,565
Quote:
Originally Posted by SpasV
The pipe -f yuv4mpegpipe - | doesn't work
It's
Code:
-strict -1 -f yuv4mpegpipe - | x265 - --y4m
or
Code:
-strict -1 -f yuv4mpegpipe - | x264 - --demuxer y4m
respectively.

(Atak_Snajpera had the -strict -1 part in his example.)
Old 8th December 2021, 06:57   #17  |  Link
damian101
Registered User
 
Join Date: Feb 2021
Location: Germany
Posts: 18
Quote:
Originally Posted by SpasV
Thank you for the post.
The CL generates 3840x2160 rawvideo. If 1920x1080 is needed instead, add one more resizing filter:

format=yuv420p,zscale=s=1920x1080
Resize before tonemapping to speed tonemapping up a lot.
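For example, with the chain from the quoted command, fold the resize into the first (tagging) zscale instead of appending it at the end; untested sketch:

Code:
zscale=s=1920x1080:tin=smpte2084:min=bt2020nc:pin=bt2020:rin=tv:t=smpte2084:m=bt2020nc:p=bt2020:r=tv,zscale=t=linear:npl=100,format=gbrpf32le,zscale=p=bt709,tonemap=tonemap=hable:desat=0,zscale=t=bt709:m=bt709:r=tv,format=yuv420p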
Old 21st December 2017, 13:40   #18  |  Link
dipje
Registered User
 
Join Date: Oct 2014
Posts: 268
Not all HDR movies are made the same, so you'll still have to check movie by movie, or even scene by scene, how it turns out. There are scenes in the Sony demo, for instance, that I think are too dark; I'm tempted to push up the 'shadows' slider in Lightroom, so to speak.
But other scenes are clearly very bright and the algorithm is working to map them, so I don't really know what to do about that.
Old 21st December 2017, 17:08   #19  |  Link
Atak_Snajpera
RipBot264 author
 
Join Date: May 2006
Location: Poland
Posts: 7,810
It looks like desat=0 works best. No idea why desat is enabled by default.
Quote:
The default of 2.0 is somewhat conservative and will mostly just apply to skies or directly sunlit surfaces. A setting of 0.0 disables this option.
desat = 2 (default) [screenshot]

desat = 0 (disabled) [screenshot]

Another reason why desat should be disabled!

desat = 2 (default) [screenshot]

desat = 0 (disabled) [screenshot]

Last edited by Atak_Snajpera; 21st December 2017 at 17:16.
Old 21st December 2017, 19:16   #20  |  Link
kolak
Registered User
 
Join Date: Nov 2004
Location: Poland
Posts: 2,843
So you're saying that this crazy oversaturated/unreal footage of the guy next to the fire is better?