Old 22nd May 2018, 20:20   #61  |  Link
benwaggoner
Moderator
 
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,750
Quote:
Originally Posted by sneaker_ger View Post
Some specs may even require it. E.g. UltraHD Blu-Ray demands type 2 for all BT.2020 content.
The scary thing is, I don't know whether either encoders or players are actually doing the RGB <-> 4:2:0 conversion with correct chromaloc 2 placement.


It may be that all the UHD HDR stuff actually is using the chromaloc 0 positioning, which works because both encoders and decoders ignore it.


Any error at 2160p isn't likely to be visible; worst case, it would be 25% of the impact the chroma upsampling error (CUE) had at 1080p.

But I get nervous about what might happen if some encoders and/or players do it correctly and some don't, for content at lower resolutions. And no one seems to have a good test pattern for this.
__________________
Ben Waggoner
Principal Video Specialist, Amazon Prime Video

My Compression Book
Old 18th November 2018, 17:45   #62  |  Link
no-one
Registered User
 
Join Date: Dec 2011
Posts: 46
Quote:
Originally Posted by Qarmaa View Post
Problem solved using the latest (non-stable) build. Now video encoded with --uhd-bd muxes in seamlessly and correctly.
I ran into this error too. Can you tell me how to fix it?

Thank you.
Old 19th February 2019, 03:04   #63  |  Link
redbtn
Registered User
 
 
Join Date: Jan 2019
Location: Russia
Posts: 105
chromaloc

Quote:
Originally Posted by benwaggoner View Post
The whole --chromaloc 2 thing is a little suspect to me in general. It's required by UHD-BD, but I don't know that sources are ever converted to that, nor whether decoders/displays correct for it in their YUV-RGB conversion. So you might remove out_h_chr_pos etc. from ffmpeg and remove --chromaloc 2 from x265.
I read this topic and another one (https://forum.doom9.org/showthread.p...41#post1766641) and I still don't understand whether I need --chromaloc 2 or --chromaloc 0 to encode 4K HDR or 4K->1080p HDR.
VapourSynth ClipInfo() shows Chroma Location: Left

VapourSynth Docs:
Quote:
Possible chroma locations (ITU-T H.265 Figure E.1): left, center, top_left, top, bottom_left, bottom
So Left means 0, I think.

I'm confused. What is the right way?
PS: I encode through vspipe.exe --y4m video.vpy -

Last edited by redbtn; 19th February 2019 at 03:06.
Old 19th February 2019, 11:00   #64  |  Link
sneaker_ger
Registered User
 
Join Date: Dec 2002
Posts: 5,565
--chromaloc is only a flag. You need to set it so it matches your content. If your source is chromaloc 2 and you want chromaloc 0 output, you need to filter the content inside VapourSynth accordingly.
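
For illustration only (a sketch, not from the original post): a minimal VapourSynth example of re-siting the chroma during a resize, assuming a type 2 (top_left) source and a type 0 (left) target. Per ITU-T H.265 Figure E.1, chromaloc 0 = left and chromaloc 2 = top_left.

# sketch: downscale and move the chroma siting from top_left (type 2) to left (type 0)
clip = core.resize.Spline36(clip, width=1920, height=816,
                            chromaloc_in_s="top_left", chromaloc_s="left")

The output would then be flagged with --chromaloc 0 in x265.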
Old 19th February 2019, 11:25   #65  |  Link
redbtn
Registered User
 
 
Join Date: Jan 2019
Location: Russia
Posts: 105
Quote:
Originally Posted by sneaker_ger View Post
--chromaloc is only a flag. You need to set it so it matches your content. If your source is chromaloc 2 and you want chromaloc 0 output, you need to filter the content inside VapourSynth accordingly.
Thank you! But unfortunately I do not understand what the chromaloc of my source is. text.ClipInfo() shows Unknown if I just open the source via LWLibavSource; if I resize 2160p to 1080p, it shows Left. How do I correctly determine the chromaloc?

My VS script
Quote:
import vapoursynth as vs
core = vs.core

# requires the L-SMASH-Works (lsmas) and fmtconv (fmtc) plugins
clip = core.lsmas.LWLibavSource(source="source.mkv", format="YUV420P10")
clip = core.std.AssumeFPS(clip, fpsnum=24000, fpsden=1001)
clip = core.std.CropRel(clip=clip, left=0, right=0, top=264, bottom=264)
clip = core.fmtc.resample(clip=clip, kernel="spline64", w=1920, h=816, interlaced=False, interlacedd=False)
clip = core.resize.Bicubic(clip=clip, format=vs.YUV420P10).text.ClipInfo()
clip.set_output()
Source Mediainfo
Quote:
Video
ID : 1
Format : HEVC
Format/Info : High Efficiency Video Coding
Commercial name : HDR10
Format profile : Main 10@L5.1@High
Codec ID : V_MPEGH/ISO/HEVC
Duration : 1 h 58 min
Bit rate : 51.9 Mb/s
Width : 3 840 pixels
Height : 2 160 pixels
Display aspect ratio : 16:9
Frame rate mode : Constant
Frame rate : 23.976 (24000/1001) FPS
Color space : YUV
Chroma subsampling : 4:2:0 (Type 2)
Bit depth : 10 bits
Bits/(Pixel*Frame) : 0.261
Stream size : 42.9 GiB (100%)
Language : English
Default : Yes
Forced : No
Color range : Limited
Color primaries : BT.2020
Transfer characteristics : PQ
Matrix coefficients : BT.2020 non-constant
Mastering display color primaries : Display P3
Mastering display luminance : min: 0.0000 cd/m2, max: 1000 cd/m2
Maximum Content Light Level : 1000 cd/m2
Maximum Frame-Average Light Level : 73 cd/m2

Old 12th March 2019, 17:26   #66  |  Link
blublub
Registered User
 
Join Date: Jan 2015
Posts: 118
Hi

Do I need special switches for HDR besides the ones I already use?

I currently use:
--profile main10
--output-depth 10
--colorprim bt2020

My encoded movies work as HDR on my TV and look really good.

So my primary question is: do I need "--hdr-opt"? It is often referred to on this forum, but mostly in older threads.
Old 12th March 2019, 18:51   #67  |  Link
Blue_MiSfit
Derek Prestegard IRL
 
 
Join Date: Nov 2003
Location: Los Angeles
Posts: 5,988
You should use this, yes. It applies offsets to chroma QPs to improve quality for HDR encoding.

You should really signal --transfer smpte2084 (to signal the PQ curve) and --colormatrix bt2020nc. You should also signal HDR10 static metadata e.g. --master-display and --max-cll if available. This will give your display as much info as possible to present the best possible HDR image.

https://x265.readthedocs.io/en/default/cli.html
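
To illustrate (a sketch only, not a recommendation for any specific title): the --master-display string below is the P3D65 / 1000-nit example from the x265 docs, and the --max-cll numbers are placeholders; both should be replaced with your source's actual mastering metadata.

--output-depth 10 --profile main10 --hdr-opt
--colorprim bt2020 --transfer smpte2084 --colormatrix bt2020nc
--master-display "G(13250,34500)B(7500,3000)R(34000,16000)WP(15635,16450)L(10000000,1)"
--max-cll "1000,400"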
Old 13th March 2019, 10:01   #68  |  Link
blublub
Registered User
 
Join Date: Jan 2015
Posts: 118
Quote:
Originally Posted by Blue_MiSfit View Post
You should use this, yes. It applies offsets to chroma QPs to improve quality for HDR encoding.

You should really signal --transfer smpte2084 (to signal the PQ curve) and --colormatrix bt2020nc. You should also signal HDR10 static metadata e.g. --master-display and --max-cll if available. This will give your display as much info as possible to present the best possible HDR image.

https://x265.readthedocs.io/en/default/cli.html
uuuuh ok

I can easily set:
--transfer smpte2084
--colormatrix bt2020nc
--hdr-opt

But how do I set "--master-display" and "--max-cll"? That looks really complicated.
Old 13th March 2019, 10:27   #69  |  Link
sneaker_ger
Registered User
 
Join Date: Dec 2002
Posts: 5,565
Most people look up the source parameters using MediaInfo and then use those values. For --master-display the values have to be re-calculated, but most sources seem to use the very same color coordinates anyway: MediaInfo will say "Mastering display color primaries : Display P3", and those are exactly the values you find as an example in the x265 docs.
https://x265.readthedocs.io/en/defau...master-display

Also if MediaInfo says "Chroma subsampling : 4:2:0 (Type 2)" set --chromaloc 2.
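
As a worked example (a sketch; recheck against your own source), take the MediaInfo output quoted earlier in this thread: Display P3 primaries, mastering luminance min 0.0000 / max 1000 cd/m2, MaxCLL 1000, MaxFALL 73. With color coordinates in units of 0.00002 and luminance in units of 0.0001 cd/m2, that comes out roughly as:

--master-display "G(13250,34500)B(7500,3000)R(34000,16000)WP(15635,16450)L(10000000,0)"
--max-cll "1000,73"
--chromaloc 2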
Old 13th March 2019, 12:09   #70  |  Link
blublub
Registered User
 
Join Date: Jan 2015
Posts: 118
Quote:
Originally Posted by sneaker_ger View Post
Most people look up the source parameters using MediaInfo and then use those values. For --master-display the values have to be re-calculated, but most sources seem to use the very same color coordinates anyway: MediaInfo will say "Mastering display color primaries : Display P3", and those are exactly the values you find as an example in the x265 docs.
https://x265.readthedocs.io/en/defau...master-display

Also if MediaInfo says "Chroma subsampling : 4:2:0 (Type 2)" set --chromaloc 2.
OK, thx.

I checked some encoded HDR files and the max-cll values are set correctly. The master-display information is also included, but I can't see those values in the ripped original files with MediaInfo - kinda strange.

Last edited by blublub; 13th March 2019 at 12:24.
Old 2nd April 2020, 21:46   #71  |  Link
THU22
Registered User
 
 
Join Date: Sep 2016
Posts: 16
What do --max-cll and --master-display actually do?

I am trying to get into HDR encoding, but it seems a bit complicated.

Basically this is my command line for SDR videos: --crf 18.0 --fps 24000/1001
This gives me essentially perfect quality with 12-bit x265 (dark scenes and fades are perfect too). I do not see the point of using additional settings; I do not care about optimizing the bitrate, and it is very low anyway.
I play videos on my PC (MPC-HC + madVR), so I do not care about compatibility either.

This is what I currently have for HDR: --crf 18.0 --fps 24000/1001 --hdr10 --colorprim bt2020 --transfer smpte2084 --colormatrix bt2020nc
HDR playback works and looks ok (LG OLED C8), but you cannot compare the HDR source and encode like you can with SDR. Can I be missing something by not using max-cll and master-display? What if I use them, but set them wrong?
Old 3rd April 2020, 16:44   #72  |  Link
benwaggoner
Moderator
 
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,750
Quote:
Originally Posted by THU22 View Post
What do --max-cll and --master-display actually do?
--max-cll specifies the nit value of the brightest single pixel (MaxCLL, Max Content Light Level) and the average nit value of the brightest single frame (MaxFALL, Max Frame-Average Light Level).

Master Display encodes the color volume of the display that was used in mastering. The idea is that pixel values outside what the mastering display could show can be clipped, because the colorist couldn't have seen those differences anyway. Higher-end consumer TVs can deliver a lot more brightness than typical grading monitors.
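
For reference, x265 takes the two light-level values as a single comma-separated "MaxCLL,MaxFALL" string; using the numbers from the MediaInfo output quoted earlier in the thread (brightest pixel 1000 nits, brightest average frame 73 nits), that would be roughly:

--max-cll "1000,73"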

Quote:
This is what I currently have for HDR: --crf 18.0 --fps 24000/1001 --hdr10 --colorprim bt2020 --transfer smpte2084 --colormatrix bt2020nc
HDR playback works and looks ok (LG OLED C8), but you cannot compare the HDR source and encode like you can with SDR. Can I be missing something by not using max-cll and master-display? What if I use them, but set them wrong?
Definitely add --hdr10 and --hdr10-opt.

That'll put in null values for the SEI metadata, which tells the TV that the values are undefined, and TVs know what to do with that. It is much better to use null values than to specify incorrect ones!
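
Applied to the command line quoted above, that just means appending --hdr10-opt (a sketch; --hdr10 is already present):

--crf 18.0 --fps 24000/1001 --hdr10 --hdr10-opt --colorprim bt2020 --transfer smpte2084 --colormatrix bt2020nc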
__________________
Ben Waggoner
Principal Video Specialist, Amazon Prime Video

My Compression Book
Old 3rd April 2020, 18:44   #73  |  Link
Blue_MiSfit
Derek Prestegard IRL
 
 
Join Date: Nov 2003
Location: Los Angeles
Posts: 5,988
From what we've seen, the max-cll and master-display params end up doing little to nothing most of the time

You never know what future behavior will be, so I agree with the recommendation to make them exact or null.
Old 3rd April 2020, 19:43   #74  |  Link
THU22
Registered User
 
 
Join Date: Sep 2016
Posts: 16
Thanks for the info.

And can I use the 12-bit x265 encoder with a 10-bit source?

With SDR content, 12-bit encoding provides significantly better results (dark scenes and fades), but I wonder if it can somehow break the accuracy with HDR. 10-bit encoding definitely creates slight artifacts at low and medium bitrates during dark scenes and fades; 12-bit would probably eliminate that, just like with SDR content.
Old 3rd April 2020, 20:10   #75  |  Link
benwaggoner
Moderator
 
Join Date: Jan 2006
Location: Portland, OR
Posts: 4,750
Quote:
Originally Posted by Blue_MiSfit View Post
From what we've seen, the max-cll and master-display params end up doing little to nothing most of the time

You never know what future behavior will be, so I agree with the recommendation to make them exact or null.
Displays with a peak brightness lower than the MaxCLL can adjust the rolloff at the top of the brightness range based on it. So the lower the MaxCLL, the brighter the average frame can be, due to less need for headroom.

I'm not familiar with MaxFALL being used much; it could be used for average power-level adjustments.

Dynamic metadata is much, much, much more useful!
__________________
Ben Waggoner
Principal Video Specialist, Amazon Prime Video

My Compression Book
Old 3rd April 2020, 20:44   #76  |  Link
Blue_MiSfit
Derek Prestegard IRL
 
 
Join Date: Nov 2003
Location: Los Angeles
Posts: 5,988
Quote:
Originally Posted by THU22 View Post
Thanks for the info.

And can I use the 12-bit x265 encoder with a 10-bit source?

With SDR content, 12-bit encoding provides significantly better results (dark scenes and fades), but I wonder if it can somehow break the accuracy with HDR. 10-bit encoding definitely creates slight artifacts at low and medium bitrates during dark scenes and fades; 12-bit would probably eliminate that, just like with SDR content.
You could... but nothing useful would be able to decode it
Old 3rd April 2020, 21:06   #77  |  Link
THU22
Registered User
 
 
Join Date: Sep 2016
Posts: 16
As I said, I only encode videos for personal use and I play them on PC, so compatibility is of no concern. I only care whether it provides accurate results (as it does with SDR content).

On a side note, are hardware decoders not able to decode 12-bit video? It does not seem to require any more power (GPU or CPU) when played on PC.
Old 4th April 2020, 01:00   #78  |  Link
Blue_MiSfit
Derek Prestegard IRL
 
 
Join Date: Nov 2003
Location: Los Angeles
Posts: 5,988
Not generally as far as I'm aware, no.
Old 4th April 2020, 05:56   #79  |  Link
microchip8
ffx264/ffhevc author
 
 
Join Date: May 2007
Location: /dev/video0
Posts: 1,843
Quote:
Originally Posted by THU22 View Post
As I said, I only encode videos for personal use and I play them on PC, so compatibility is of no concern. I only care whether it provides accurate results (as it does with SDR content).

On a side note, are hardware decoders not able to decode 12-bit video? It does not seem to require any more power (GPU or CPU) when played on PC.
I have an NV Shield TV (2019 model) and it can decode 12 bit
__________________
ffx264 || ffhevc || ffxvid || microenc
Old 4th April 2020, 08:52   #80  |  Link
THU22
Registered User
 
 
Join Date: Sep 2016
Posts: 16
Shield uses an NVIDIA GPU, so it is like a PC basically. You can use hardware acceleration on PC with 12-bit video.

But I find it weird if other devices cannot do it.