Old 25th September 2018, 18:45   #52733  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,140
Quote:
Originally Posted by pirlouy View Post
About the screenshot comparisons, I don't know anything about HDR (and I'm not really interested). In the end, the best images are the ones with better contrast, aren't they?
Is it right to say it looks exactly like the "dynamic contrast" option on TVs?
The best images are those that look nearest to how the HDR content would look on a perfect 10,000nits display. Of course it's hard for us to judge because nobody has such a display. But you can get an impression by telling madVR to tone map to e.g. 1,000nits or 4,000nits. The images tone mapped for e.g. 200nits should then look the same, just much brighter. Exactly "the same" is not possible, though, because of limitations in your display technology. Also, it's a bit hard for our eyes/brain to judge, because of the vastly different brightness levels.

Quote:
Originally Posted by el Filou View Post
Maybe a stupid question, but I remember reading on this very forum that we shouldn't use the Sony Camp demo because its mastering metadata was rubbish.
Is there another, correctly encoded, source? Or does that not matter anymore with the newer madVR versions, since they can handle material with incorrect metadata?
It's true that the metadata is broken/missing with this demo. However, a good video renderer still has to be able to handle this situation. Furthermore, madVR now has the ability to actually measure each video frame in the demo, so the metadata isn't strictly needed anymore.
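
For the technically curious, such a per-frame measurement boils down to something like this little Python sketch. It's only an illustration under the assumption that the frame arrives as PQ-encoded float RGB; it's not madVR's actual code, and the function names are made up:

Code:
import numpy as np

# SMPTE ST 2084 (PQ) constants
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_to_nits(e):
    """PQ EOTF: code value (0..1) -> absolute luminance in nits."""
    ep = np.power(e, 1 / M2)
    return 10000 * np.power(np.maximum(ep - C1, 0) / (C2 - C3 * ep), 1 / M1)

def measure_frame(frame_pq_rgb):
    """frame_pq_rgb: float array (H, W, 3), PQ-encoded, 0..1.
    Returns (peak nits, average nits) for this frame, so no mastering
    metadata is needed to drive the tone mapping."""
    nits = pq_to_nits(frame_pq_rgb).max(axis=2)   # per-pixel brightest channel
    return float(nits.max()), float(nits.mean())

fake_frame = np.random.default_rng(0).random((4, 4, 3)) * 0.75   # dummy data
print(measure_frame(fake_frame))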

Quote:
Originally Posted by nevcairiel View Post
Speaking about SDR brightness, how do you guys manage the different brightness between HDR converted to SDR, and actual SDR? Surely one doesn't want the same brightness levels there. Manually? TVs certainly don't have much automation to speak of.
Well, I think many users are actually trying to get their HDR content just as bright as their SDR content. Which is why people are using very very low target nits settings like 100nits in madVR.

Personally, I don't think that's the purpose of HDR. So your question is quite valid. FWIW, the new "report BT.2020 to display" option might help a little, *if* your TV somehow reacts to that. But I suppose most TVs won't.

With my projector, SDR is relatively bright, and HDR is quite a bit darker. But that's fine for me. Of course I could try to make SDR darker to match HDR, but what's the point of that? Our eyes adjust very quickly to different brightness levels, at least in a bat cave like mine...

Quote:
Originally Posted by j82k View Post
I can understand why LG limited the brightness for SDR. There are more and more cases with burn-in and if people could watch CNN or whatever all day with 500+ nits they probably would...
Actually, I think LG probably didn't limit the SDR brightness; instead, they compromise for HDR by using the white subpixels more extensively, which means color accuracy might suffer. At least that's what I've been told by an industry insider.

Quote:
Originally Posted by blu3wh0 View Post
May I also suggest putting the option to send BT.2020 metadata under the HDR options, as probably 99% of videos in this range would be in HDR. I'm sure there are people with a use for it in calibration as well, but that means if I want to use it I would also need to create a profile to separate Rec. 709 and BT.2020. I would think most TVs would handle calibrations for both.
When I added this option I thought about where to put it and first considered the HDR page. But then I thought it would fit better on the calibration page, because if this option switches your TV into different modes, you'll also need different calibration settings/profiles.

Quote:
Originally Posted by jokerb47 View Post
Hello. I have the following bug. When I wake up my monitor by typing my account password to unlock my system (win10x64 1709), and try to resume the playback of a video that was paused, I get only the sound playing; the video plays for a second and then freezes. And only reopening the file helps. I use madVR v0.92.16, but I experienced the same bug with v0.92.14.
I have a script that puts my monitors to sleep when I lock the system. But this problem doesn't manifest if I unlock the system right away, only when I leave it locked for some time.
I'm aware of this problem; it's probably Direct3D erroring out after a long sleep. I don't have a solution for it at this point. Maybe some time in the future.

Quote:
Originally Posted by creativeopinion View Post
Also, my understanding is that changing this HDR module from normal to 'ON' is permanent. Not permanent as in there's no option to go back, but it would affect all content, including non-HDR content, which then means you would have to switch this ON and OFF all the time depending on the content you want to watch. Feel free to correct me if I'm wrong, but this doesn't sound like something I would want to do every single time I switch between watching SDR/HDR content in order to make sure I can have the best quality available.
True. But isn't there a "brightness" slider in the menu which you can set differently for each HDMI input? That way maybe you can turn the brightness down for SDR sources, and use it at full power only for your HTPC?

Quote:
Originally Posted by creativeopinion View Post
I don't know what other OLED users experienced using the 'convert HDR to SDR' option, so it's hard to tell if we can really take advantage of it. I know one other person besides me who experienced the same: going anything above 120 nits produces a picture that is just too dark, while at the same time using 120 nits means you are watching content that is way brighter than intended.
That doesn't seem very logical to me. If you find the image with a setting above 120 nits too dark, then by all means use 120 nits. If it's brighter than intended, then who cares? Doesn't the same logic apply to using HDR passthrough mode? Is the image then just as bright as with 120 nits? If so, isn't that brighter than intended, too?

Quote:
Originally Posted by Warner306 View Post
The SDR gamma curve makes more of the screen brighter, so the chances of burn-in increase.
That doesn't make any sense to me. The gamma curve (PQ or power or BT.709) is only a communication format between HTPC and TV. Both PQ and power/BT.709 gamma should end up as the same data inside your TV, if madVR and the TV are both set up correctly.

The key difference is that sending PQ means your TV will activate its own HDR -> SDR algorithm, while sending power/BT.709 gamma means the TV will deactivate its HDR -> SDR algorithm, and madVR can do that processing instead.
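
To illustrate with a quick Python sketch (not madVR code; the 2.2 power gamma and the 100nits reference white are just example assumptions): both curves are merely invertible encodings, so the TV recovers exactly the same linear pixel data either way.

Code:
import numpy as np

# SMPTE ST 2084 (PQ) constants
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_encode(nits):
    y = (np.asarray(nits, float) / 10000) ** M1
    return ((C1 + C2 * y) / (1 + C3 * y)) ** M2               # nits -> PQ code value

def pq_decode(e):
    ep = np.asarray(e, float) ** (1 / M2)
    return 10000 * (np.maximum(ep - C1, 0) / (C2 - C3 * ep)) ** (1 / M1)

def gamma_encode(nits, peak=100.0, gamma=2.2):
    return (np.asarray(nits, float) / peak) ** (1 / gamma)    # nits -> gamma code value

def gamma_decode(e, peak=100.0, gamma=2.2):
    return peak * np.asarray(e, float) ** gamma

nits = np.array([0.05, 1.0, 18.0, 50.0, 100.0])   # tone mapped pixel luminances
print(np.allclose(pq_decode(pq_encode(nits)), nits))          # True
print(np.allclose(gamma_decode(gamma_encode(nits)), nits))    # True

The only real-world difference is which decoder runs inside the TV, plus the finite bitdepth on the wire, and that's where dithering comes in.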

Quote:
Originally Posted by Warner306 View Post
I don't totally understand why someone with a display as bright as an OLED would want to use an SDR gamma curve to display HDR content. Projector owners champion this configuration, but it works much better when the display has limited brightness. A PQ curve will do a superior job of separating the various image elements with different amounts of brightness, like it was meant to be presented. A bright sun, for example, will stand out more when presented in PQ than SDR. SDR gamma tends to make large portions of the image brighter or darker.
That's all completely incorrect, IMHO.

Quote:
Originally Posted by Warner306 View Post
It is also possible to invite banding when the SDR curve is stretched too far, and you will reduce its accuracy without a proper calibration.
Nope, not true. madVR never ever produces banding. There's no banding even if you lower the display bitdepth to 4bit. Give it a try, if you don't believe me.
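
If you want to convince yourself outside of madVR, here's the principle in a few lines of Python. This is only a toy TPDF dither on a synthetic ramp, not madVR's actual error diffusion / ordered dithering:

Code:
import numpy as np

def quantize(x, bits, dither=False, rng=np.random.default_rng(0)):
    """Quantize values in 0..1 to the given bitdepth, optionally with TPDF dither."""
    levels = 2 ** bits - 1
    noise = (rng.random(x.shape) - rng.random(x.shape)) if dither else 0.0   # +/- 1 LSB
    return np.clip(np.round(x * levels + noise), 0, levels) / levels

ramp = np.linspace(0.20, 0.30, 10000)             # a smooth, low contrast gradient
hard = quantize(ramp, bits=4)                     # truncated: collapses into 2 flat bands
soft = quantize(ramp, bits=4, dither=True)        # noisy, but no banding

# Average small neighbourhoods (roughly what the eye does at viewing distance):
blocks = lambda a: a.reshape(-1, 100).mean(axis=1)
print(np.abs(blocks(hard) - blocks(ramp)).max())  # ~0.03: half a 4bit step, clearly visible bands
print(np.abs(blocks(soft) - blocks(ramp)).max())  # much smaller: the dithered ramp tracks the original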

Quote:
Originally Posted by j82k View Post
Also, I've never seen madVR produce any banding when it wasn't a source problem. Quite the opposite: I think 8-bit with dithering gives smoother gradients than whatever the LG produces when feeding it with 10-bit.


Quote:
Originally Posted by SamuriHL View Post
Yes, the HDR->SDR is definitely too dark for us LG OLED users. 120 NIT target isn't going to fly.
You're saying a 120nits target is too dark? Strangely enough, projector users are actually using *much* higher values on AVSForum. And projectors are much dimmer than OLEDs!

Quote:
Originally Posted by SamuriHL View Post
If I were to make any actual recommendations to other people it would be to leave the service menu alone and use HDR passthrough. Hopefully madshi can fix the current issues with the HDR using pixel math option.
Which current issues are you talking about?

Quote:
Originally Posted by huhn View Post
a 1000 nit HDR image on a 1000 nit display will look the same whether it arrives as an HDR signal with 1000 nits or as an SDR image properly converted from the HDR image.

gamma, PQ and HLG are just different ways to store the data.

the banding part is a myth: a properly dithered image will not show banding. banding is usually produced by the flawed image processing of TVs or other panel related things.
when switching between modes on a TV, the transistors powering the pixels will not magically be able to produce PQ instead of a gamma (or close to gamma) response.
Yes!!

Quote:
Originally Posted by Warner306 View Post
Just look at the PQ curve. It is very flat at the bottom (almost SDR gamma 2.40) and then rises rapidly when the brightness increases. The SDR gamma curve can't replicate this brightness response.
Given enough bitdepth (or dithering): Yes, it can.

Quote:
Originally Posted by Warner306 View Post
That is why the HLG curve was invented. SDR gamma was also replaced with PQ and HLG because tests found it leads to banding when the brightness range is stretched too far.
Please don't confuse encoding/mastering with playback/rendering. Very different things.

For encoding/mastering, dithering doesn't work well. The reason for that is that videos are lossily encoded, and lossy encoders have trouble with noise, grain and dithering. You need to use very high encoding bitrates to make sure that noise, grain and dithering don't get completely lost. And even with high bitrates, that stuff still gets lost to some extent. Because of that, for encoding/mastering it's important to have high bitdepth, and PQ helps there, too, with its more intelligent gamma curve.

However, during playback/rendering, the whole chain is fully lossless. There's no lossy compression going on anywhere in between madVR and TV. So every dithered bit should reach the display untouched. As a result, bitdepth is much less important, because madVR's dithering is of very high quality.

PQ has the benefit (and curse) of defining an exact nits value for each pixel. But there's not a single consumer display (that I know of) which can render BT.2020 10,000nits. So every TV out there has to process the PQ data in such a way that it is fairly represented within the technical limitations of the TV. This process of dumbing the PQ/HDR data down to the capabilities of the TV is usually called "tone mapping". You can let your TV do that, or you can let madVR do it. Which would you rather trust to do this processing at the highest quality?

There's no HDR magic going on in TVs. Sending HDR content untouched as PQ to the TV from madVR doesn't magically turn your TV into an HDR miracle. Your TV will simply convert PQ to whatever internal format it works with. If you let madVR send power/BT.709 gamma to your TV, you simply move the tone mapping from your TV to madVR. That's it. The only difference between an HDR and an SDR display is that the HDR TV has a firmware which can do tone mapping. (Ok, some TVs might switch to an "overdrive" mode when receiving PQ data.)
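
For anyone wondering what that processing roughly looks like, here's a deliberately crude, luminance-only Python sketch. The knee position and the roll-off curve are arbitrary choices for illustration; madVR's actual tone mapping is a lot more sophisticated:

Code:
import numpy as np

def tone_map(nits, display_peak=200.0, knee=0.75):
    """Crude luminance tone mapping: pass everything below the knee through
    unchanged, then roll the highlights off so nothing exceeds the display peak."""
    nits = np.asarray(nits, dtype=float)
    k = knee * display_peak                       # start compressing above this point
    out = nits.copy()
    hi = nits > k
    x = nits[hi] - k
    out[hi] = k + (display_peak - k) * x / (x + (display_peak - k))   # Reinhard style roll-off
    return out

# 10,000nits content squeezed onto a 200nits display:
print(tone_map([10, 100, 400, 1000, 4000, 10000]))
# -> [10. 100. 191.67 197.22 199.36 199.75] (roughly)

Whether this kind of per-pixel mapping runs inside the TV or inside madVR is the whole difference between PQ passthrough and SDR output.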

Quote:
Originally Posted by nevcairiel View Post
Sure, you can prevent banding with dithering, but you do that at the expense of the noise floor.
All these transfer functions exist to move bitdepth around into areas where it matters for the format in question, because bitdepth is ultimately limited.

Gamma/SDR transfer functions are designed for a limited range of brightness, i.e. the SDR brightness, so they spread the bits around for that. If you try to push 1000 nits of brightness into an image transfer designed for ~120 nits or so, there will be a downside to that, because it's not aware of the increased range of brightness.

PQ is designed to keep more bitdepth for the relevant parts (i.e. 0-100 nits or so), and waste fewer bits on the 900 remaining nits, which are of less importance than the actual image detail.

Of course you can mask these differences with dithering, but ultimately a PQ curve is going to give you a better bitdepth distribution for HDR images, which have the majority of image data in a small subsection of the bitdepth range, and only highlights in the remaining 90% of the space - and the transfer should reflect that.
True, but this is much more important for encoding/mastering than for playback. I doubt anybody could see the dithering noise floor in madVR's 10bit output. *Maybe* when pressing your nose against a 3m wide screen. But then, most TV panels can't do 10bit+ natively, anyway. So they will internally dither, thus increasing the noise floor on their own. So when talking about the transport format between HTPC and TV, I doubt PQ has any visible benefit whatsoever, in terms of either banding or noise floor.
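
You can put rough numbers on the bitdepth distribution argument with a few lines of Python. Assumptions here: 10bit full range codes, luminance only, and PQ compared against a plain 2.2 power gamma stretched to a 1,000nits peak, which is the scenario described above:

Code:
import numpy as np

# SMPTE ST 2084 (PQ) constants
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_eotf(e):
    ep = e ** (1 / M2)
    return 10000 * (np.maximum(ep - C1, 0) / (C2 - C3 * ep)) ** (1 / M1)

def gamma_eotf(e, peak=1000.0, gamma=2.2):
    return peak * e ** gamma                      # power gamma stretched to 1000 nits

codes = np.arange(1024) / 1023.0                  # 10bit code values
for target in (0.1, 1.0, 10.0, 100.0, 500.0):     # luminance levels to inspect
    for name, eotf in (("PQ", pq_eotf), ("gamma 2.2 @ 1000", gamma_eotf)):
        y = eotf(codes)
        i = int(np.searchsorted(y, target))       # first code at/above this level
        print(f"{name:>17} near {target:>5} nits: step = {y[i + 1] - y[i]:.4f} nits")

Roughly speaking, PQ spends its codes near black while the stretched gamma spends them near peak white. With proper dithering, though, neither distribution produces visible banding on the HTPC -> TV link.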

Quote:
Originally Posted by j82k View Post
The thing with OLEDs is, they are terrible at near-blacks. Not only the uniformity but also the near-black gradation, so I totally don't mind some dithering noise which hides this ugliness a little bit and also makes the gradation a little smoother. I wish OLEDs would just dither the near-blacks by default like plasmas did. This would solve a lot of problems and only cause a bit of noise.

edit: madshi should implement some algorithm in madVR for OLED owners that does heavy dithering in the near-blacks and less and less on brighter parts of the picture.
I've actually already thought about that! It's hard to develop such an algo, though, without being able to test it with an OLED. And I don't have an OLED atm.
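
Just to illustrate the idea, a purely hypothetical Python sketch; the shaping curve and the strength values are made up and untested on a real panel:

Code:
import numpy as np

def luminance_weighted_dither(img, bits=8, amp_dark=2.0, amp_bright=0.5,
                              rng=np.random.default_rng(0)):
    """Quantize img (float, 0..1) to the given bitdepth with TPDF dither whose
    amplitude (in LSBs) fades from amp_dark near black to amp_bright near white."""
    levels = 2 ** bits - 1
    amp = amp_bright + (amp_dark - amp_bright) * (1.0 - img) ** 2   # heavier in the darks
    tpdf = rng.random(img.shape) - rng.random(img.shape)            # +/- 1 LSB triangular noise
    return np.clip(np.round(img * levels + tpdf * amp), 0, levels) / levels

# Near-blacks get noisy (masking the OLED near-black ugliness), highlights stay clean:
ramp = np.linspace(0.0, 1.0, 8).reshape(1, -1)
print(luminance_weighted_dither(ramp))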