Old 22nd March 2018, 15:41   #49721  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,127
Quote:
Originally Posted by BatKnight View Post
I believe that when playing a 2160p 10-bit HDR 23.976fps video with madVR set up to output 10-bit, everything is going the best way possible.
What I don't understand is why, when sending the HDR metadata (which results in NV HDR), I get banding, yet when not sending the HDR metadata and manually enabling the OS HDR, I get a perfect image with no banding... Why is OS HDR behaving differently than NV HDR for the same settings?
I think the problem here is the metadata. When the OS HDR toggle is enabled, it is my understanding that Windows sends the color gamut and transfer function to the display, but not the metadata. The gamut and transfer function are enough for the display to enter its HDR mode. Tone and gamut mapping are done at the PC level. In fact, in the next update, Windows is releasing a calibration tool to change how HDR looks on your display. This is why HDR is always on.

When the Nvidia private API is used, the metadata is passed to the display untouched. The display uses its own processing to complete the tone and gamut mapping, not Windows. This would imply your display has issues processing a 10-bit HDR input; its tone and gamut mapping is not of the highest quality. This would explain why banding does not show for Manni on his display.

I could be wrong, but maybe the display is not the best at handling HDR processing? If true, I should delete my posts from yesterday.
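
For anyone unsure what that metadata actually is: HDR10's static metadata is just the ST 2086 mastering-display block plus MaxCLL/MaxFALL. A minimal Python sketch of its contents follows; the values and the struct name are made up for illustration, not taken from any disc.

Code:
# Sketch of the HDR10 static metadata that the NV private API forwards and
# that, per the discussion above, the Windows OS HDR path does not send on.
# All values below are illustrative only.
from dataclasses import dataclass

@dataclass
class HDR10StaticMetadata:
    # SMPTE ST 2086 mastering display colour volume
    primaries: tuple          # ((Rx, Ry), (Gx, Gy), (Bx, By)) chromaticities
    white_point: tuple        # (x, y)
    max_mastering_nits: float
    min_mastering_nits: float
    # Content light level info
    max_cll: int              # brightest single pixel in the title, nits
    max_fall: int             # brightest frame-average light level, nits

example = HDR10StaticMetadata(
    primaries=((0.708, 0.292), (0.170, 0.797), (0.131, 0.046)),  # BT.2020
    white_point=(0.3127, 0.3290),                                # D65
    max_mastering_nits=1000.0,
    min_mastering_nits=0.005,
    max_cll=1000,
    max_fall=400,
)
print(example)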

Last edited by Warner306; 22nd March 2018 at 15:44.
Warner306 is offline   Reply With Quote
Old 22nd March 2018, 15:42   #49722  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,127
Quote:
Originally Posted by bran View Post
As I've said in previous posts, there is a lot more noticeable banding when sending "12-bit" to current-gen TVs compared to 10-bit, regardless of whether the source is madVR or a standalone UHD player. So padded zeros or not, the result is more banding.

For example, check out the threads regarding Panasonic's UB700/900 players: people were getting banding until Panasonic released new firmware enabling 10-bit output.

I myself got a lot less banding outputting 10-bit from AMD compared to Nvidia's 12-bit (Samsung JS9505).
But he said most 12-bit input is fine. It is only HDR that is the problem. And 12-bit with the OS HDR toggle is also fine. It has something to do with the HDR tone mapping.

Last edited by Warner306; 22nd March 2018 at 15:46.
Warner306 is offline   Reply With Quote
Old 22nd March 2018, 17:07   #49723  |  Link
brazen1
Registered User
 
Join Date: Oct 2017
Posts: 331
I have a couple more things to add while there is keen interest in this at the moment; I have no intention of dragging the topic out ad nauseam.

Early drivers offered 8, 10, and 12-bit. The 10-bit option was removed from recent drivers for RGB, afaik. So now it's just 8-bit or 12-bit to select.

When using RGB 4:4:4 12-bit @23Hz (matching the refresh rate of old faithful Allied, my 2160p HDR test title), understand this is a one-shot deal, since newer drivers revert to 8-bit after a reboot:

Manni has pointed out that his display accepts 12-bit and dithers to 10-bit correctly with no banding, and that the setting is retained after a reboot. If I'm not mistaken, Manni is also using a custom 3DLUT and not passing HDR through.

It should also be noted that no amount of madVR processing to reduce banding artifacts has any effect at all. Nor does the dithering algorithm used.

When using YCbCr 4:2:2 10/12bit @23Hz there is NO banding.
When using YCbCr 4:4:4 10/12bit @23Hz there IS banding.
When using RGB 4:4:4 12bit @23Hz there IS banding.
When using RGB 4:4:4 8bit @23Hz there is NO banding.

So, is it better to lose 10bit using RGB 4:4:4 8bit @23Hz
or
use YCbCr 4:2:2 10/12-bit @23Hz and gain back 10-bit once the 12-bit dithers down, or when using 10-bit directly? I don't mind recalibrating, since YCbCr is limited only, not full.

At some point, I may work my way backwards through driver versions to find one that still offered 8, 10, and 12-bit RGB choices, so I can select 10 and see if banding is present or if I must still select 8-bit to cure it. I'm not looking forward to doing that anytime soon, though.

Lastly, Allied is a 2160p HDR title. Testing an SDR 1080p title, no amount of anything eliminates or reduces the banding. A good example is 47 Meters Down (2017) at the 42:00 mark.
__________________
HOW TO-Kodi 2D-3D-UHD (4k) HDR Guide Internal & External Players
W11 Pro 24H2 GTX960-4GB RGB 4:4:4 @Matched Refresh Rates 8,10,12bit
KODI 22 MPC-HC/BE 82" Q90R Denon S720W

Last edited by brazen1; 22nd March 2018 at 18:24.
brazen1 is offline   Reply With Quote
Old 22nd March 2018, 18:17   #49724  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 942
Quote:
Originally Posted by brazen1 View Post
I have a couple more things to add while there is keen interest in this at the moment; I have no intention of dragging the topic out ad nauseam.

Early drivers offered 8,10, and 12bit. The 10bit option was removed in recent drivers afaik. So now it's just 8bit or 12bit to select.

When using RGB 4:4:4 12-bit @23Hz (matching the refresh rate of old faithful Allied, my 2160p HDR test title), understand this is a one-shot deal, since newer drivers revert to 8-bit after a reboot:

Manni has pointed out his display accepts 12bit and dithers to 10bit correctly with no banding and retains after a reboot. If I'm not mistaken, Manni is also using a custom 3Dlut and not passing HDR through.

It should also be noted that no amount of madVR processing to reduce banding artifacts has any effect at all. Nor does the dithering algorithm used.

When using YCbCr 4:2:2 12bit @23Hz there is NO banding.
When using YCbCr 4:4:4 12bit @23Hz there IS banding.
When using RGB 4:4:4 12bit @23Hz there IS banding.
When using RGB 4:4:4 8bit @23Hz there is NO banding.

So, is it better to lose 10bit using RGB 4:4:4 8bit @23Hz
or
use YCbCr 4:2:2 12bit @23Hz and gain back 10bit once the 12bit dithers down? I don't mind recalibrating since YCbCr is limited only and no RGB.
Nope, this is not what is happening here.

I use MadVR in HDR passthrough mode, so MadVR sends 10bits, the GPU sends 12 (padded or "interpolated") and the display does no additional dithering because it's a 12bits native projector (12bits path from inputs to panels). So 12bits is actually the best option for me, as my cables are all able to handle the full HDMI 2.0 bandwidth and my display handles 12bits natively.
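
As a side note on the "padded or interpolated" step: expanding 10 bits to 12 bits is lossless either way. I can't confirm whether the driver pads with zero bits or rescales, so the little Python sketch below simply shows both, and that the original 10-bit code survives in each case.

Code:
# Sketch: two common ways to expand a 10-bit code to 12 bits.
# Neither loses information -- the original 10-bit value is recoverable.

def pad_10_to_12(v10: int) -> int:
    """Append two zero bits: 0..1023 -> 0..4092."""
    return v10 << 2

def scale_10_to_12(v10: int) -> int:
    """Rescale to the full 12-bit range: 0..1023 -> 0..4095."""
    return round(v10 * 4095 / 1023)

for v in (0, 1, 512, 1023):
    p, s = pad_10_to_12(v), scale_10_to_12(v)
    # Both expansions map back to the same 10-bit code.
    assert p >> 2 == v and round(s * 1023 / 4095) == v
    print(v, p, s)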

I do not use a 3DLUT for HDR, I pass it through to the display and I use custom ST2084 gamma curves to display it properly (the native HDR mode of my model is very poor, so I use the Vertex to disable it and I handle the HDR conversion with my own curves). The Metadata goes to the Vertex (and is displayed on its OSD for monitoring/testing when necessary) but it's not sent on in order to prevent the JVC from switching to its crappy HDR mode automatically. The PJ is calibrated to HDR BT2020 when I display 4K23p HDR content. The native HDR mode on more recent JVCs is better than on my model, but still not as good as a few well-designed custom curves (until MadVR's HDR to SDR conversion works better with projectors).

In my case, the best mode for HDR (or SDR) content at 23p is 4K23p RGB 4:4:4 12bits (MadVR dithering to 10bits). For 60p content, it's 4K60p RGB 4:4:4 8bits (MadVR dithering to 8bits). For others, it might be different.

I wouldn't use 4:2:2 unless I had to (poor cables, non-optimal display). I'd rather have MadVR chroma upscaling goodness all the way, but that means cables able to handle 445MHz in 4K23p@12bits and 600MHz in 4K60p@8bits.
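
Those clock figures are easy to sanity-check, assuming the standard CTA-861 4K timings (5500x2250 total at 23.976Hz, 4400x2250 total at 60Hz) and that 12-bit RGB scales the TMDS clock by 1.5x; a quick Python check:

Code:
# Rough check of the cable bandwidth figures above (HDMI TMDS clock).
# Assumes standard CTA-861 4K timings; deep colour scales the 8-bit clock.

def tmds_clock_mhz(h_total, v_total, refresh_hz, bits_per_component=8):
    base = h_total * v_total * refresh_hz / 1e6    # 8-bit pixel clock in MHz
    return base * bits_per_component / 8           # 10-bit = x1.25, 12-bit = x1.5

print(round(tmds_clock_mhz(5500, 2250, 23.976, 12)))  # ~445 MHz: 4K23p RGB 12-bit
print(round(tmds_clock_mhz(4400, 2250, 60, 8)))       # ~594 MHz: 4K60p RGB 8-bit ("600MHz")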
__________________
Win11 Pro x64 b23H2
Ryzen 5950X@4.5Ghz 32Gb@3600 Zotac 3090 24Gb 551.33
madVR/LAV/jRiver/MyMovies/CMC
Denon X8500HA>HD Fury VRRoom>TCL 55C805K

Last edited by Manni; 22nd March 2018 at 18:50.
Manni is offline   Reply With Quote
Old 22nd March 2018, 18:47   #49725  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,127
Quote:
Originally Posted by brazen1 View Post

When using YCbCr 4:2:2 10/12bit @23Hz there is NO banding.
When using YCbCr 4:4:4 10/12bit @23Hz there IS banding.
When using RGB 4:4:4 12bit @23Hz there IS banding.
When using RGB 4:4:4 8bit @23Hz there is NO banding.

So, is it better to lose 10bit using RGB 4:4:4 8bit @23Hz
or
use YCbCr 4:2:2 10/12bit @23Hz and gain back 10bit once the 12bit dithers down or when using 10bit? I don't mind recalibrating since YCbCr is limited only and no full.

Lastly, Allied is a 2160p HDR title. Testing an SDR 1080p title, no amount of anything eliminates or reduces banding. A good example is 47 Meters Down 2017 at scene 42:00
Bit depths were covered yesterday.

Remember, 10-bit RGB > 8-bit RGB > 10-bit YCbCr 4:2:2 > 10-bit YCbCr 4:2:0. That is a quote from madshi.

Given the problem your display has with 12-bit HDR, 8-bit and 10-bit are not equal, as 8-bit is actually better. So your choice is easy. Going from RGB in madVR to YCbCr 4:2:2 would destroy the chroma upscaling done by madVR.
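
To make the chroma point concrete: 4:2:2 keeps full-resolution luma but throws away every second chroma sample per row before the TV ever sees it, so whatever madVR's chroma upscaler reconstructed is partly discarded again. A toy Python sketch with made-up values:

Code:
# Toy sketch: what 4:2:2 does to a row of full-resolution (4:4:4) chroma.
# The reconstruction here is simple nearest-neighbour; real displays use
# fancier filters, but the discarded detail is gone either way.

row_cb = [100, 200, 100, 200, 100, 200, 100, 200]    # fine chroma detail (4:4:4)

sent = row_cb[::2]                            # 4:2:2 link: every other chroma sample
rebuilt = [c for c in sent for _ in (0, 1)]   # display doubles them back up

print(sent)      # [100, 100, 100, 100]
print(rebuilt)   # [100, 100, 100, 100, 100, 100, 100, 100] -- the detail is lost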

Given that 4:2:2 corrects the HDR issue, maybe your display has trouble with the higher bandwidth of a 12-bit 4:4:4 HDR signal, since RGB vs. YCbCr seems not to matter. It sounds like a poorly-implemented HDR mode in the display. Perhaps an epic failure by the display if banding is created. For some reason, for one user, Windows OS HDR does a better job. The output from Windows would still be 12-bit HDR, but it doesn't have banding. That sounds like poor tone mapping or gamut mapping.

Does the 1080p title only band at 10-bits? It is not uncommon for 1080p Blu-rays to have some banding in the source.

Last edited by Warner306; 22nd March 2018 at 18:52.
Warner306 is offline   Reply With Quote
Old 22nd March 2018, 19:00   #49726  |  Link
brazen1
Registered User
 
Join Date: Oct 2017
Posts: 331
Got it. I made the wrong assumption about you and a 3DLUT. I'm not familiar with the Vertex; I assume it's some sort of expensive additional device added to your HDMI chain to overcome some sort of problem? Anyway, I don't use a Vertex, I don't think I need one, and I don't think most people use one. I'm just the common, regular ol' user with madVR, an AVR, and a display. I too would like madVR goodness all the way. I'm pretty sure my display is optimal, and my cables are of little importance since none of them are longer than 6', but maybe I'm mistaken. I'm not smart enough to know whether what works for your setup should also work for mine, even though my setup is more primitive, or perhaps simply has no need of extra external hardware like yours. From the expert knowledge I see here, a general rule of thumb is to always use RGB Full, since Limited adds extra processing. Except now I'm faced with a decision where I have no expertise and must rely on others here, including you. Which is preferred: RGB Full at 8-bit only, or YCbCr Limited at 10-bit? I know what you use; I am asking about my own setup. It comes down to accepting banding, living without 10-bit, or switching to YCbCr Limited. I have no idea which of those three choices is the wisest.

Warner, let me digest all that. I need to run a few tests and look up some specs. Thank you.
__________________
HOW TO-Kodi 2D-3D-UHD (4k) HDR Guide Internal & External Players
W11 Pro 24H2 GTX960-4GB RGB 4:4:4 @Matched Refresh Rates 8,10,12bit
KODI 22 MPC-HC/BE 82" Q90R Denon S720W

Last edited by brazen1; 22nd March 2018 at 19:04.
brazen1 is offline   Reply With Quote
Old 22nd March 2018, 19:03   #49727  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 942
Quote:
Originally Posted by brazen1 View Post
Got it. I made the wrong assumption about you and a 3DLUT. I'm not familiar with the Vertex; I assume it's some sort of expensive additional device added to your HDMI chain to overcome some sort of problem? Anyway, I don't use a Vertex, I don't think I need one, and I don't think most people use one. I'm just the common, regular ol' user with madVR, an AVR, and a display. I too would like madVR goodness all the way. I'm pretty sure my display is optimal, and my cables are of little importance since none of them are longer than 6', but maybe I'm mistaken. I'm not smart enough to know whether what works for your setup should also work for mine, even though my setup is more primitive, or perhaps simply has no need of extra external hardware like yours. From the expert knowledge I see here, a general rule of thumb is to always use RGB Full, since Limited adds extra processing. Except now I'm faced with a decision where I have no expertise and must rely on others here, including you. Which is preferred: RGB Full at 8-bit only, or YCbCr Limited at 10-bit? I know what you use; I am asking about my own setup. It comes down to accepting banding, living without 10-bit, or switching to YCbCr Limited. I have no idea which of those three choices is the wisest.
RGB Full 4:4:4 8bits with MadVR dithering to 8bits in your case. No question.

You don't need a Vertex, and most people don't need one either. It's an advanced diagnostic/testing/HDMI problem solving tool. I was simply explaining my chain as you stated quite a few wrong assumptions.
__________________
Win11 Pro x64 b23H2
Ryzen 5950X@4.5Ghz 32Gb@3600 Zotac 3090 24Gb 551.33
madVR/LAV/jRiver/MyMovies/CMC
Denon X8500HA>HD Fury VRRoom>TCL 55C805K

Last edited by Manni; 22nd March 2018 at 19:05.
Manni is offline   Reply With Quote
Old 22nd March 2018, 20:18   #49728  |  Link
el Filou
Registered User
 
el Filou's Avatar
 
Join Date: Oct 2016
Posts: 896
Quote:
Originally Posted by mclingo View Post
and native 23,976
But that can still be achieved with custom resolutions, while there is just no way to output 10 bpc over HDMI. I just don't get the reasoning behind allowing 12 bpc and not 10 bpc.
__________________
HTPC: Windows 10 22H2, MediaPortal 1, LAV Filters/ReClock/madVR. DVB-C TV, Panasonic GT60, Denon 2310, Core 2 Duo E7400 oc'd, GeForce 1050 Ti 536.40
el Filou is offline   Reply With Quote
Old 22nd March 2018, 20:34   #49729  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
it should still be possible to force nvidia to output 10 bit with a custom edid.

and seriously, 12-bit processing is not rocket science at all. even an old cheap Philips TV was able to do 16-bit (driver) / 12-bit (GPU) and showed no banding at all...
huhn is offline   Reply With Quote
Old 22nd March 2018, 21:09   #49730  |  Link
bran
Registered User
 
Join Date: Jun 2009
Location: Stockholm
Posts: 28
Quote:
Originally Posted by huhn View Post
and seriously 12 bit processing is not rocket science at all. even an old cheap Phillips TV was able to do 16 bit driver 12 bit GPU and showed no banding at all...
For myself, and from what I've gathered from various threads, the banding happens in UHD HDR titles with 12-bit enabled.
__________________
HTPC: Samsung 65JS9505, Yamaha A2070, Sonus Faber, RX580
bran is offline   Reply With Quote
Old 22nd March 2018, 21:51   #49731  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 942
Quote:
Originally Posted by bran View Post
For myself, and what I've gathered from various threads - the banding happens in UHD HDR titles with 12bit enabled.
Probably because the displays are not native 12-bit, or even 10-bit, displays.
__________________
Win11 Pro x64 b23H2
Ryzen 5950X@4.5Ghz 32Gb@3600 Zotac 3090 24Gb 551.33
madVR/LAV/jRiver/MyMovies/CMC
Denon X8500HA>HD Fury VRRoom>TCL 55C805K
Manni is offline   Reply With Quote
Old 22nd March 2018, 21:56   #49732  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,127
Quote:
Originally Posted by bran View Post
For myself, and what I've gathered from various threads - the banding happens in UHD HDR titles with 12bit enabled.
How come the one user has no problems with Windows OS HDR at 12-bits? It must be the tone mapping done by the display.
Warner306 is offline   Reply With Quote
Old 22nd March 2018, 21:57   #49733  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,127
Quote:
Originally Posted by Manni View Post
Probably because the displays are not native 12bits or even 10bits displays.
But they don't struggle with 12-bit input if the source is not HDR.

This is what I wrote in my setup manual. Someone correct me if I'm wrong.

HDR metadata conversion involves:
Tone Mapping: Compressing highlights to fit the peak luminance of the display;
Gamut Mapping: Mapping the DCI-P3 or Rec. 2020 primaries to the display's visible colors;
Gamma Transfer: Decoding the SMPTE 2084 HDR (PQ) transfer function to the display EOTF.

One of those processes is failing at 12-bits. The display is supposed to know its own peak luminance and color gamut, so it shouldn't fail at this.
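
For reference, the ST 2084 (PQ) decode itself is a fixed formula; it's the tone mapping layered on top that varies per display. Here is a minimal Python sketch: the PQ constants come from the ST 2084 spec, but the roll-off curve and the 700-nit peak are made-up illustrations, not any TV's (or madVR's) actual algorithm.

Code:
# Sketch: decode a PQ code value to absolute nits, then compress it to fit a
# display peak. The tone curve below is deliberately simple, for illustration only.

M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_to_nits(e: float) -> float:
    """SMPTE ST 2084 EOTF: normalized code value (0..1) -> luminance in cd/m2."""
    p = e ** (1 / M2)
    return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

def tone_map(nits: float, display_peak: float = 700.0) -> float:
    """Toy highlight roll-off: pass through below the knee, then compress
    everything above it asymptotically toward the display peak."""
    knee = 0.5 * display_peak
    if nits <= knee:
        return nits
    excess, headroom = nits - knee, display_peak - knee
    return knee + headroom * excess / (excess + headroom)

for code in (0.0, 0.5, 0.75, 1.0):
    nits = pq_to_nits(code)
    print(f"PQ {code:.2f} -> {nits:7.1f} nits mastered -> {tone_map(nits):6.1f} nits displayed")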

Last edited by Warner306; 22nd March 2018 at 22:08.
Warner306 is offline   Reply With Quote
Old 22nd March 2018, 22:02   #49734  |  Link
j82k
Registered User
 
Join Date: Jun 2017
Posts: 155
Yeah, the whole 10/12-bit marketing thing seems like a big scam to me. Looking at test patterns, my 5-year-old monitor can produce better gradients than my 2016 LG OLED.
j82k is offline   Reply With Quote
Old 22nd March 2018, 22:17   #49735  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 942
Quote:
Originally Posted by j82k View Post
Yeah the whole 10/12 bit marketing thing seems like a big scam to me. Looking at test pattern, my 5 year old monitor can produce better gradients than my 2016 LG Oled.
That's because people don't distinguish between a display able to accept 10/12 bits and a display able to actually reproduce 10/12 bits.

One of the many things marketing people love to exploit.

Many displays on the market accept 12 bits but have 8-bit panels. So what happens in that case is that you ask the source to send 10 bits (or more), the display accepts it, then dithers down to its native bit depth (most often 8 bits or less).

In that case, having MadVR dither to 8bits will most likely produce much better results.

Is this horse dead enough, or does it still need some beating?
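
A quick illustration of why dithering at the source beats letting the panel simply truncate. This uses plain random (noise) dithering as a crude stand-in for madVR's error diffusion, which is more sophisticated, but the principle is the same:

Code:
# Sketch: reducing a smooth 10-bit ramp to 8 bits. Plain truncation collapses
# every 4 input codes onto one output code (visible bands); adding noise before
# rounding trades those bands for fine grain whose average still tracks the ramp.
import random

ramp10 = list(range(1024))                    # smooth 10-bit gradient

truncated = [v >> 2 for v in ramp10]          # drop 2 bits: hard steps
dithered = [min(255, max(0, int(v / 4 + random.random()))) for v in ramp10]

print(truncated[100:110])   # [25, 25, 25, 25, 26, 26, 26, 26, 27, 27]
print(dithered[100:110])    # varies per run, but averages 25.0, 25.25, 25.5, ...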
__________________
Win11 Pro x64 b23H2
Ryzen 5950X@4.5Ghz 32Gb@3600 Zotac 3090 24Gb 551.33
madVR/LAV/jRiver/MyMovies/CMC
Denon X8500HA>HD Fury VRRoom>TCL 55C805K
Manni is offline   Reply With Quote
Old 22nd March 2018, 22:20   #49736  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 942
Quote:
Originally Posted by Warner306 View Post
But they don't struggle with 12-bit input if the source is not HDR.
Most manufacturers are still learning how to reproduce HDR10, so this is not surprising.

As I said, on my display the HDR mode is not usable. It has to be disabled to get proper results.

Again, the key is to assess the actual capability of the panels, not just the input, and send what's appropriate after making sure that it is processed well once it reaches the display.

There are lots of ways to achieve this, but they involve test patterns and calibration, which most people can't be bothered with. It's these people that marketing departments target.
__________________
Win11 Pro x64 b23H2
Ryzen 5950X@4.5Ghz 32Gb@3600 Zotac 3090 24Gb 551.33
madVR/LAV/jRiver/MyMovies/CMC
Denon X8500HA>HD Fury VRRoom>TCL 55C805K
Manni is offline   Reply With Quote
Old 22nd March 2018, 22:24   #49737  |  Link
brazen1
Registered User
 
Join Date: Oct 2017
Posts: 331
madVR has a native display bitdepth adjustment. We can select 1-9 and the next option is 10 or better.

What if madVR had dedicated settings for 10, 11, 12, and so on? That way, no matter what the GPU sent (12 in this case), it would adhere to the display bit depth limit we select, in this case 10.

'auto' still selects what the GPU is sending (12) and some of us 10bit native display owners are encountering epic failures because of poorly implemented display modes that don't handle 12bit.

I'd like to enjoy the reasons I purchased HDR equipment: to play my native 10-bit sources at 10-bit, giving me 1024 gradation levels per color. To suggest I settle for 8-bit's 256 levels and then blame it on this, that, and the other thing isn't very encouraging.

I don't count on drivers producing 10-bit RGB; evidently it's missing for a reason. I'm not going to upgrade to a 12-bit native display (this probably seals my fate). I'd have done that in the first place, but since I had zero interest in what I anticipated would be a very limited number of 12-bit Dolby Vision titles, I stuck with common 10-bit HDR10 titles. Now this, that, and the other are affecting me.

Maybe madshi will consider separate 10-bit and 12-bit settings, if that's even possible. Knowing madshi, he has probably already considered it, it wouldn't provide any benefit or remedy, and I'm wishful thinking about things I know nothing about again.
__________________
HOW TO-Kodi 2D-3D-UHD (4k) HDR Guide Internal & External Players
W11 Pro 24H2 GTX960-4GB RGB 4:4:4 @Matched Refresh Rates 8,10,12bit
KODI 22 MPC-HC/BE 82" Q90R Denon S720W
brazen1 is offline   Reply With Quote
Old 22nd March 2018, 22:38   #49738  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,127
If I'm right, and Windows is sending the gamut and transfer function to the display to trigger its HDR mode, then the tone mapping must be at fault. I read this on the RedFox forums from a user who created a program that emulates what Windows does in HDR mode; it turns on and off with HDR content in specific media players. I don't know if you should trust this programmer's tone mapping, but he claims it does the same thing as Windows HDR. Windows gets it right; the display gets it wrong. That's what the original poster said.

It would be useful to have a user with an AMD card and the same display. Then you could determine if it's the extra 2 bits that are causing the problem.

I'm sure many users would like to know the answer to this. They don't want to calibrate their PC and display only to introduce banding with HDR content. My old plasma handles 12-bit -> 8-bit SDR just fine; I can't notice the dithering, so a setup error wouldn't harm anything.

Last edited by Warner306; 22nd March 2018 at 22:42.
Warner306 is offline   Reply With Quote
Old 22nd March 2018, 22:43   #49739  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,127
Quote:
Originally Posted by Manni View Post
Most manufacturers are still learning how to reproduce HDR10, so this is not surprising.

As I said, on my display the HDR mode is not usable. It has to be disabled to get proper results.

Again, the key is to assess the actual capability of the panels, not just the input, and send what's appropriate after making sure that it is processed well once it reaches the display.

There are lots of ways to achieve this, but they involve test patterns and calibration, which most people can't be bothered with. It's these people that marketing departments targets .
Your Vertex doesn't need to trigger the projector's HDR mode to get the gamma right?
Warner306 is offline   Reply With Quote
Old 22nd March 2018, 23:13   #49740  |  Link
e-t172
Registered User
 
Join Date: Jan 2008
Posts: 589
Quote:
Originally Posted by j82k View Post
Yeah the whole 10/12 bit marketing thing seems like a big scam to me. Looking at test pattern, my 5 year old monitor can produce better gradients than my 2016 LG Oled.
That's completely unsurprising if your "5 year old monitor" has lower contrast than your "2016 LG Oled", which seems extremely likely. The higher the contrast, the more visible banding is.
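
A rough way to see why: what the eye notices is roughly the relative jump between adjacent code values (Weber's law), and a raised black level shrinks that relative jump near black. The Python sketch below assumes gamma 2.2, a 100-nit peak, and two made-up black levels, so the exact numbers are only illustrative:

Code:
# Sketch: relative luminance step between adjacent 8-bit codes near black, for a
# display with a lifted black level (old LCD-ish) vs. a near-zero black (OLED-ish).
# Gamma 2.2 and the 100-nit peak are assumptions for illustration.

PEAK = 100.0   # nits

def luma(code: int, black: float) -> float:
    """8-bit code -> displayed luminance with gamma 2.2 plus a black-level lift."""
    return black + (PEAK - black) * (code / 255) ** 2.2

for black in (0.1, 0.0005):
    l1, l2 = luma(16, black), luma(17, black)
    print(f"black {black} nits: step 16->17 is a {100 * (l2 - l1) / l1:.1f}% jump")
# prints roughly 10% for the lifted black vs. 14% for the near-zero black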
e-t172 is offline   Reply With Quote