Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion. Before you start posting please read the forum rules. By posting to this forum you agree to abide by the rules. |
22nd March 2018, 15:41 | #49721 | Link | |
Registered User
Join Date: Dec 2014
Posts: 1,127
|
Quote:
When the Nvidia private API is used, the metadata is passed to the display untouched. The display uses its own processing to complete the tone and gamut mapping, not Windows. This would imply your display has issues processing a 10-bit HDR input. Its tone and gamut mapping are not of the highest quality. This would explain why banding does not show for Manni on his display. I could be wrong, but maybe a display is not the best place to handle HDR processing? If true, I should delete my posts from yesterday.
__________________
HOW TO - Set up madVR for Kodi DSPlayer & External Media Players Last edited by Warner306; 22nd March 2018 at 15:44. |
|
22nd March 2018, 15:42 | #49722 | Link | |
Registered User
Join Date: Dec 2014
Posts: 1,127
|
Quote:
__________________
HOW TO - Set up madVR for Kodi DSPlayer & External Media Players Last edited by Warner306; 22nd March 2018 at 15:46. |
|
22nd March 2018, 17:07 | #49723 | Link |
Registered User
Join Date: Oct 2017
Posts: 331
|
I have a couple more things to add since there is keen interest at the moment; I have no intention of repeating any of this ad nauseam.
Early drivers offered 8, 10, and 12-bit. The 10-bit option was removed for RGB in recent drivers, afaik, so now it's just 8-bit or 12-bit to select. When using RGB 4:4:4 12-bit @23Hz (matching old faithful Allied 2160p HDR test refresh rate), understand this is a one-shot deal, since newer drivers revert to 8-bit after a reboot. Manni has pointed out his display accepts 12-bit and dithers to 10-bit correctly with no banding, and retains the setting after a reboot. If I'm not mistaken, Manni is also using a custom 3DLUT and not passing HDR through. It should also be noted that no amount of madVR processing to reduce banding artifacts has any effect at all, nor does the dithering algorithm used.

My results:
- YCbCr 4:2:2 10/12-bit @23Hz: NO banding.
- YCbCr 4:4:4 10/12-bit @23Hz: banding.
- RGB 4:4:4 12-bit @23Hz: banding.
- RGB 4:4:4 8-bit @23Hz: NO banding.

So, is it better to lose 10-bit using RGB 4:4:4 8-bit @23Hz, or to use YCbCr 4:2:2 10/12-bit @23Hz and gain back 10-bit once the 12-bit dithers down (or when using 10-bit)? I don't mind recalibrating, since YCbCr is limited only, not full. At some point I may work my way backwards through driver versions to find one that still offered the 8, 10, and 12-bit RGB choices, so I can select 10 and see whether banding is present or whether I must still select 8-bit to cure it. I'm not looking forward to doing that anytime soon, though.

Lastly, Allied is a 2160p HDR title. Testing an SDR 1080p title, no amount of anything eliminates or reduces banding. A good example is 47 Meters Down (2017) at 42:00.
__________________
HOW TO-Kodi 2D-3D-UHD (4k) HDR Guide Internal & External Players W11 Pro 24H2 GTX960-4GB RGB 4:4:4 @Matched Refresh Rates 8,10,12bit KODI 22 MPC-HC/BE 82" Q90R Denon S720W Last edited by brazen1; 22nd March 2018 at 18:24. |
22nd March 2018, 18:17 | #49724 | Link | |
Registered User
Join Date: Jul 2014
Posts: 942
|
Quote:
I use MadVR in HDR passthrough mode, so MadVR sends 10 bits, the GPU sends 12 (padded or "interpolated"), and the display does no additional dithering because it's a 12-bit native projector (12-bit path from inputs to panels). So 12 bits is actually the best option for me, as my cables can all handle the full HDMI 2.0 bandwidth and my display handles 12 bits natively. I do not use a 3DLUT for HDR; I pass it through to the display and use custom ST 2084 gamma curves to display it properly (the native HDR mode of my model is very poor, so I use the Vertex to disable it and handle the HDR conversion with my own curves). The metadata goes to the Vertex (and is displayed on its OSD for monitoring/testing when necessary), but it's not sent on, in order to prevent the JVC from switching to its crappy HDR mode automatically. The PJ is calibrated to HDR BT.2020 when I display 4K23p HDR content. The native HDR mode on more recent JVCs is better than on my model, but still not as good as a few well-designed custom curves (until MadVR's HDR to SDR conversion works better with projectors). In my case, the best mode for HDR (or SDR) content at 23p is 4K23p RGB 4:4:4 12-bit (MadVR dithering to 10 bits). For 60p content, it's 4K60p RGB 4:4:4 8-bit (MadVR dithering to 8 bits). For others, it might be different. I wouldn't use 4:2:2 unless I had to (poor cables, non-optimal display). I'd rather have MadVR chroma upscaling goodness all the way, but that means cables able to handle 445MHz in 4K23p@12-bit and 600MHz in 4K60p@8-bit.
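For reference, those bandwidth figures follow from simple pixel-clock arithmetic. A minimal sketch, assuming the CTA-861 blanking totals for these modes (4096x2160@24 uses a 5500x2250 total raster, 3840x2160@60 a 4400x2250 one) and RGB/4:4:4, where deep colour scales the TMDS clock by bits-per-component over 8:

```python
# Toy TMDS-clock calculator for RGB / 4:4:4 HDMI signals.
# Timing totals below are assumed CTA-861 values, not measured ones.

def tmds_clock_mhz(h_total, v_total, refresh_hz, bits_per_component):
    """TMDS clock in MHz: pixel clock scaled by the deep-colour factor."""
    pixel_clock = h_total * v_total * refresh_hz
    return pixel_clock * (bits_per_component / 8) / 1e6

# 4096x2160 @24Hz (5500x2250 total raster) at 12-bit:
print(round(tmds_clock_mhz(5500, 2250, 24, 12), 1))  # 445.5

# 3840x2160 @60Hz (4400x2250 total raster) at 8-bit:
print(tmds_clock_mhz(4400, 2250, 60, 8))             # 594.0
```

The 594MHz result is right at the HDMI 2.0 ceiling of 600MHz, which is why 4K60p tops out at 8-bit 4:4:4.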
__________________
Win11 Pro x64 b23H2 Ryzen 5950X@4.5Ghz 32Gb@3600 Zotac 3090 24Gb 551.33 madVR/LAV/jRiver/MyMovies/CMC Denon X8500HA>HD Fury VRRoom>TCL 55C805K Last edited by Manni; 22nd March 2018 at 18:50. |
|
22nd March 2018, 18:47 | #49725 | Link | |
Registered User
Join Date: Dec 2014
Posts: 1,127
|
Quote:
Remember: 10-bit RGB > 8-bit RGB > 10-bit YCbCr 4:2:2 > 10-bit YCbCr 4:2:0. That is a quote from madshi. Given the problem your display has with 12-bit HDR, 8-bit and 10-bit are not equal, as 8-bit is actually better, so your choice is easy. Going from RGB in madVR to YCbCr 4:2:2 would destroy the chroma upscaling done by madVR. Given that 4:2:2 corrects the HDR issue, maybe your display has trouble with the higher bandwidth of a 12-bit 4:4:4 HDR signal, as RGB vs. YCbCr seems not to matter. It sounds like a poorly implemented HDR mode in the display; perhaps an epic failure by the display if banding is created. For some reason, for one user, Windows OS HDR does a better job. The output from Windows would still be 12-bit HDR, but it doesn't have banding. That sounds like poor tone mapping or gamut mapping. Does the 1080p title only band at 10-bit? It is not uncommon for 1080p Blu-rays to have some banding in the source.
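The chroma cost of dropping to 4:2:2 can be sketched in a few lines. This is a toy illustration with made-up sample values, not madVR's actual resampling: 4:2:2 halves horizontal chroma resolution, so a sharp colour edge that falls mid-pair gets smeared on the round trip.

```python
# Hypothetical 4:2:2 round trip on one row of chroma samples.
# Averaging-down and nearest-neighbour-up are illustrative choices only.

def subsample_422(chroma):
    """Halve horizontal chroma resolution by averaging sample pairs."""
    return [(chroma[i] + chroma[i + 1]) / 2 for i in range(0, len(chroma), 2)]

def upsample_422(chroma):
    """Restore the original width by repeating each sample."""
    return [c for c in chroma for _ in range(2)]

# A hard chroma edge that does not land on a pair boundary.
edge = [16, 16, 16, 240, 240, 240, 240, 240]
round_trip = upsample_422(subsample_422(edge))
print(round_trip)  # [16.0, 16.0, 128.0, 128.0, 240.0, 240.0, 240.0, 240.0]
```

The two samples straddling the edge collapse to the 128 midpoint, which is exactly the detail madVR's chroma upscaling was trying to preserve.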
__________________
HOW TO - Set up madVR for Kodi DSPlayer & External Media Players Last edited by Warner306; 22nd March 2018 at 18:52. |
|
22nd March 2018, 19:00 | #49726 | Link |
Registered User
Join Date: Oct 2017
Posts: 331
|
Got it. I made the wrong assumption about you and a 3DLUT. I'm not familiar with the Vertex; I assume it's some sort of expensive additional device added to your HDMI chain to overcome some sort of problem? Anyway, I don't use a Vertex, I don't think I have any need for one, and I don't think most people use one. I'm just the common, regular ol' user with madVR, an AVR, and a display. I too would like madVR goodness all the way. I'm pretty sure my display is optimal, and my cables are of little importance since none of them are longer than 6', but maybe I'm mistaken. I'm not smart enough to know whether the setup that works for you translates to mine, given that my setup is more primitive, or perhaps simply has no need of extra external configuration like yours. From the expert knowledge I see here, a general rule of thumb is to always use RGB full, since limited adds extra processing. Except now I am met with a decision where I have no expertise and must rely on others here, including you. Which is preferred: RGB full at 8-bit only, or YCbCr limited at 10-bit? I know what you use; I am asking about my own setup. It comes down to accepting banding, living without 10-bit, or switching to YCbCr limited. I have no idea which of those three choices is the wisest.
Warner, let me digest all that. I need to run a few tests and look up some specs. Thank you.
__________________
HOW TO-Kodi 2D-3D-UHD (4k) HDR Guide Internal & External Players W11 Pro 24H2 GTX960-4GB RGB 4:4:4 @Matched Refresh Rates 8,10,12bit KODI 22 MPC-HC/BE 82" Q90R Denon S720W Last edited by brazen1; 22nd March 2018 at 19:04. |
22nd March 2018, 19:03 | #49727 | Link | |
Registered User
Join Date: Jul 2014
Posts: 942
|
Quote:
You don't need a Vertex, and most people don't need one either. It's an advanced diagnostic/testing/HDMI problem-solving tool. I was simply explaining my chain, as you had stated quite a few wrong assumptions about it.
__________________
Win11 Pro x64 b23H2 Ryzen 5950X@4.5Ghz 32Gb@3600 Zotac 3090 24Gb 551.33 madVR/LAV/jRiver/MyMovies/CMC Denon X8500HA>HD Fury VRRoom>TCL 55C805K Last edited by Manni; 22nd March 2018 at 19:05. |
|
22nd March 2018, 20:18 | #49728 | Link |
Registered User
Join Date: Oct 2016
Posts: 896
|
But that can still be achieved with custom resolutions, while there is just no way to output 10 bpc over HDMI. I just don't get the reasoning behind allowing 12 bpc and not 10 bpc.
__________________
HTPC: Windows 10 22H2, MediaPortal 1, LAV Filters/ReClock/madVR. DVB-C TV, Panasonic GT60, Denon 2310, Core 2 Duo E7400 oc'd, GeForce 1050 Ti 536.40 |
22nd March 2018, 20:34 | #49729 | Link |
Registered User
Join Date: Oct 2012
Posts: 7,923
|
It should still be possible to force Nvidia to output 10-bit with a custom EDID.
And seriously, 12-bit processing is not rocket science at all. Even an old, cheap Philips TV could take 12-bit from the GPU with 16-bit internal processing and showed no banding at all... |
22nd March 2018, 21:51 | #49731 | Link |
Registered User
Join Date: Jul 2014
Posts: 942
|
Probably because the displays are not native 12bits or even 10bits displays.
__________________
Win11 Pro x64 b23H2 Ryzen 5950X@4.5Ghz 32Gb@3600 Zotac 3090 24Gb 551.33 madVR/LAV/jRiver/MyMovies/CMC Denon X8500HA>HD Fury VRRoom>TCL 55C805K |
22nd March 2018, 21:56 | #49732 | Link |
Registered User
Join Date: Dec 2014
Posts: 1,127
|
How come the one user has no problems with Windows OS HDR at 12-bits? It must be the tone mapping done by the display.
__________________
HOW TO - Set up madVR for Kodi DSPlayer & External Media Players |
22nd March 2018, 21:57 | #49733 | Link | |
Registered User
Join Date: Dec 2014
Posts: 1,127
|
Quote:
This is what I wrote in my setup manual. Someone correct me if I'm wrong. HDR metadata conversion involves:
- Tone mapping: compressing highlights to fit the peak luminance of the display;
- Gamut mapping: mapping the DCI-P3 or Rec. 2020 primaries to the display's visible colors;
- Gamma transfer: decoding the SMPTE ST 2084 (PQ) HDR transfer function to the display EOTF.
One of those processes is failing at 12-bit. The display is supposed to know its own peak luminance and color gamut, so it shouldn't fail at this.
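As a toy illustration of the gamma-transfer step, here is the SMPTE ST 2084 PQ EOTF (these constants are from the standard), paired with a deliberately naive hard clip as a stand-in for tone mapping. Real displays and madVR use far more sophisticated roll-off curves; the clip is purely illustrative.

```python
# ST 2084 (PQ) decode: normalized code value [0,1] -> absolute nits.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_eotf(signal):
    """Map a normalized PQ code value to absolute luminance in nits."""
    p = signal ** (1 / M2)
    return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

def clip_to_display(nits, peak=1000.0):
    """Naive 'tone mapping': hard-clip everything above the display peak."""
    return min(nits, peak)

# PQ code 1.0 is the format's 10,000-nit ceiling; a 1000-nit panel clips it.
print(round(pq_eotf(1.0)))            # 10000
print(clip_to_display(pq_eotf(1.0)))  # 1000.0
```

If a display botches any one of these stages at 12-bit input, the output is wrong even though the incoming metadata is correct, which is consistent with what the thread is seeing.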
__________________
HOW TO - Set up madVR for Kodi DSPlayer & External Media Players Last edited by Warner306; 22nd March 2018 at 22:08. |
|
22nd March 2018, 22:17 | #49735 | Link | |
Registered User
Join Date: Jul 2014
Posts: 942
|
Quote:
One of the many things marketing people love to exploit. Many displays on the market accept 12-bit and have 8-bit panels. So what happens in that case is that you ask the source to send 10 bits (or more), the display accepts it, then dithers down to its native bit depth (most often 8 bits or less). In that case, having MadVR dither to 8 bits will most likely produce much better results. Is this horse dead enough, or does it still need some beating?
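The same point can be shown in a few lines of Python. This is a toy sketch (illustrative values only, not madVR's actual error-diffusion dithering) of why dithering to 8 bits preserves information that plain truncation destroys:

```python
# Compare truncating a 10-bit code to 8 bits vs. dithering it down.
import random

def truncate_to_8bit(v10):
    """Drop the two low bits: four adjacent 10-bit codes become one band."""
    return v10 >> 2

def dither_to_8bit(v10, rng):
    """Add sub-LSB noise before flooring so the average level is preserved."""
    return int(v10 / 4 + rng.random())

rng = random.Random(0)
v = 514                      # a 10-bit code truncation cannot represent
trunc = truncate_to_8bit(v)  # identical result for codes 512..515 -> banding

# Averaged over many pixels, dithered output recovers 514/4 = 128.5.
mean = sum(dither_to_8bit(v, rng) for _ in range(20000)) / 20000
print(trunc)           # 128
print(round(mean, 1))  # close to the true 128.5
```

Truncation maps 512, 513, 514, and 515 all to 128, so a smooth gradient turns into flat bands; the dithered stream alternates 128 and 129 so the eye averages it back to the right level.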
__________________
Win11 Pro x64 b23H2 Ryzen 5950X@4.5Ghz 32Gb@3600 Zotac 3090 24Gb 551.33 madVR/LAV/jRiver/MyMovies/CMC Denon X8500HA>HD Fury VRRoom>TCL 55C805K |
|
22nd March 2018, 22:20 | #49736 | Link | |
Registered User
Join Date: Jul 2014
Posts: 942
|
Quote:
As I said, on my display the HDR mode is not usable. It has to be disabled to get proper results. Again, the key is to assess the actual capability of the panels, not just the input, and send what's appropriate after making sure that it is processed well once it reaches the display. There are lots of ways to achieve this, but they involve test patterns and calibration, which most people can't be bothered with. It's these people that marketing departments target.
__________________
Win11 Pro x64 b23H2 Ryzen 5950X@4.5Ghz 32Gb@3600 Zotac 3090 24Gb 551.33 madVR/LAV/jRiver/MyMovies/CMC Denon X8500HA>HD Fury VRRoom>TCL 55C805K |
|
22nd March 2018, 22:24 | #49737 | Link |
Registered User
Join Date: Oct 2017
Posts: 331
|
madVR has a native display bitdepth adjustment. We can select 1-9, and the next option is 10 or better.
What if madVR had separate settings for 10, 11, 12, and so on? That way, no matter what the GPU sent (12 in this case), output would adhere to the display bitdepth limit we select, in this case 10. 'auto' still follows what the GPU is sending (12), and some of us 10-bit native display owners are encountering epic failures because of poorly implemented display modes that don't handle 12-bit. I'd like to enjoy the reasons I purchased HDR equipment: to play my native 10-bit sources at 10-bit, giving me 1,024 gradient levels per color. To suggest I digress and just accept 8-bit's 256 gradients, and then blame it on this, that, and the other thing, isn't very encouraging. I don't count on drivers producing 10-bit RGB; evidently it's missing for a reason. I'm not going to upgrade to a 12-bit native display (this probably seals my fate). I'd have done that in the first place, but considering I had zero interest in what I anticipated would be very limited 12-bit Dolby Vision titles, I stuck with common 10-bit HDR10 titles. Now this, that, and the other are affecting me. Maybe madshi will consider separate 10-bit and 12-bit settings, if that's even possible. Knowing madshi, he has probably already considered it and it wouldn't provide any benefit or remedy, and I'm wishful thinking about things I know nothing about again.
__________________
HOW TO-Kodi 2D-3D-UHD (4k) HDR Guide Internal & External Players W11 Pro 24H2 GTX960-4GB RGB 4:4:4 @Matched Refresh Rates 8,10,12bit KODI 22 MPC-HC/BE 82" Q90R Denon S720W |
22nd March 2018, 22:38 | #49738 | Link |
Registered User
Join Date: Dec 2014
Posts: 1,127
|
If I'm right, and Windows is sending the gamut and transfer function to the display to trigger its HDR mode, then the tone mapping must be at fault. I read it on the RedFox forums from a user who created a program that emulates what Windows does in HDR mode; it would switch HDR on and off with HDR content in specific media players. I don't know if you should trust that programmer's tone mapping, but he claims it does the same thing as Windows HDR. Windows gets it right; the display gets it wrong. That's what the original poster said.
It would be useful to have a user with an AMD card and the same display. Then you could determine whether it's the extra 2 bits that are causing the problem. I'm sure many users would like to know the answer; they don't want to calibrate their PC and display only to introduce banding with HDR content. My old plasma handles 12-bit -> 8-bit SDR just fine. I can't notice the dithering, so a setup error there wouldn't harm anything.
__________________
HOW TO - Set up madVR for Kodi DSPlayer & External Media Players Last edited by Warner306; 22nd March 2018 at 22:42. |
22nd March 2018, 22:43 | #49739 | Link | |
Registered User
Join Date: Dec 2014
Posts: 1,127
|
Quote:
__________________
HOW TO - Set up madVR for Kodi DSPlayer & External Media Players |
|
Tags |
direct compute, dithering, error diffusion, madvr, ngu, nnedi3, quality, renderer, scaling, uhd upscaling, upsampling |