22nd March 2018, 10:35  #49761
bran
Registered User
 
Join Date: Jun 2009
Location: Stockholm
Posts: 28
As I've said in previous posts, there is a lot more noticeable banding when sending "12-bit" to current-gen TVs compared to 10-bit, regardless of whether the source is madVR or a standalone UHD player. So padded zeros or not, the result is more banding.

For example, check out the threads on Panasonic's UB700/900 players: people were getting banding until Panasonic released new firmware enabling 10-bit output.

I myself got a lot less banding outputting 10-bit from AMD compared to Nvidia's 12-bit (Samsung JS9505).
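For reference, the "padded zeros" part: carrying a 10-bit value in a 12-bit container is just a two-bit left shift, which is lossless by itself, so any extra banding has to come from what the display then does with the 12-bit signal. A minimal sketch (illustrative Python, not actual driver code):

Code:
# Hypothetical illustration of 10-bit -> 12-bit zero padding.
# Not actual driver code; it only shows the padding itself is lossless.

def pad_10_to_12(x10: int) -> int:
    """Place a 10-bit code (0..1023) into a 12-bit container (0..4095)."""
    assert 0 <= x10 <= 1023
    return x10 << 2  # append two zero bits

def unpad_12_to_10(x12: int) -> int:
    """Recover the original 10-bit code by dropping the padded bits."""
    return x12 >> 2

# Round-trips exactly for every 10-bit code, so padding adds no error;
# any banding must be introduced by the display's 12-bit processing.
assert all(unpad_12_to_10(pad_10_to_12(v)) == v for v in range(1024))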
__________________
HTPC: Samsung 65JS9505, Yamaha A2070, Sonus Faber, RX580

22nd March 2018, 11:38  #49762
ryrynz
Registered User
 
Join Date: Mar 2009
Posts: 2,796
So... maybe if Nvidia had a 10-bit mode we'd be peachy?

22nd March 2018, 11:42  #49763
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 9,289
Or those crappy TVs should stop accepting 12-bit if they can't properly deal with it, then.
The fault clearly lies with the device that advertises support for a mode that it can't process properly.
__________________
LAV Filters - open source ffmpeg based media splitter and decoders


22nd March 2018, 11:49  #49764
ashlar42
Registered User
 
Join Date: Jun 2007
Posts: 333
Quote:
Originally Posted by nevcairiel
Or those crappy TVs should stop accepting 12-bit if they can't properly deal with it, then.
The fault clearly lies with the device that advertises support for a mode that it can't process properly.
This. So much this.

I long for a world where companies open a special section of their websites, dedicated to enthusiasts, with all the information needed to defeat the so-called "enhancements" they introduce to satisfy undiscerning eyes.

22nd March 2018, 13:52  #49765
Manni
Registered User
 
Join Date: Jul 2014
Posts: 482
Quote:
Originally Posted by nevcairiel
Or those crappy TVs should stop accepting 12-bit if they can't properly deal with it, then.
The fault clearly lies with the device that advertises support for a mode that it can't process properly.
Sure, but that doesn't mean that projectors like my JVC, which do support 12 bits from input to panel, shouldn't benefit from it.

Zero banding in the Allied clip here, and same results with NV HDR and OS HDR.

What is certain is that people should test whether their display supports 12 bits before enabling it. 8-bit is the safer option unless 12-bit support is confirmed.
__________________
Win 10 Pro x64 V1803 MCE add-on
i7 3770K@4.2Ghz 16Gb@2.1Ghz EVGA GTX 1080 Ti SC2 11Gb@2GHz 385.28 RGB Full 4:4:4 12bits
MPC-BE / LAV / MadVR / MyMovies V5.24
Denon X8500H>Vertex>JVC RS500/X7000

22nd March 2018, 14:03  #49766
el Filou
Registered User
 
Join Date: Oct 2016
Posts: 268
Quote:
Originally Posted by nevcairiel
The fault clearly lies with the device that advertises support for a mode that it can't process properly.
While you're right on TVs, I still don't get why NVIDIA won't give the option of 10-bit HDMI output.
__________________
HTPC: E7400, GeForce 1050 Ti, DVB-C TV, Panasonic GT60 | Desktop: 4690K, Radeon 7870, Dell U2713HM | Windows 1709, MediaPortal/MPC-HC, LAV Filters, ReClock, madVR | Laptop: i5-2520M, Windows Insider

22nd March 2018, 15:14  #49767
mclingo
Registered User
 
Join Date: Aug 2016
Posts: 419
And native 23.976 Hz.

22nd March 2018, 15:29  #49768
Razoola
Registered User
 
Join Date: May 2007
Posts: 443
The solution here is Nvidia offering both 10-bit and 12-bit support in its drivers, for TVs that support both bit depths.

22nd March 2018, 15:41  #49769
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 751
Quote:
Originally Posted by BatKnight
I believe that when playing a 2160p 10-bit HDR 23.976 fps video with madVR set up to output 10-bit, everything is being done the best way possible.
What I don't understand is why, when sending the HDR metadata (which results in NV HDR), I get banding, yet when not sending the HDR metadata and manually enabling the OS HDR, I get a perfect image with no banding... Why is OS HDR behaving differently from NV HDR with the same settings?
I think the problem here is the metadata. When the OS HDR toggle is enabled, it is my understanding that Windows sends the color gamut and transfer function to the display, but not the metadata. The gamut and transfer function are enough for the display to enter its HDR mode. Tone and gamut mapping are done at the PC level. In fact, in the next update, Windows is releasing a calibration tool to change how HDR looks on your display. This is why HDR is always on.

When the Nvidia private API is used, the metadata is passed to the display untouched. The display uses its own processing to complete the tone and gamut mapping, not Windows. This would imply your display has issues processing a 10-bit HDR input; its tone and gamut mapping are not of the highest quality. This would explain why banding does not show for Manni on his display.

I could be wrong, but maybe a display is not the best place to handle HDR processing? If true, I should delete my posts from yesterday.
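For reference, the "metadata" at issue here is the HDR10 static block: the ST 2086 mastering display color volume plus the content light levels. A rough sketch of its fields (names are illustrative, not any specific driver or OS API):

Code:
# Rough sketch of HDR10 static metadata; field names are illustrative,
# not any specific driver or OS API.
from dataclasses import dataclass

@dataclass
class HDR10StaticMetadata:
    # SMPTE ST 2086 mastering display color volume
    display_primaries: tuple        # (x, y) chromaticities for R, G, B
    white_point: tuple              # (x, y)
    max_mastering_luminance: float  # cd/m2, e.g. 1000 or 4000
    min_mastering_luminance: float  # cd/m2, e.g. 0.0001
    # Content light level info
    max_cll: int                    # maximum content light level, cd/m2
    max_fall: int                   # maximum frame-average light level, cd/m2

# Per the description above: with the Nvidia private API this block reaches
# the display, which then does its own tone/gamut mapping; with the OS HDR
# toggle the display only sees "BT.2020 + ST 2084" and the PC does the mapping.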


22nd March 2018, 15:42  #49770
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 751
Quote:
Originally Posted by bran
As I've said in previous posts, there is a lot more noticeable banding when sending "12-bit" to current-gen TVs compared to 10-bit, regardless of whether the source is madVR or a standalone UHD player. So padded zeros or not, the result is more banding.

For example, check out the threads on Panasonic's UB700/900 players: people were getting banding until Panasonic released new firmware enabling 10-bit output.

I myself got a lot less banding outputting 10-bit from AMD compared to Nvidia's 12-bit (Samsung JS9505).
But he said most 12-bit input is fine. It is only HDR that is the problem. And 12-bit with the OS HDR toggle is also fine. It has something to do with the HDR tone mapping.


22nd March 2018, 17:07  #49771
brazen1
Registered User
 
Join Date: Oct 2017
Posts: 140
I have a couple more things to add while there is keen interest in this at the moment. I have no intention of repeating things ad nauseam.

Early drivers offered 8-, 10-, and 12-bit. The 10-bit option for RGB was removed in recent drivers, AFAIK, so now it's just 8-bit or 12-bit to select.

When using RGB 4:4:4 12-bit @23Hz (matching old faithful Allied's 2160p HDR test refresh rate), understand this is a one-shot deal, since with newer drivers it reverts to 8-bit after a reboot.

Manni has pointed out that his display accepts 12-bit and dithers to 10-bit correctly with no banding, and that the setting is retained after a reboot. If I'm not mistaken, Manni is also using a custom 3DLUT and not passing HDR through.

It should also be noted that no amount of madVR processing to reduce banding artifacts has any effect at all. Nor does the dithering algorithm used.

When using YCbCr 4:2:2 10/12-bit @23Hz there is NO banding.
When using YCbCr 4:4:4 10/12-bit @23Hz there IS banding.
When using RGB 4:4:4 12-bit @23Hz there IS banding.
When using RGB 4:4:4 8-bit @23Hz there is NO banding.

So, is it better to lose 10-bit using RGB 4:4:4 8-bit @23Hz, or to use YCbCr 4:2:2 10/12-bit @23Hz and gain back 10-bit (once the 12-bit dithers down, or when using 10-bit directly)? I don't mind recalibrating, given that YCbCr is limited range only, with no full.

At some point I may work my way backwards through driver versions to find one that still offered the 8-, 10-, and 12-bit RGB choices, so I can select 10-bit and see whether banding is present or whether I must still select 8-bit to cure it. I'm not looking forward to doing that anytime soon, though.

Lastly, Allied is a 2160p HDR title. Testing an SDR 1080p title, no amount of anything eliminates or reduces the banding. A good example is 47 Meters Down (2017) at the 42:00 mark.
__________________
HOW TO-Kodi 2D-3D-UHD (4k) HDR Guide Internal & External Players
W10v1803 X5690 9604GB RGB 4:4:4 8bit Desktop @60Hz 8,10,12bit @Matched Refresh Rates
KODI MPC-HC/BE PDVD DVDFab
65JS8500 UHD HDR 3D


22nd March 2018, 18:17  #49772
Manni
Registered User
 
Join Date: Jul 2014
Posts: 482
Quote:
Originally Posted by brazen1
I have a couple more things to add while there is keen interest in this at the moment. I have no intention of repeating things ad nauseam.

Early drivers offered 8-, 10-, and 12-bit. The 10-bit option was removed in recent drivers, AFAIK, so now it's just 8-bit or 12-bit to select.

When using RGB 4:4:4 12-bit @23Hz (matching old faithful Allied's 2160p HDR test refresh rate), understand this is a one-shot deal, since with newer drivers it reverts to 8-bit after a reboot.

Manni has pointed out that his display accepts 12-bit and dithers to 10-bit correctly with no banding, and that the setting is retained after a reboot. If I'm not mistaken, Manni is also using a custom 3DLUT and not passing HDR through.

It should also be noted that no amount of madVR processing to reduce banding artifacts has any effect at all. Nor does the dithering algorithm used.

When using YCbCr 4:2:2 12-bit @23Hz there is NO banding.
When using YCbCr 4:4:4 12-bit @23Hz there IS banding.
When using RGB 4:4:4 12-bit @23Hz there IS banding.
When using RGB 4:4:4 8-bit @23Hz there is NO banding.

So, is it better to lose 10-bit using RGB 4:4:4 8-bit @23Hz, or to use YCbCr 4:2:2 12-bit @23Hz and gain back 10-bit once the 12-bit dithers down? I don't mind recalibrating, given that YCbCr is limited range only, not full like RGB.
Nope, this is not what is happening here.

I use madVR in HDR passthrough mode, so madVR sends 10 bits, the GPU sends 12 (padded or "interpolated"), and the display does no additional dithering because it's a native 12-bit projector (a 12-bit path from inputs to panels). So 12-bit is actually the best option for me, as my cables can all handle the full HDMI 2.0 bandwidth and my display handles 12 bits natively.

I do not use a 3DLUT for HDR. I pass it through to the display and use custom ST 2084 gamma curves to display it properly (the native HDR mode of my model is very poor, so I use the Vertex to disable it and handle the HDR conversion with my own curves). The metadata goes to the Vertex (and is displayed on its OSD for monitoring/testing when necessary) but is not sent on, in order to prevent the JVC from switching to its crappy HDR mode automatically. The PJ is calibrated to HDR BT.2020 when I display 4K23p HDR content. The native HDR mode on more recent JVCs is better than on my model, but still not as good as a few well-designed custom curves (at least until madVR's HDR-to-SDR conversion works better with projectors).
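For context, ST 2084 is the PQ transfer function, so custom HDR gamma curves are essentially a remapping of this EOTF onto the projector's actual light output. A minimal sketch of the PQ EOTF using the standard constants from the spec (illustrative only, not anyone's actual calibration curves):

Code:
# Minimal sketch of the SMPTE ST 2084 (PQ) EOTF, the curve HDR10 uses.
# Constants are the standard ones from the spec; code is illustrative.

M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_eotf(signal: float) -> float:
    """Map a PQ-encoded signal (0..1) to absolute luminance in cd/m2."""
    p = signal ** (1 / M2)
    return 10000 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

print(pq_eotf(0.508))  # ~100 cd/m2: half the signal range is only ~1% of peak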

In my case, the best mode for HDR (or SDR) content at 23p is 4K23p RGB 4:4:4 12-bit (madVR dithering to 10 bits). For 60p content, it's 4K60p RGB 4:4:4 8-bit (madVR dithering to 8 bits). For others, it might be different.

I wouldn't use 4:2:2 unless I had to (poor cables, non-optimal display). I'd rather have madVR chroma upscaling goodness all the way, but that means cables able to handle 445 MHz for 4K23p@12-bit and 600 MHz for 4K60p@8-bit.
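Those two figures can be sanity-checked from the standard CTA-861 4K timings, since deep color scales the TMDS clock by 1.25x for 10-bit and 1.5x for 12-bit. A back-of-the-envelope sketch (assumes RGB/4:4:4 and the usual blanking totals):

Code:
# Back-of-the-envelope TMDS clock check for the cable figures above.
# Assumes RGB/4:4:4 and standard CTA-861 4K totals (5500x2250 at 24p,
# 4400x2250 at 60p); deep color scales the clock by bits/8.

def tmds_clock_mhz(h_total: int, v_total: int,
                   refresh_hz: float, bits_per_component: int) -> float:
    pixel_clock = h_total * v_total * refresh_hz / 1e6  # MHz
    return pixel_clock * bits_per_component / 8         # deep-color scaling

print(tmds_clock_mhz(5500, 2250, 24000 / 1001, 12))  # ~445 MHz: 4K23p 12-bit
print(tmds_clock_mhz(4400, 2250, 60, 8))             # ~594 MHz: 4K60p 8-bit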
__________________
Win 10 Pro x64 V1803 MCE add-on
i7 3770K@4.2Ghz 16Gb@2.1Ghz EVGA GTX 1080 Ti SC2 11Gb@2GHz 385.28 RGB Full 4:4:4 12bits
MPC-BE / LAV / MadVR / MyMovies V5.24
Denon X8500H>Vertex>JVC RS500/X7000


22nd March 2018, 18:47  #49773
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 751
Quote:
Originally Posted by brazen1

When using YCbCr 4:2:2 10/12-bit @23Hz there is NO banding.
When using YCbCr 4:4:4 10/12-bit @23Hz there IS banding.
When using RGB 4:4:4 12-bit @23Hz there IS banding.
When using RGB 4:4:4 8-bit @23Hz there is NO banding.

So, is it better to lose 10-bit using RGB 4:4:4 8-bit @23Hz, or to use YCbCr 4:2:2 10/12-bit @23Hz and gain back 10-bit (once the 12-bit dithers down, or when using 10-bit directly)? I don't mind recalibrating, given that YCbCr is limited range only, with no full.

Lastly, Allied is a 2160p HDR title. Testing an SDR 1080p title, no amount of anything eliminates or reduces the banding. A good example is 47 Meters Down (2017) at the 42:00 mark.
Bit depths were covered yesterday.

Remember, 10-bit RGB > 8-bit RGB > 10-bit YCbCr 4:2:2 > 10-bit YCbCr 4:2:0. That is a quote from madshi.

Given the problem your display has with 12-bit HDR, 8-bit and 10-bit are not equal, as 8-bit is actually better. So your choice is easy. Going from RGB in madVR to YCbCr 4:2:2 would destroy the chroma upscaling done by madVR.
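A toy example of why madVR's dithering matters here: truncating a smooth gradient to fewer bits produces flat steps (bands), while adding sub-LSB noise before truncation interleaves the output codes so the average still tracks the gradient (hypothetical code, not madVR's actual error diffusion):

Code:
# Toy illustration of dithered vs. truncated bit-depth reduction.
# Hypothetical example, not madVR's actual error-diffusion code.
import random

def quantize(value_16bit: int, out_bits: int, dither: bool) -> int:
    """Reduce a 16-bit code to out_bits, optionally dithering first."""
    shift = 16 - out_bits
    if dither:  # add sub-LSB noise so rounding errors average out
        value_16bit = min(value_16bit + random.randint(0, (1 << shift) - 1),
                          0xFFFF)
    return value_16bit >> shift

ramp = [32768 + i // 8 for i in range(4096)]         # shallow 16-bit gradient
hard = [quantize(v, 8, dither=False) for v in ramp]  # long flat runs = bands
soft = [quantize(v, 8, dither=True) for v in ramp]   # interleaved, no bands
print(sorted(set(hard)), sorted(set(soft)))  # [128, 129] vs [128, 129, 130]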

Given that 4:2:2 corrects the HDR issue, maybe your display has trouble with the higher bandwidth of a 12-bit 4:4:4 HDR signal, as RGB vs. YCbCr seems not to matter. It sounds like a poorly implemented HDR mode in the display; perhaps an epic failure on the display's part if banding is created. For some reason, for one user, Windows OS HDR does a better job. The output from Windows would still be 12-bit HDR, but it doesn't show banding. That sounds like poor tone mapping or gamut mapping.

Does the 1080p title only band at 10-bits? It is not uncommon for 1080p Blu-rays to have some banding in the source.


22nd March 2018, 19:00  #49774
brazen1
Registered User
 
Join Date: Oct 2017
Posts: 140
Got it. I made the wrong assumption about you and a 3DLUT. I'm not familiar with the Vertex; I assume it's some sort of expensive additional device added to your HDMI chain to overcome some sort of problem? Anyway, I don't use a Vertex, I don't think I have any need for one, and I don't think most people use one. I'm just sort of the common, regular ol' user with madVR, an AVR, and a display. I too would like madVR goodness all the way. I'm pretty sure my display is optimal, and my cables should matter little since none is longer than 6', but maybe I'm mistaken. I'm not smart enough to know whether the setup that works for you translates to mine, given that mine is more primitive, or perhaps simply has no need of extra external gear like yours. From the expert knowledge I see here, the general rule of thumb is to always use RGB Full, since Limited adds extra processing. But now I'm faced with a decision where I have no expertise and must rely on others here, including you. Which is preferred: RGB Full at 8-bit only, or YCbCr Limited at 10-bit? I know what you use; I'm asking about my own setup. It comes down to accepting banding, living without 10-bit, or switching to YCbCr Limited. I have no idea which of those three choices is the wisest.

Warner, let me digest all that. I need to run a few tests and look up some specs. Thank you.
__________________
HOW TO-Kodi 2D-3D-UHD (4k) HDR Guide Internal & External Players
W10v1803 X5690 9604GB RGB 4:4:4 8bit Desktop @60Hz 8,10,12bit @Matched Refresh Rates
KODI MPC-HC/BE PDVD DVDFab
65JS8500 UHD HDR 3D


22nd March 2018, 19:03  #49775
Manni
Registered User
 
Join Date: Jul 2014
Posts: 482
Quote:
Originally Posted by brazen1
Got it. I made the wrong assumption about you and a 3DLUT. I'm not familiar with the Vertex; I assume it's some sort of expensive additional device added to your HDMI chain to overcome some sort of problem? Anyway, I don't use a Vertex, I don't think I have any need for one, and I don't think most people use one. I'm just sort of the common, regular ol' user with madVR, an AVR, and a display. I too would like madVR goodness all the way. I'm pretty sure my display is optimal, and my cables should matter little since none is longer than 6', but maybe I'm mistaken. I'm not smart enough to know whether the setup that works for you translates to mine, given that mine is more primitive, or perhaps simply has no need of extra external gear like yours. From the expert knowledge I see here, the general rule of thumb is to always use RGB Full, since Limited adds extra processing. But now I'm faced with a decision where I have no expertise and must rely on others here, including you. Which is preferred: RGB Full at 8-bit only, or YCbCr Limited at 10-bit? I know what you use; I'm asking about my own setup. It comes down to accepting banding, living without 10-bit, or switching to YCbCr Limited. I have no idea which of those three choices is the wisest.
RGB Full 4:4:4 8-bit with madVR dithering to 8 bits in your case. No question.

You don't need a Vertex, and most people don't need one either. It's an advanced diagnostic/testing/HDMI problem-solving tool. I was simply explaining my chain because you had stated quite a few wrong assumptions.
__________________
Win 10 Pro x64 V1803 MCE add-on
i7 3770K@4.2Ghz 16Gb@2.1Ghz EVGA GTX 1080 Ti SC2 11Gb@2GHz 385.28 RGB Full 4:4:4 12bits
MPC-BE / LAV / MadVR / MyMovies V5.24
Denon X8500H>Vertex>JVC RS500/X7000


22nd March 2018, 20:18  #49776
el Filou
Registered User
 
Join Date: Oct 2016
Posts: 268
Quote:
Originally Posted by mclingo
And native 23.976 Hz.
But that can still be achieved with custom resolutions, while there is just no way to output 10 bpc over HDMI. I just don't get the reasoning behind allowing 12 bpc and not 10 bpc.
__________________
HTPC: E7400, GeForce 1050 Ti, DVB-C TV, Panasonic GT60 | Desktop: 4690K, Radeon 7870, Dell U2713HM | Windows 1709, MediaPortal/MPC-HC, LAV Filters, ReClock, madVR | Laptop: i5-2520M, Windows Insider

22nd March 2018, 20:34  #49777
huhn
Registered User
 
Join Date: Oct 2012
Posts: 4,741
It should still be possible to force Nvidia to output 10-bit with a custom EDID.

And seriously, 12-bit processing is not rocket science at all. Even an old, cheap Philips TV was able to do 16-bit processing on a 12-bit GPU signal and showed no banding at all...
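For anyone wanting to try that: the advertised deep-color depths live in the HDMI vendor-specific data block of the EDID's CTA-861 extension, and that block is what a custom EDID override would edit. A hedged sketch of reading those flags, based on the HDMI 1.4 layout (simplified; real EDID tools do far more validation):

Code:
# Hedged sketch: find the HDMI vendor-specific data block (VSDB) in an
# EDID's CTA-861 extension and read its deep-color flags. Simplified
# parsing based on the HDMI 1.4 layout; not production code.

HDMI_OUI = (0x03, 0x0C, 0x00)  # IEEE OUI 00-0C-03, stored little-endian

def deep_color_flags(edid: bytes):
    ext = edid[128:256]                    # first extension block
    if len(ext) < 128 or ext[0] != 0x02:   # CTA-861 extension tag
        return None
    dtd_offset = ext[2]                    # where detailed timings begin
    i = 4                                  # data blocks start at byte 4
    while i < dtd_offset:
        tag, length = ext[i] >> 5, ext[i] & 0x1F
        if tag == 3 and length >= 6 and tuple(ext[i+1:i+4]) == HDMI_OUI:
            flags = ext[i + 6]             # byte 6 of the VSDB
            return {"DC_48bit": bool(flags & 0x40),   # 16-bit support
                    "DC_36bit": bool(flags & 0x20),   # 12-bit support
                    "DC_30bit": bool(flags & 0x10)}   # 10-bit support
        i += 1 + length
    return None

# Editing these bits in an EDID override changes which deep-color modes
# the driver believes the display accepts.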

22nd March 2018, 21:09  #49778
bran
Registered User
 
Join Date: Jun 2009
Location: Stockholm
Posts: 28
Quote:
Originally Posted by huhn
And seriously, 12-bit processing is not rocket science at all. Even an old, cheap Philips TV was able to do 16-bit processing on a 12-bit GPU signal and showed no banding at all...
For myself, and from what I've gathered in various threads, the banding happens in UHD HDR titles with 12-bit enabled.
__________________
HTPC: Samsung 65JS9505, Yamaha A2070, Sonus Faber, RX580

22nd March 2018, 21:51  #49779
Manni
Registered User
 
Join Date: Jul 2014
Posts: 482
Quote:
Originally Posted by bran
For myself, and from what I've gathered in various threads, the banding happens in UHD HDR titles with 12-bit enabled.
Probably because those displays are not native 12-bit, or even native 10-bit, panels.
__________________
Win 10 Pro x64 V1803 MCE add-on
i7 3770K@4.2Ghz 16Gb@2.1Ghz EVGA GTX 1080 Ti SC2 11Gb@2GHz 385.28 RGB Full 4:4:4 12bits
MPC-BE / LAV / MadVR / MyMovies V5.24
Denon X8500H>Vertex>JVC RS500/X7000

22nd March 2018, 21:56  #49780
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 751
Quote:
Originally Posted by bran
For myself, and from what I've gathered in various threads, the banding happens in UHD HDR titles with 12-bit enabled.
How come the one user has no problems with Windows OS HDR at 12-bits? It must be the tone mapping done by the display.