Old 24th January 2020, 19:29   #58421  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,406
Quote:
Originally Posted by VBB View Post
But if you know the bit depth, you can certainly test for banding in different desktop and TV modes. And for that, I would recommend disabling dithering in madVR.
But if you disable dithering what do the results of the test tell you? 10 bit will always be better than 8 bit with dithering disabled, even if 8 bit would be better with it enabled. You have to do your testing with the same settings you actually use when watching, with dithering enabled.
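A quick way to convince yourself of this is a toy quantization experiment. This is only my own numpy sketch, nothing to do with madVR's actual dither algorithms (madVR uses ordered dithering or error diffusion, not plain random noise):

Code:

# Toy illustration: quantize a smooth 0..1 ramp to 8 and 10 bits without
# dithering, and to 8 bits with simple random (+/- 0.5 LSB) dithering.
import numpy as np

rng = np.random.default_rng(0)
ramp = np.linspace(0.0, 1.0, 1 << 16)              # "ideal" smooth gradient

def quantize(x, bits, dither=False):
    levels = (1 << bits) - 1
    noise = (rng.random(x.shape) - 0.5) / levels if dither else 0.0
    return np.round((x + noise) * levels) / levels

results = [("8 bit, no dither",  quantize(ramp, 8)),
           ("10 bit, no dither", quantize(ramp, 10)),
           ("8 bit, dithered",   quantize(ramp, 8, dither=True))]

for name, q in results:
    err = np.abs(q - ramp)
    print(f"{name:17s} distinct levels: {len(np.unique(q)):5d}  max error: {err.max():.5f}")

By these raw numbers 10 bit always wins, and dithered 8 bit even looks "worse" (its peak error is a full step), but the dithered error is fine-grained noise rather than wide flat bands, which is what your eyes actually respond to. That is why a comparison with dithering disabled tells you nothing about the picture you actually watch.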
__________________
madVR options explained
Asmodian is offline   Reply With Quote
Old 24th January 2020, 20:31   #58422  |  Link
mclingo
Registered User
 
Join Date: Aug 2016
Posts: 1,348
Quote:
Originally Posted by mrmojo666 View Post
@Mclingo, I'm very curious to follow this bt2020 bug, could you please lnk something regardig the documentation you are referring to?
When I say documented I mean it in the weaker sense, i.e. written down somewhere; there are no official documents detailing this, however there is a bug logged on the madVR bug tracker here:

http://bugs.madshi.net/view.php?id=630

We moved the discussion about this to the driver thread as we had a complaint about spamming this thread:

https://forum.doom9.org/showthread.php?t=176013&page=50

This issue has dominated the last few pages of this thread. It's now pretty clear the API is causing it, and we're just waiting for AMD to fix it. Here is DMU's explanation of the issue:

You have added support for HDR mode. The agsSetDisplayMode() function, used to set a specific display to HDR mode, does its job perfectly: it sends metadata to the display device, as defined in section 6.9 «Dynamic Range and Mastering InfoFrame» per Table 5 of the CTA-861 standard. But the same Table 5 also lists «Auxiliary Video Information (AVI)», defined in section 6.4. And all display devices are required to use the color space (colorimetry) from this data section (AVI InfoFrame) for the current video signal.
Suppose we are in SDR mode with the standard sRGB color space. And we want to switch to HDR mode with the BT.2020 color space, which is the main one for this mode. By calling the agsSetDisplayMode() function, we put the display device in HDR mode. And we see distorted or unsaturated colors. This is because the display device did not receive the corresponding flag from the GPU in the AVI InfoFrame and is trying to display our BT.2020 color space as sRGB.
Please tell me, do you think that such HDR support in AGS_SDK is sufficient? If yes, then advise what else needs to be done so that the display device passes into the correct color space when activating the HDR mode using AGS?

The outcome: when madVR switches your display to HDR, the BT.2020 flag does not reach it, so your colours look very unsaturated. The best workaround is to turn on Windows HDR before starting your movie player.
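To see why the wrong colorimetry flag makes everything look washed out, here is a hedged bit of colour math using only the published BT.709 and BT.2020 primaries (this is not the driver's or madVR's code path, just the arithmetic behind DMU's description):

Code:

# Encode a saturated BT.709 red into a BT.2020 container, then imagine a display
# that decodes those values as BT.709/sRGB because the AVI InfoFrame still says so.
import numpy as np

def rgb_to_xyz(primaries, white=(0.3127, 0.3290)):          # D65 white point
    """Build an RGB->XYZ matrix from xy chromaticities."""
    def xyz(x, y):
        return np.array([x / y, 1.0, (1.0 - x - y) / y])
    m = np.column_stack([xyz(*p) for p in primaries])
    scale = np.linalg.solve(m, xyz(*white))                  # make RGB white map to D65
    return m * scale

BT709  = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
BT2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]

to_2020 = np.linalg.inv(rgb_to_xyz(BT2020)) @ rgb_to_xyz(BT709)   # linear-light 709 -> 2020

red_709     = np.array([1.0, 0.0, 0.0])        # intended fully saturated red
red_in_2020 = to_2020 @ red_709                # the values actually sent to the display
print("BT.709 red encoded as BT.2020:", np.round(red_in_2020, 4))
# Prints roughly [0.63 0.07 0.02]: a display that wrongly treats these as BT.709/sRGB
# values shows a much paler, desaturated red instead of the intended one.

That "red turned into roughly 63% red plus a bit of green and blue" is exactly the washed-out look people are seeing when the BT.2020 flag goes missing.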


*ONLY AFFECTS NAVI BASED CARDS
__________________
LG OLED EF950-YAM RX-V685-RYZEN 3600 - 16GBRAM - WIN10 RX 5700 - https://www.videohelp.com/software/madVR/old-versions

Last edited by mclingo; 24th January 2020 at 20:34.
mclingo is offline   Reply With Quote
Old 24th January 2020, 20:41   #58423  |  Link
VBB
Registered User
 
VBB's Avatar
 
Join Date: May 2016
Location: Long Beach, CA, USA
Posts: 620
Quote:
Originally Posted by Asmodian View Post
But if you disable dithering what do the results of the test tell you? 10 bit will always be better than 8 bit with dithering disabled, even if 8 bit would be better with it enabled. You have to do your testing with the same settings you actually use when watching, with dithering enabled.
You're not testing for bit depth, though. You're trying to find the best combo with regard to banding, and from what I've seen, the best combo stays the best whether dithering is on or off. I guess what I'm trying to say is that I find 4:2:2 10-bit without dithering to be smoother even compared to 4:4:4 or RGB with dithering on. Hope that makes sense.

...and not to confuse anyone here: the above is meant for LG OLEDs only.
__________________
Henry | LG OLED65C7P | Denon AVR-X3500H | ELAC Uni-Fi x7 | ELAC Debut 2.0 SUB3030 x2 | NVIDIA SHIELD TV Pro 2019 | Plex

Last edited by VBB; 24th January 2020 at 21:05.
VBB is offline   Reply With Quote
Old 24th January 2020, 21:48   #58424  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,406
But you would come to exactly the same conclusion if you left dithering enabled for all your testing. I assume 10 bit 4:2:2 with dithering is still smoother than 4:4:4 or RGB with dithering, if it is smoother without dithering.

Aren't we testing bit depth? 8 vs. 10?

I am arguing against testing without dithering; it never tells you anything useful about your video path.

Edit: In my testing, on a C9 in PC mode, 10 bit without dithering is better than 8 bit without dithering (for banding) but 8 bit with dithering is better than 10 bit with dithering.
__________________
madVR options explained

Last edited by Asmodian; 24th January 2020 at 22:16.
Asmodian is offline   Reply With Quote
Old 24th January 2020, 22:33   #58425  |  Link
DMU
Registered User
 
Join Date: Dec 2018
Posts: 207
Quote:
Originally Posted by Asmodian View Post
You do get different chroma... whether or not it is worse is more debatable. In testing it does seem better to handle chroma the same as luma, but it is one of the lowest-loss options under "trade quality for performance".
I was also very interested in this question, so I did some tests.
Pic 1 - "scale chroma separately, if it saves performance" - OFF
Pic 2 - "scale chroma separately, if it saves performance" - ON
As you can see, the double conversion of chroma noticeably degrades image quality. Please correct me if I did something wrong.
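For intuition only (this is a generic sketch with a deliberately soft toy scaler, not madVR's actual chroma pipeline): every real-world resampling pass low-passes the signal a little, so chroma that goes through two scaling steps ends up measurably softer than chroma scaled once.

Code:

# Compare scaling a subsampled chroma signal to the target size in one pass
# versus via an intermediate size (two passes). The "scaler" below is a toy:
# linear interpolation plus a mild blur standing in for a real reconstruction kernel.
import numpy as np

def resize(signal, new_len):
    x_old = np.linspace(0.0, 1.0, len(signal))
    x_new = np.linspace(0.0, 1.0, new_len)
    y = np.interp(x_new, x_old, signal)
    return np.convolve(y, [0.25, 0.5, 0.25], mode="same")

n_src, n_mid, n_dst = 960, 1920, 3840
truth = 0.5 + 0.05 * np.sin(np.linspace(0.0, 240.0 * np.pi, n_dst))  # fine chroma detail
chroma_src = resize(truth, n_src)                       # chroma as stored (subsampled)

one_pass = resize(chroma_src, n_dst)                    # straight to the target size
two_pass = resize(resize(chroma_src, n_mid), n_dst)     # via an intermediate size

for name, y in [("one pass  ", one_pass), ("two passes", two_pass)]:
    rms = np.sqrt(np.mean((y - truth) ** 2))
    print(f"{name} RMS error vs. original detail: {rms:.5f}")

The absolute numbers mean nothing, but in this toy the two-pass chain always ends up with more error than the single pass, which matches the direction of what the screenshots show.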
__________________
R3 3200G / Vega8 / Samsung UE40NU7100
Win11Pro 21H2 / 4K RGB 59Hz / AMD last driver
MPC-HC 1.9.17 / madVR 0.92.17 / FSW / SM / 8bit
DMU is offline   Reply With Quote
Old 24th January 2020, 22:45   #58426  |  Link
VBB
Registered User
 
VBB's Avatar
 
Join Date: May 2016
Location: Long Beach, CA, USA
Posts: 620
Quote:
Originally Posted by Asmodian View Post
But you would come to exactly the same conclusion if you left dithering enabled for all your testing. I assume 10 bit 4:2:2 with dithering is still smoother than 4:4:4 or RGB with dithering, if it is smoother without dithering.

Aren't we testing bit depth? 8 vs. 10?

I am arguing against testing without dithering; it never tells you anything useful about your video path.

Edit: In my testing, on a C9 in PC mode, 10 bit without dithering is better than 8 bit without dithering (for banding) but 8 bit with dithering is better than 10 bit with dithering.
For me it's more about seeing the display's raw performance and what works best with its built-in processing. My final testing is always with dithering on. PC mode is special.
__________________
Henry | LG OLED65C7P | Denon AVR-X3500H | ELAC Uni-Fi x7 | ELAC Debut 2.0 SUB3030 x2 | NVIDIA SHIELD TV Pro 2019 | Plex
VBB is offline   Reply With Quote
Old 24th January 2020, 22:56   #58427  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,406
That is the thing though, this myth about "seeing the display's raw performance". 10 bit will always be better. You are not testing your display's raw performance; you are noticing that 10 bit has more steps than 8 bit.
__________________
madVR options explained
Asmodian is offline   Reply With Quote
Old 24th January 2020, 23:05   #58428  |  Link
VBB
Registered User
 
VBB's Avatar
 
Join Date: May 2016
Location: Long Beach, CA, USA
Posts: 620
But I'm not even bringing up 10-bit. You did. I wholeheartedly agree that madVR's 8-bit dithering is indistinguishable from 10-bit. It was merely a coincidence that the best mode for me happens to be 10-bit. I would have happily picked 8-bit if it had been better in this case.
__________________
Henry | LG OLED65C7P | Denon AVR-X3500H | ELAC Uni-Fi x7 | ELAC Debut 2.0 SUB3030 x2 | NVIDIA SHIELD TV Pro 2019 | Plex
VBB is offline   Reply With Quote
Old 24th January 2020, 23:15   #58429  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,406
This was a discussion in response to nsnhd about turning off dithering and noticing that 10 bit was smoother than 8 bit and thinking that it meant something.

Edit: You decided YCbCr 10 bit 422 looked better than YCbCr 8 bit 422 with dithering disabled or enabled? There is an implicit choice between 8 and 10 bit in your preferred mode.
__________________
madVR options explained

Last edited by Asmodian; 24th January 2020 at 23:20.
Asmodian is offline   Reply With Quote
Old 24th January 2020, 23:21   #58430  |  Link
mrmojo666
Registered User
 
Join Date: Jan 2017
Posts: 107
@mclingo, thank you for the explanation, I'm going to read links
__________________
AMD Ry 1500x - 8GB - RX460 4GB
TV Philips 55pus6501+ Marantz 1608 avr
WIN10(1903) 4K/444RGB
Mediaportal - Mpc-hc
MADVR-D3D11/10bit
mrmojo666 is offline   Reply With Quote
Old 24th January 2020, 23:40   #58431  |  Link
VBB
Registered User
 
VBB's Avatar
 
Join Date: May 2016
Location: Long Beach, CA, USA
Posts: 620
Quote:
Originally Posted by Asmodian View Post
This was a discussion in response to nsnhd about turning off dithering and noticing that 10 bit was smoother than 8 bit and thinking that it meant something.

Edit: You decided YCbCr 10 bit 422 looked better than YCbCr 8 bit 422 with dithering disabled or enabled? There is an implicit choice between 8 and 10 bit in your preferred mode.
That's why I said in response to nsnhd's post that there is no good way to test for bit depth. Instead, you can test for banding. And again, my final choice for using 10-bit wasn't based purely on bit depth, but with 4:2:2, 10-bit is banding-free while 8-bit is far from it. I did not find the same to be true using any other combo. 8-bit always wins (compared to 12-bit, when there is a choice).

Maybe I should rephrase: When testing for banding, whether you do it with or without dithering, your final choice should always be based on what looks best the way you actually watch it: with dithering enabled.
__________________
Henry | LG OLED65C7P | Denon AVR-X3500H | ELAC Uni-Fi x7 | ELAC Debut 2.0 SUB3030 x2 | NVIDIA SHIELD TV Pro 2019 | Plex

Last edited by VBB; 24th January 2020 at 23:47.
VBB is offline   Reply With Quote
Old 25th January 2020, 00:54   #58432  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,406
Ok, totally agree.

But I still think testing with madVR's and/or the GPU's dithering disabled is pointless, except academically.
__________________
madVR options explained
Asmodian is offline   Reply With Quote
Old 25th January 2020, 05:44   #58433  |  Link
nsnhd
Registered User
 
Join Date: Jul 2016
Posts: 130
After your discussions I've redone the testing; from now on madVR's dithering is always enabled (Ordered Dithering).
When the GPU output is set to 10-bit RGB Full in NCP, I can see no difference between 8-bit and 10-bit output from madVR in FSW when playing gradient files.
But when the GPU is set to 8-bit, and of course madVR's FSW output is also changed to 8-bit, I now see more banding compared to the 10-bit GPU setting above.

So, what can I conclude about my monitor now, or still nothing?
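For anyone who wants to reproduce this kind of test, a small script along these lines can generate a 16 bit grey gradient image to play through madVR (the resolution, file name and ramp range are arbitrary choices of mine; it assumes numpy and Pillow are installed):

Code:

# Write a shallow 16 bit greyscale ramp as a PNG; shallow, darkish gradients
# make banding easiest to spot.
import numpy as np
from PIL import Image

width, height = 3840, 2160
lo, hi = 0.10, 0.30                                    # dark-ish range, worst case for banding
row = (np.linspace(lo, hi, width) * 65535.0).astype("<u2")
frame = np.tile(row, (height, 1))                      # same ramp on every line

img = Image.frombytes("I;16", (width, height), frame.tobytes())
img.save("gradient_16bit.png")
print("wrote gradient_16bit.png")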
nsnhd is offline   Reply With Quote
Old 25th January 2020, 06:47   #58434  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
if the test is done correctly this would mean your screen is incorrectly handling 8 bit input, so it will add additional banding for pretty much every application.

the current stable version of madVR shouldn't stop you from sending 10 bit even when the GPU is set to 8 bit output. actually there is quite a good reason to still do that, but i guess madshi wants to work around an nvidia driver bug. i haven't checked in a long time.

you should use full screen exclusive for this test and not windowed full screen, because WFS is known to produce a lot of banding when the GPU outputs 8 bit but gets a 10 bit input.
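a rough sketch of why that happens (my reading of it, not actual DWM or driver code): if madVR's dithered 10 bit output is later rounded to 8 bit without any further dithering, the banding madVR removed comes straight back.

Code:

# Simulate a shallow gradient dithered to 10 bit, then converted to 8 bit either
# with dithering or by plain rounding (no dither).
import numpy as np

rng = np.random.default_rng(1)
ramp = np.linspace(0.20, 0.30, 1 << 15)            # shallow gradient, worst case for banding

def dither_quantize(x, bits):
    levels = (1 << bits) - 1
    return np.round(x * levels + rng.random(x.shape) - 0.5) / levels

madvr_out     = dither_quantize(ramp, 10)          # what madVR hands over at 10 bit
dithered_8bit = dither_quantize(madvr_out, 8)      # 10 -> 8 bit with dithering
rounded_8bit  = np.round(madvr_out * 255) / 255    # 10 -> 8 bit by plain rounding

def longest_flat_run(x):
    """Length of the longest stretch of consecutive identical output values."""
    change = np.flatnonzero(np.diff(x))
    edges = np.concatenate(([-1], change, [len(x) - 1]))
    return int(np.diff(edges).max())

for name, y in [("10 bit dithered", madvr_out),
                ("8 bit dithered ", dithered_8bit),
                ("8 bit rounded  ", rounded_8bit)]:
    print(f"{name}  longest flat run: {longest_flat_run(y):5d} samples")

the plain rounding leaves flat runs on the order of a thousand samples (visible bands), while both dithered versions break the steps up into runs of only a few samples.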
huhn is offline   Reply With Quote
Old 25th January 2020, 10:59   #58435  |  Link
glc650
Registered User
 
Join Date: May 2003
Posts: 77
I've noticed that with 23.976 HDR content played back at 23.976, sections of the screen are extra blurry (almost like they are pulsating/flickering) when panning (e.g. a fence, or branches/leaves on a tree in the background). If I play back at 59.940 (with smooth motion off; with it on the issue is still there), this almost goes away.

If I turn off HDR output (i.e. use "tone map HDR using pixel shaders"), this issue is completely gone at any frame rate, with or without smooth motion.

This happens even after restoring default madvr settings.

Last edited by glc650; 25th January 2020 at 11:04.
glc650 is offline   Reply With Quote
Old 27th January 2020, 13:13   #58436  |  Link
chros
Registered User
 
chros's Avatar
 
Join Date: Mar 2002
Posts: 2,323
Quote:
Originally Posted by VBB View Post
If you're unsure whether your display is 8-bit or 10-bit, there really isn't a test to determine that 100% anyway. But if you know the bit depth, you can certainly test for banding in different desktop and TV modes. And for that, I would recommend disabling dithering in madVR. You still can't be sure that the video card itself doesn't do any dithering, but it's a start. Just make sure to turn it back on when you're done testing.
I understand what @VBB does and why, and I agree with him.
__________________
Ryzen 5 2600,Asus Prime b450-Plus,16GB,MSI GTX 1060 Gaming X 6GB(v398.18),Win10 LTSC 1809,MPC-BEx64+LAV+MadVR,Yamaha RX-A870,LG OLED77G2(2160p@23/24/25/29/30/50/59/60Hz) | madvr config
chros is offline   Reply With Quote
Old 27th January 2020, 17:15   #58437  |  Link
jasonwc18
Registered User
 
Join Date: Jan 2018
Posts: 20
I have my JVC RS600 calibrated for both BT.709 and BT.2020. In the past, I've accomplished this by using 3DLUTs for both color spaces, and madVR will automatically select the correct 3DLUT based on the video metadata. However, I upgraded to a Stewart ST130 G4 screen and the manual calibration result was so good that there's no real need for a 3DLUT. If I select "Disable calibration controls for this display", will that allow me to switch between BT.2020 and BT.709 content by just selecting the appropriate mode on my PJ? I thought that mode was equivalent to BT.709, Gamma Power 2.2.
jasonwc18 is offline   Reply With Quote
Old 28th January 2020, 04:53   #58438  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
no it will not.

in sdr output "Disable calibration controls for this display" is pretty much bt 709 gamma 2.2.
huhn is offline   Reply With Quote
Old 28th January 2020, 12:30   #58439  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 942
Quote:
Originally Posted by jasonwc18 View Post
I have my JVC RS600 calibrated for both BT.709 and BT.2020. In the past, I've accomplished this by using 3DLUTs for both color spaces, and madvr will automatically select the correct 3DLUT based on the video metadata. However, I upgraded to a Stewart ST130 G4 screen and the manual calibration result was so good, there's no real need for a 3DLUT. If I select "Disable calibration controls for this display", will that allow me to switch between BT.2020 and BT.709 content by just selecting the appropriate mode on my PJ? I thought that mode was equivalent to BT.709, Gamma Power 2.2.
It depends on whether you're using the same calibration for both, and which device you're using.

Unfortunately the rs600 on its own is unable to detect SDR BT2020 vs rec-709 and switch automatically between a rec-709 and a BT2020 calibration. It's only able to detect HDR10 and apply its awful gamma D curve, which you definitely don't want.

So you have three ways to use madVR to automatically display the content correctly:

1) You're using an SDR BT2020 calibration for all content. This means that unless you need your iris fully open even for SDR content, you are most likely sacrificing black floor / contrast in SDR rec-709 to privilege brightness in HDR, running all with the iris fully open and using the P3 filter to get a wide gamut. In that case, assuming your two 3D LUTs use this same BT2020 baseline, madVR can switch between each LUT automatically. If you don't want to use the LUT anymore, simply specify "this display is already calibrated to" in the calibration tab, and specify SDR BT2020 and whichever gamma you've used in your calibration (usually 2.4, but it can be anything, as long as your target for the 3D LUT is 2.2 if you're using one for HDR tonemapping, though in that case it's recommended to target P3 rather than BT2020 to avoid posterization). madVR will automatically convert rec-709 content so that it displays correctly with your BT2020 calibration. There is no need for profiles.

2) You're using an SDR Rec709 calibration for all content. This means that you are most likely sacrificing brightness and gamut cover in HDR in order to maximise black floor / native contrast, not using the filter (unless you can't reach 709 without it) and closing the iris further to reach 50/60 nits in SDR rec-709 (or brighter if you don't have a dedicated room). If you don't need the 3D LUTs anymore, you simply specify "this display is already calibrated to" and you specify rec-709 and whichever gamma the display is actually calibrated to. MadVR will automatically tonemap BT2020 content to fit into your rec-709 calibration. There is no need for profiles.

3) You don't want to compromise either content type, so you have two calibrations on the JVC (rec-709, no filter, iris closed to get best black floor / native contrast and around 50-60nits peak) and SDR BT2020 (2020, P3 filter, iris open to get the highest peak brightness, hence headroom for HDR highlights, at the cost of a higher black floor / reduced native contrast). You will need profiles to tell madVR which calibration you're using for each content type. If you're not using 3D LUTs, you can specify "this display is already calibrated" to either rec-709 or BT2020 using a different profile for each content. In that case, you can either switch manually between the two calibrations on the JVC, or if you have an nVidia GPU and a Vertex/Maestro/Diva you can select the "report BT2020" box in the calibration tab for your BT2020 profile, and make sure it's unchecked in your rec-709 profile. The HD fury device will detect if the content is rec-709 or BT2020 according to the flag, and provided you have installed an rs-232 cable and programmed the device accordingly, it will select the correct calibration automatically, according to content. MadVR will select the correct profile according to content, and will know which calibration you're using thanks to each profile. See here for more info: https://www.avsforum.com/forum/24-di...l#post55408090

There are other ways to achieve this using batch files instead of an HD Fury device but they are even more complicated and don't work reliably in my experience.
__________________
Win11 Pro x64 b23H2
Ryzen 5950X@4.5Ghz 32Gb@3600 Zotac 3090 24Gb 551.33
madVR/LAV/jRiver/MyMovies/CMC
Denon X8500HA>HD Fury VRRoom>TCL 55C805K

Last edited by Manni; 28th January 2020 at 12:34.
Manni is offline   Reply With Quote
Old 28th January 2020, 16:41   #58440  |  Link
Wilmar
Registered User
 
Wilmar's Avatar
 
Join Date: Jan 2020
Posts: 3
Hi everyone!
I asked a question about HDR not working some time ago, but it seems to have got buried under the discussion above, so I decided to bump it.
Here’s the issue:
I can’t make HDR work properly with madVR and PotPlayer + 4K HDR-capable TV.
I enabled HDR in madVR settings – screenshot.
My MadVR settings config is here.
The TV says it’s getting HDR image. However, the colours don’t look HDR at all. Moreover, when played via the TV directly, the video looks properly. But playing it via the PC results in faded, dull and dark image.
I tried enabling and disabling Windows HDR – to no avail.
Here’s what PotPlayer says about the video with Windows HDR enabled and disabled respectively.
No Windows HDR
Windows HDR

My configuration:
Windows 10 64 bit 1903
Nvidia Geforce 1080Ti (Driver ver. 441.87)
MadVR ver. 0.92.17
PotPlayer 1.7.18346

clsid suggested that I disable the internal video decoder and video processor in PotPlayer and use the LAV Video decoder. However, I already use it, as seen in the screenshots:
https://i.imgur.com/HZOJZpJ.png
https://i.imgur.com/1MmvsgE.png
My exported Potplayer settings are here.

Am I doing something wrong?
Thanks.
Wilmar is offline   Reply With Quote