Old 9th July 2019, 23:39   #56801  |  Link
mambans
Registered User
 
Join Date: Jun 2019
Location: Sweden
Posts: 28
Quick question. I know fullscreen exclusive is a bit buggy with HDR, but has anyone gotten it to work?

Mine won't enter exclusive mode, so I still get presentation glitches.
Old 10th July 2019, 16:07   #56802  |  Link
jespermart
Registered User
 
Join Date: Mar 2018
Posts: 22
Quote:
Originally Posted by seiyafan View Post
Anyone got Navi yet? Curious to know how it performs. =)
It performs great. I just swapped my RTX 2010 Ti out for a 5700 XT and have no more stuttering problems; the only issue is that 3D MVC has to be run in exclusive mode.
Settings are almost identical: I lowered NGU from very high to high in my 4K profile, and I didn't have to change anything in my other profiles.
Old 10th July 2019, 23:51   #56803  |  Link
jasonwc18
Registered User
 
Join Date: Jan 2018
Posts: 20
Most recent Nvidia driver that supports Custom Modes w/ 12 bit color?

The latest Nvidia 431.36 driver will only provide 8 bit color output from a custom resolution mode. This appears to be a driver regression, as I was using the same graphics card and projector (GTX 1070 and a JVC RS600) with a much older driver and got 12 bit output from my custom mode. I recently added a 65" OLED in the back of my theater for gaming and HDR, and upgraded the HTPC to a Ryzen 3600 to handle the gaming load. Everything else is the same as before, except I no longer have the option to set 12 bit color for any custom mode, regardless of resolution or refresh rate.

In contrast, for the standard modes, I can select 12 bit color for 2160p at 23, 24, 25, 29, and 30 Hz. If set to 59 or 60 Hz, it will use 8 bit color (due to the bandwidth limit of HDMI 2.0) but will revert to 12 bit color when using the 23, 24, 25, 29, or 30 Hz modes.
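For anyone wondering why 59/60 Hz forces 8 bit, the arithmetic is roughly as follows. This is just a back-of-the-envelope sketch assuming the standard 2160p60 timing and HDMI 2.0's commonly quoted limits, not anything from Nvidia's documentation:

Code:
# Rough check of why 2160p60 drops to 8 bit over HDMI 2.0. Assumes the
# standard CTA timing for 3840x2160@60 (4400 x 2250 total -> 594 MHz pixel
# clock) and HDMI 2.0's 18 Gbps TMDS limit (~14.4 Gbps of video payload
# after 8b/10b encoding). Illustrative numbers only.
PIXEL_CLOCK_HZ = 594e6
PAYLOAD_LIMIT_BPS = 14.4e9

for bpc in (8, 10, 12):
    rate = PIXEL_CLOCK_HZ * bpc * 3   # three RGB components per pixel
    verdict = "fits" if rate <= PAYLOAD_LIMIT_BPS else "exceeds HDMI 2.0"
    print(f"{bpc:2d} bpc RGB @ 2160p60: {rate / 1e9:5.2f} Gbps -> {verdict}")

# 8 bpc needs ~14.26 Gbps (just fits); 10 and 12 bpc need ~17.8 and ~21.4 Gbps,
# which is why the driver falls back to 8 bit (or chroma subsampling) at 60 Hz.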

If I wasn't gaming, I would just go back to my ancient Nvidia driver, but since I'll also be gaming on this system I'd like something relatively recent. So, can anyone tell me which recent driver versions still support 12 bit color with custom modes?

Observation: Now that I'm using the same HTPC to connect to two display devices (JVC RS600 projector and a 65" LG OLED), I discovered that the Nvidia driver only saves a single custom mode per resolution/Hz combination; it is not independent per device. As such, if you are going to connect more than one device, the same mode must work on all of them. I learned this the hard way. :/
Old 11th July 2019, 00:49   #56804  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,406
None of the new ones support >8 bit for custom modes.

However, more than 8 bit is unimportant when using madVR. The only impact is a bit more noise in the image; does 8 bit look too noisy to you? If you get someone to run a blind test with you, can you tell which is which?
__________________
madVR options explained
Old 11th July 2019, 01:53   #56805  |  Link
tp4tissue
Registered User
 
 
Join Date: May 2013
Posts: 706
@jasonwc18 Just make a madVR partition.

I have entirely separate system partitions for the various programs that I NEED to work.

I have separate custom partitions for StarCraft Remastered, SolidWorks 14, SolidWorks 18, Photoshop, and madVR.

I also have multiple computers and multiple monitors at the same desk.

This is the ghetto way to fix things, just throwing more computers at it, but it guarantees low contamination and everything works.
__________________
Ghetto | 2500k 5Ghz
Old 11th July 2019, 03:15   #56806  |  Link
jasonwc18
Registered User
 
Join Date: Jan 2018
Posts: 20
Quote:
Originally Posted by Asmodian View Post
None of the new ones support >8 bit for custom modes.

However, more than 8 bit is unimportant when using madVR. The only impact is a bit more noise in the image; does 8 bit look too noisy to you? If you get someone to run a blind test with you, can you tell which is which?
What about for native 10 bit sources (4K HDR Blu-Rays)? That was my concern. I assumed it wouldn't matter for 8 bit content.

For those curious, I tested 416.81, 417.71, and 418.81 (with a DDU uninstall between each), and none of them could do 12 bit color with a custom mode. I chose those versions because people in this thread reported that they worked - but that was on an earlier version of Windows 10 (1803; I'm using 1903 for the Ryzen scheduling update). However, MS's default Nvidia driver, 388.43 (IIRC), does work. Unfortunately, it's from Nov 2017.

Last edited by jasonwc18; 11th July 2019 at 03:21.
Old 11th July 2019, 03:25   #56807  |  Link
SamuriHL
Registered User
 
 
Join Date: May 2004
Posts: 5,351
8 bit is fine for UHD. With madVR and a good dithering algorithm choice, you won't notice any difference whatsoever. A bunch of us use 8 bit by choice because of banding issues with our LG OLED panels. I'm not (yet) doing a custom res on mine so I can't help there, but as far as 8 bit goes, it's really not an issue with madVR.
__________________
HTPC: Windows 11, AMD 5900X, RTX 3080, Pioneer Elite VSX-LX303, LG G2 77" OLED
Old 11th July 2019, 03:33   #56808  |  Link
jasonwc18
Registered User
 
Join Date: Jan 2018
Posts: 20
Quote:
Originally Posted by SamuriHL View Post
8 bit is fine for UHD. With madVR and a good dithering algorithm choice, you won't notice any difference whatsoever. A bunch of us use 8 bit by choice because of banding issues with our LG OLED panels. I'm not (yet) doing a custom res on mine so I can't help there, but as far as 8 bit goes, it's really not an issue with madVR.
That's great to hear. I will also be using an LG OLED for HDR, so if it's fine on that, 8 bit isn't a problem. I am using Error Diffusion, Option 1 for dithering.

Last edited by jasonwc18; 11th July 2019 at 03:35.
Old 11th July 2019, 03:47   #56809  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,406
Quote:
Originally Posted by jasonwc18 View Post
What about for native 10 bit sources (4K HDR Blu-Rays)? That was my concern. I assumed it wouldn't matter for 8 bit content.
Even for 10 bit sources. As soon as you send the video to madVR it is converted to 16 bit data (4:2:0 YCbCr -> RGB). During final presentation madVR dithers this down to anything from 1 to 10 bit, as configured; the lower the bit depth, the higher the noise. The difference between well-dithered 8 and 10 bit output is basically undetectable because the increase in noise is very subtle. Try comparing 6 bit to 8 bit output to see a stronger version of the effect: 6 bit versus 8 bit is many times more significant than 8 bit versus 10 bit, yet it is still pretty subtle. Dithering is remarkably powerful; another interesting test is to set madVR to 4 bit output and compare ordered dithering to dithering disabled.

It is different for lossily compressed content (i.e. everything we watch), because the dithering cannot be completely preserved by the lossy compression. 10 bit is important for HDR Blu-rays on the disc, just not when we are watching them.

Some displays actually look worse when given 10 bit data, even when they do support 10 bit (e.g. the LG C7), because the internal video processing handles 10 bit worse than 8 bit. Try a blind test; at best, most people have to use special test patterns to tell if 10 bit is even working, and if you cannot tell whether it is working, why care if it is?

Use 10 bit if you can and your display supports it properly, since it is theoretically better, but do not make any sacrifices to get it. Also make sure to test it and decide what looks better to you; 10 bit could be worse.
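If you want to see roughly what this looks like in numbers, here is a minimal numpy sketch. It uses simple random dithering on a synthetic gradient, so it only illustrates the principle; it is not madVR's actual error diffusion:

Code:
# Quantize a smooth gradient to N bits with and without dithering.
# Plain numpy illustration, not madVR's actual algorithm.
import numpy as np

def quantize(x, bits, dither=False):
    levels = 2**bits - 1
    noise = np.random.uniform(-0.5, 0.5, x.shape) if dither else 0.0
    return np.round(x * levels + noise).clip(0, levels) / levels

gradient = np.linspace(0.0, 1.0, 3840)   # one smooth horizontal ramp
for bits in (4, 6, 8, 10):
    banded = quantize(gradient, bits)                  # visible steps (banding)
    dithered = quantize(gradient, bits, dither=True)   # steps traded for noise
    steps = len(np.unique(banded))
    rms = np.sqrt(np.mean((dithered - gradient) ** 2))
    print(f"{bits:2d} bit: {steps:4d} distinct levels undithered, "
          f"dithered RMS noise ~{rms:.5f}")

The undithered ramp collapses into a handful of visible steps at low bit depths, while the dithered version keeps the gradient smooth at the cost of a little noise, and the noise difference between 8 and 10 bit is tiny.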
__________________
madVR options explained
Old 11th July 2019, 03:53   #56810  |  Link
jasonwc18
Registered User
 
Join Date: Jan 2018
Posts: 20
Quote:
Originally Posted by Asmodian View Post
Even for 10 bit sources. As soon as you send the video to madVR it is converted to 16 bit data (4:2:0 YCbCr -> RGB). During final presentation madVR dithers this down to anything from 1 to 10 bit, as configured; the lower the bit depth, the higher the noise. The difference between well-dithered 8 and 10 bit output is basically undetectable because the increase in noise is very subtle. Try comparing 6 bit to 8 bit output to see a stronger version of the effect: 6 bit versus 8 bit is many times more significant than 8 bit versus 10 bit, yet it is still pretty subtle. Dithering is remarkably powerful; another interesting test is to set madVR to 4 bit output and compare ordered dithering to dithering disabled.

It is different for lossily compressed content (i.e. everything we watch), because the dithering cannot be completely preserved by the lossy compression. 10 bit is important for HDR Blu-rays on the disc, just not when we are watching them.

Some displays actually look worse when given 10 bit data, even when they do support 10 bit (e.g. the LG C7), because the internal video processing handles 10 bit worse than 8 bit. Try a blind test; at best, most people have to use special test patterns to tell if 10 bit is even working, and if you cannot tell whether it is working, why care if it is?

Use 10 bit if you can and your display supports it properly, since it is theoretically better, but do not make any sacrifices to get it. Also make sure to test it and decide what looks better to you; 10 bit could be worse.
Thanks for the detailed explanation. I know that a lot of regular Blu-Rays have noticeable banding, whereas the same sources in 10 bit 4K Blu-Ray do not suffer the same banding. I didn't realize this was a result of compression rather than the bit depth of the raw output.

Moreover, I am likely in a similar situation given that I'll be watching HDR content on an LG OLED with the same video processing issue. In contrast, my JVC RS600 will handle 12 bit color just fine, but it's not bright enough to really enjoy HDR.

If I understand your post correctly, I should tell madVR to dither to 8 bit so that the NVIDIA driver doesn't do any further dithering of its own to get to 8 bit output?

Last edited by jasonwc18; 11th July 2019 at 03:59.
Old 11th July 2019, 04:07   #56811  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,406
Quote:
Originally Posted by jasonwc18 View Post
If I understand your post correctly, I should tell madVR to dither to 8 bit so that the NVIDIA driver doesn't do any further dithering of its own to get to 8 bit output?
Good point. Definitely. Double dithering will have higher noise and I like madVR's dithering more.
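If anyone wants to convince themselves, here is a quick numpy sketch comparing a single dither to 8 bit against dithering to 10 bit and then re-dithering to 8 bit. It uses simple random dithering as a stand-in for both madVR's and the driver's actual methods:

Code:
# Compare one dithering pass (straight to 8 bit) against two passes
# (10 bit, then 8 bit). Simple random dithering, for illustration only.
import numpy as np

def dither_to(x, bits):
    levels = 2**bits - 1
    noise = np.random.uniform(-0.5, 0.5, x.shape)
    return np.round(x * levels + noise).clip(0, levels) / levels

src = np.random.rand(1_000_000)            # stand-in for 16 bit source values
once = dither_to(src, 8)                   # renderer dithers directly to 8 bit
twice = dither_to(dither_to(src, 10), 8)   # renderer to 10 bit, driver to 8 bit

rms = lambda a: np.sqrt(np.mean((a - src) ** 2))
print(f"single dither to 8 bit: RMS error {rms(once):.6f}")
print(f"10 bit then 8 bit:      RMS error {rms(twice):.6f}")  # slightly higher

The double-dithered version always measures a little noisier, which is why letting madVR dither straight to the output bit depth is preferable.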
__________________
madVR options explained
Old 11th July 2019, 04:13   #56812  |  Link
SamuriHL
Registered User
 
 
Join Date: May 2004
Posts: 5,351
Yup, exactly that. The less the driver does to the output on the way to the display, the better.

Sent from my SM-G975U using Tapatalk
__________________
HTPC: Windows 11, AMD 5900X, RTX 3080, Pioneer Elite VSX-LX303, LG G2 77" OLED
Old 12th July 2019, 03:37   #56813  |  Link
alps006
Registered User
 
Join Date: Sep 2018
Posts: 22
I played around with the numbers under the "apply target dynamic nits selection" setting, and no matter what I changed, I felt the brightness was lacking and the picture looked a bit dull. Eventually I unchecked it, and voila! About 20-30% more brightness and the picture looked great! In theory, should I leave it checked or unchecked? I prefer the latter. Thank you!

Last edited by alps006; 12th July 2019 at 04:50.
Old 12th July 2019, 12:49   #56814  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,127
Leave it unchecked if you prefer brightness over everything else. The tradeoff is that you will clip the specular highlights as they appear. It is HDR. You need contrast to make it look accurate.

Lowering the dynamic tuning value will cause the targets to be lower and the image will get brighter as a result. But it won't be as bright as using the static display peak you entered.
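To make the tradeoff concrete, here is a toy sketch of what a tone-mapping target does. It is deliberately simplified (a linear scale-and-clip, an assumed 700-nit display peak, and made-up scene values), not madVR's actual tone-mapping curve:

Code:
# Toy model of the brightness vs. highlight tradeoff: lower targets brighten
# the picture but clip specular highlights. NOT madVR's actual algorithm.
def tone_map(scene_nits, target_nits, display_peak=700.0):
    # Map [0, target_nits] linearly onto [0, display_peak]; clip above that.
    return min(scene_nits * display_peak / target_nits, display_peak)

for target in (4000, 1000, 400):           # lower target = more aggressive map
    face = tone_map(100, target)           # typical diffuse / skin-tone level
    spark = tone_map(2000, target)         # bright specular highlight
    print(f"target {target:4d} nits: 100-nit face -> {face:6.1f} nits, "
          f"2000-nit highlight -> {spark:6.1f} nits")

As the target drops, the 100-nit face goes from about 18 to 70 to 175 nits on the display, but the 2000-nit highlight clips at the display peak once the target falls below it, which is exactly the brightness-for-contrast tradeoff described above.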
Old 12th July 2019, 12:58   #56815  |  Link
tp4tissue
Registered User
 
 
Join Date: May 2013
Posts: 706
In light of the banding issues introduced in Windows 10 beyond 1607, have you guys gone back and retested those OLEDs? Maybe it's Windows' problem.
__________________
Ghetto | 2500k 5Ghz
Old 12th July 2019, 20:59   #56816  |  Link
ryrynz
Registered User
 
 
Join Date: Mar 2009
Posts: 3,646
Dunno about the issues with earlier versions, but MS is investigating the banding issues reported with 1903. Once that's fixed, and maybe once there's some conclusive evidence on the cause of the various latency issues, I'll update.
Old 12th July 2019, 21:08   #56817  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,406
Quote:
Originally Posted by tp4tissue View Post
In light of the banding issues introduced in Windows 10 beyond 1607
It wasn't just beyond 1607; it was on 1903 specifically, and the OLED testing was not done on 1903 but on 1803 or 1809, or earlier.

Do you know about a banding issue other than the one introduced in 1903?

Edit: I did compare Windowed Overlay with FSE and Fullscreen Windowed on my C7 when I first got it, on 1703 I believe, and they all showed the same banding. Windowed Overlay bypasses the current issue on 1903, so I don't think the banding issues on the C7 (or C9) were caused by Windows. Windows has a new issue that adds some new banding, but LG's OLEDs do have some banding of their own. The 10 bit issues on the C7 are worse, and they manifested when switching between 8 and 10 bit in Windowed Fullscreen (or FSE), so they also do not seem like they could be related to the current banding issue.
__________________
madVR options explained

Last edited by Asmodian; 12th July 2019 at 22:43.
Old 12th July 2019, 22:56   #56818  |  Link
tp4tissue
Registered User
 
 
Join Date: May 2013
Posts: 706
Quote:
Originally Posted by Asmodian View Post
It wasn't just beyond 1607; it was on 1903 specifically, and the OLED testing was not done on 1903 but on 1803 or 1809, or earlier.

Do you know about a banding issue other than the one introduced in 1903?
That's exactly what I'm talking about: apparently dithering is disabled by default on Nvidia, which caused banding issues for people using monitor corrections.

They've figured out how to enable the dithering, but it only works unreliably on all versions after 1607.

So it's either 1607 or Windows 7 if you want dithering permanently enabled.

How this affects madVR, or whether it affects madVR at all, I don't know.

https://hub.displaycal.net/forums/to...th-windows-os/
__________________
Ghetto | 2500k 5Ghz
Old 12th July 2019, 23:14   #56819  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,406
That does not impact madVR at all; Nvidia should not be dithering when madVR is in use. This is what we were just discussing above.

Edit: I was using the TV's two-point white balance and madVR's 3DLUT for all color correction. I also tested with the TV's color correction disabled, just to see if it was to blame, but that had no impact on the banding. I have been unable to get good results with an ICC profile for a while; it might have started after 1603, to be honest, but I hadn't connected it to the dithering.
__________________
madVR options explained

Last edited by Asmodian; 12th July 2019 at 23:22.
Old 13th July 2019, 03:15   #56820  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
PC monitors that accept 10 bit input usually get a very good score on rtings, while TVs don't have anything close to that, LCDs included. OLEDs just became better known for this issue because they produce some banding even with 8 bit input, which is rare on LCDs.

Banding with 10 bit output is a common issue on TVs. It's a shame, because gaming monitors with very similar panel tech show that really good results are possible. It's even funnier because games usually don't dither, so they end up with banding anyway.