Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion.

Before you start posting please read the forum rules. By posting to this forum you agree to abide by the rules.

 

Old 17th June 2018, 00:57   #51381  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,127
It is not the end of the world to be stuck with 8 bits. I'm sure even the 8-bit FRC displays have optimized video processing for 10-bit inputs (I don't know about 12-bit), but it can't hurt to send 8 bits.

I read a post from an engineer a while ago that said internal testing of 8-bit FRC and native 10-bit panels was showing such minute differences that 8-bit panels were starting to win out in the cost/benefit analysis for more expensive displays. This is probably why neither Samsung nor Sony officially reports the bit depth of their displays. huhn found a post a while ago showing that all of the 2017 Samsung 120 Hz panels are 8-bit. I looked up one of them; the professional reviewer reported it as a 10-bit panel, and it passed the 10-bit gradient test. Parts sites for older models also show some UHD Samsung displays as 8-bit.
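As a rough illustration of what FRC does: the panel alternates between the two adjacent 8-bit levels across successive frames, so the temporal average approximates a 10-bit level. A toy sketch, not any manufacturer's actual algorithm:

```python
def frc_frames(level10, n_frames=4):
    """Emit n_frames 8-bit levels whose temporal average approximates
    the 10-bit level (level10 / 4). Toy model of FRC, illustrative only."""
    target = level10 / 4.0            # ideal value on the 8-bit scale, e.g. 513/4 = 128.25
    base = int(target)                # lower adjacent 8-bit level
    frac = target - base              # fraction of frames spent on the upper level
    high = round(frac * n_frames)     # number of frames shown at base + 1
    return [base + 1] * high + [base] * (n_frames - high)

frames = frc_frames(513)              # 10-bit level 513 has no exact 8-bit code
avg = sum(frames) / len(frames)       # eye integrates the frames to ~128.25
```

The eye integrates the flicker, so the perceived level sits between two 8-bit codes; the debate above is essentially about whether that is measurably different from a native 10-bit drive.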
Old 17th June 2018, 02:07   #51382  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,407
I know my 2017 LG OLED does better with 8-bit input than with 10 or 12 bit (less banding).
__________________
madVR options explained
Old 17th June 2018, 02:07   #51383  |  Link
Crimson Wolf
Registered User
 
Join Date: Dec 2014
Posts: 51
Quote:
Originally Posted by XTrojan
Anyone know how you can force Nvidia to switch to 12-bit output whenever madVR switches the refresh rate to 23/24 Hz? It's at 8-bit every time, so I have to go to settings and change it to 12-bit.
Is this with a 4K TV? I'm only at 1080p, but I do not have this problem. I can switch to any refresh rate (23, 24, 59, 60 Hz) and 12-bit stays, even through a reboot.
Old 17th June 2018, 02:32   #51384  |  Link
brazen1
Registered User
 
Join Date: Oct 2017
Posts: 331
I read that guide the day you posted it. I thought it was spot on. I can vouch for the Samsung HDMI Black Level Normal/Low setting names. I uncheck the 'use inverse telecine' option in NCP. You might want to add Manage 3D Settings > Power Management Mode = Adaptive. In the Windows Program Settings tab, I select the 'Add' button and add every player .exe I use. Also, Configure Surround, and PhysX > Processor = your GPU. Your guide has no faults I can see.

A few driver releases ago, 12-bit stopped being retained after a reboot. This is no fault of madVR or how it matches refresh rates. I suppose it would be cool if madVR forced a bit depth after honoring a refresh rate, but why should it? That's on Nvidia, and if it used to work, it can work again once someone picks up the ball, making any madVR 'fix' redundant.

I bet many of the 8-bit panels are also 60 Hz. I can't believe any native 120 Hz/240 Hz interpolated panels are only 8-bit. I have an older Samsung; it's 10-bit. Getting a little off-topic.
__________________
HOW TO-Kodi 2D-3D-UHD (4k) HDR Guide Internal & External Players
W11 Pro 24H2 GTX960-4GB RGB 4:4:4 @Matched Refresh Rates 8,10,12bit
KODI 22 MPC-HC/BE 82" Q90R Denon S720W
Old 17th June 2018, 02:36   #51385  |  Link
brazen1
Registered User
 
Join Date: Oct 2017
Posts: 331
Isn't most 1080p, a.k.a. SDR, 8-bit tops? You may be selecting 12-bit driver output, but you are dithering to 8-bit for 1080p or SDR. Only UHD is higher than 8-bit. Is this correct?

About the 12-bit setting you say is retained after a reboot… are you using RGB Full or 4:2:2 Limited?
__________________
HOW TO-Kodi 2D-3D-UHD (4k) HDR Guide Internal & External Players
W11 Pro 24H2 GTX960-4GB RGB 4:4:4 @Matched Refresh Rates 8,10,12bit
KODI 22 MPC-HC/BE 82" Q90R Denon S720W

Last edited by brazen1; 17th June 2018 at 02:44.
Old 17th June 2018, 03:38   #51386  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,407
madVR uses 32-bit math and 16-bit surfaces; for output it can dither to any bit depth of 10 bits or lower. The GPU can then output 12-bit SDR 1080p or 8-bit HDR 2160p; bit depth, resolution, and HDR are not necessarily connected. The source bit depth matters only for the quality of the source: once we convert to RGB and scale for the display, we have >8-bit data even if we start with an 8-bit 1080p Blu-ray.
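The dithering step can be sketched in a few lines: quantize a high-precision value to N bits while adding noise, so information below one quantization step survives as noise rather than banding. A toy sketch using simple random dither (madVR's real kernels, ordered dithering and error diffusion, are more sophisticated than this):

```python
import random

def dither_to_bits(x, bits, rng=random.random):
    """Quantize x in [0.0, 1.0] to an integer code of the given bit depth,
    adding +/- half-step random dither so the quantization is unbiased."""
    levels = (1 << bits) - 1              # e.g. 255 for 8 bits
    v = x * levels + rng() - 0.5          # jitter before rounding
    return min(levels, max(0, round(v)))

# The average of many dithered 8-bit samples recovers the >8-bit value:
x = 0.5002                                # a value with no exact 8-bit code
codes = [dither_to_bits(x, 8) for _ in range(20000)]
mean = sum(codes) / len(codes) / 255      # very close to 0.5002
```

Each individual output is only 8-bit, but the dither noise carries the extra precision, which is why an 8-bit output of high-precision internal data can still look smooth.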
__________________
madVR options explained
Old 17th June 2018, 04:21   #51387  |  Link
SamuriHL
Registered User
 
 
Join Date: May 2004
Posts: 5,351
Quote:
Originally Posted by ryrynz
Asked about this some years ago and madshi said 10-bit wasn't exposed as something that could be switched to outside of Nvidia's control panel. Maybe now that he has a contact he might be able to suggest this as a feature to add. I hope they get 23 Hz profiling and the bit-depth bugs fixed this year...
Wouldn't that be nice. I've got the frame drop/repeat thing down to something I can live with, but it's nowhere near perfect. It's watchable... that's about it. And yes, the fact that it resets 12-bit on every reboot isn't very fun.
__________________
HTPC: Windows 11, AMD 5900X, RTX 3080, Pioneer Elite VSX-LX303, LG G2 77" OLED
Old 17th June 2018, 05:50   #51388  |  Link
Mano
Registered User
 
Join Date: Jul 2008
Posts: 54
Quote:
Originally Posted by Asmodian
I prefer XySubFilter. Though I believe there is a known issue madshi would like fixed, I never encounter any problems with it and I am not sure what the issue is. It renders to madVR in higher quality than the other options. Choose xy-VSFilter over VSFilter for performance.
Reading that thread, XySubFilter and xy-VSFilter are different entities, right?
Old 17th June 2018, 05:56   #51389  |  Link
XTrojan
Registered User
 
Join Date: Oct 2015
Posts: 88
Quote:
Originally Posted by Asmodian
madVR uses 32-bit math and 16-bit surfaces; for output it can dither to any bit depth of 10 bits or lower. The GPU can then output 12-bit SDR 1080p or 8-bit HDR 2160p; bit depth, resolution, and HDR are not necessarily connected. The source bit depth matters only for the quality of the source: once we convert to RGB and scale for the display, we have >8-bit data even if we start with an 8-bit 1080p Blu-ray.
Unsure what you mean by 8-bit HDR 2160p. Are you saying that with HDR 2160p enabled it always outputs at 8-bit? The info screen states 10-bit for me.

Either way, most people here seem to agree that, whether the source content is 10-bit or not, outputting 8-bit instead of 12-bit from the GPU makes a negligible difference?

Last edited by XTrojan; 17th June 2018 at 06:11.
Old 17th June 2018, 09:14   #51390  |  Link
gfxnow
Registered User
 
Join Date: Jul 2004
Posts: 45
Quote:
Originally Posted by Mano
Reading that thread, XySubFilter and xy-VSFilter are different entities, right?
Yes, they are different. xy-VSFilter is the old-school one. XySubFilter theoretically works better with madVR and should be updated soon to address the italics bug discovered by madshi.
Old 17th June 2018, 10:43   #51391  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,407
Quote:
Originally Posted by XTrojan
Unsure what you mean by 8-bit HDR 2160p. Are you saying that with HDR 2160p enabled it always outputs at 8-bit? The info screen states 10-bit for me.
I am saying that with HDR enabled it always outputs whatever you tell it to output. HDR does not require the output to be 8, 10, or 12 bit.
__________________
madVR options explained
Old 17th June 2018, 11:54   #51392  |  Link
sat4all
Registered User
 
Join Date: Apr 2015
Posts: 62
Hi fellas,

My IPTV provider just added some 4K streams for the World Cup using HLG HDR; I guess it's an open standard format like HDR10.
Are there any plans for support in LAV and madVR? Does Nvidia even support it?
__________________
ZOTAC MAGNUS EN1060K: Win 10 x64 + Kodi DSPlayer x64
LG OLED65C8 / Denon AVR-X3200W / KEF E305+ONKYO SKH-410 / Synology DS2415+ / Logitech Harmony 950

Last edited by sat4all; 17th June 2018 at 11:56.
Old 17th June 2018, 14:13   #51393  |  Link
veggav
Registered User
 
Join Date: Mar 2008
Posts: 80
I'm back with one more question:

Is it possible to base the automatic switch on the source?
If 720p, output 720p; if 1080p, output 1080p; and if 2160p, output 2160p?
And also have 23 Hz and 60 Hz for each?
Old 17th June 2018, 14:21   #51394  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,921
Not sure why you would do that, but yes.

Just type all the modes in. But as a reminder, I highly recommend using only the native resolution of your screen and nothing else.
Old 17th June 2018, 14:24   #51395  |  Link
SirSwede
Registered User
 
Join Date: Nov 2017
Posts: 69
Is there any way to change the settings of one madVR instance by mistake from another madVR instance running on a different computer in your home network, without having actively enabled LAN access?
Old 17th June 2018, 14:28   #51396  |  Link
SirSwede
Registered User
 
Join Date: Nov 2017
Posts: 69
Is it just me, or is NGU AA Chroma sharper than NGU Standard Chroma?
Old 17th June 2018, 14:50   #51397  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,921
Do you even look at an image that exposes this?
Old 17th June 2018, 15:46   #51398  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,127
Quote:
Originally Posted by sat4all
Hi fellas,

My IPTV provider just added some 4K streams for the World Cup using HLG HDR; I guess it's an open standard format like HDR10.
Are there any plans for support in LAV and madVR? Does Nvidia even support it?
HLG support would be nice. It is starting to become more common for TV broadcasts.
Old 17th June 2018, 15:53   #51399  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,127
Quote:
Originally Posted by brazen1
I read that guide the day you posted it. I thought it was spot on. I can vouch for the Samsung: HDMI Black Level - Normal/Low setting names. I uncheck the 'use inverse telecine' option in NCP. You might want to add the Manage 3D Settings/Power Management Mode = Adaptive.

I bet many of the 8-bit panels are also 60 Hz. I can't believe any native 120 Hz/240 Hz interpolated panels are only 8-bit. I have an older Samsung; it's 10-bit. Getting a little off-topic.
The power management setting could be mentioned. I don't think 3D needs to be discussed.

What display manufacturers are measuring when comparing 8-bit and 10-bit panels is noise. The video already has some noise, so detecting dithering noise is even harder to do. Their measurements seem to be showing imperceptible levels of noise in 8-bit panels compared to 10-bit panels, so they are still building 8-bit FRC panels, and some of these are high-end 120 Hz models. In some cases, you don't know until used parts are available what the actual bit depth is.

There is an article here from two years ago discussing this issue: https://www.avsforum.com/determining...nel-bit-depth/

Last edited by Warner306; 17th June 2018 at 15:56.
Old 17th June 2018, 16:42   #51400  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,921
Most of the high-end panels are 8-bit, and why not?

Modern 60 Hz UHD screens are usually 10-bit, even the cheaper ones; in the past nearly every FHD screen was 120 Hz, even the entry-level ones.
As for UHD 240 Hz not existing in the consumer space: no, not with VA; these panels can't even do 120 Hz properly. Maybe in 5 years, maybe with OLED.
And maybe 120 Hz at 10-bit is not doable yet with V-by-One.
And the last "and": every site that looks for banding to decide whether a panel is 8-bit or 10-bit doesn't understand bit depth; even a 6-bit screen will not produce banding if it is driven correctly. This has gotten so bad that a FHD 6-bit FRC screen gets a similar bit-depth rating as an 8-bit or 10-bit UHD screen, and UHD helps a lot in hiding dither noise.

Edit: for you, Warner: my Panasonic uses Normal, Full and Auto; Auto is working fine on my set.

Last edited by huhn; 17th June 2018 at 16:49.

Tags
direct compute, dithering, error diffusion, madvr, ngu, nnedi3, quality, renderer, scaling, uhd upscaling, upsampling
