Old 12th May 2015, 15:13   #41  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
Quote:
Originally Posted by feelingblue View Post
Excuse me, but in the driver panel should we enable 10bit or not?

Based on the EDID, my video driver automatically sets YCbCr444 10bit.
I think this may be a way to tell whether the panel supports 10bit.

Is there a way to read the EDID info?
The EDID doesn't tell you whether your panel is 10 bit; it only tells you whether your TV accepts high bit depth input, and nearly all TVs accept 12 bit input.

AMD uses 10 bit and YCbCr by default if possible. If you select 8 bit in the AMD driver it will only output 8 bit, so yes, it should be at 10 bit.

First you should check whether your TV supports unlimited RGB; then you have to run some tests to see whether your TV can dither 10 bit input or display it natively.

If you don't fully understand how this works, you should simply stick with dithered 8 bit from madVR.

I'm just guessing that you are using AMD, because Nvidia and Intel don't have such an option in their drivers yet.
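
As an aside on reading the EDID: a minimal read-only sketch, assuming Windows and Python's standard winreg module, that dumps the raw EDID bytes Windows caches in the registry. The hex can be pasted into any EDID/CEA parser to check the deep colour flags, which, as said above, only describe what input the display accepts, not the panel's native bit depth.

Code:
import winreg

# Windows caches each monitor's EDID under the DISPLAY enumeration tree.
DISPLAY_ENUM = r"SYSTEM\CurrentControlSet\Enum\DISPLAY"

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, DISPLAY_ENUM) as root:
    for i in range(winreg.QueryInfoKey(root)[0]):
        model = winreg.EnumKey(root, i)
        with winreg.OpenKey(root, model) as model_key:
            for j in range(winreg.QueryInfoKey(model_key)[0]):
                instance = winreg.EnumKey(model_key, j)
                try:
                    with winreg.OpenKey(model_key, instance + r"\Device Parameters") as params:
                        edid, _ = winreg.QueryValueEx(params, "EDID")
                        print(model, instance, edid.hex())   # paste into an EDID parser
                except OSError:
                    pass   # instance without a cached EDID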
huhn is offline   Reply With Quote
Old 12th May 2015, 15:53   #42  |  Link
feelingblue
Registered User
 
Join Date: Nov 2011
Posts: 48
Thanks.

I don't have a TV but a projector. It works differently from a TV; there is no physical panel. The manufacturer declares support for 10bit, but if, as you said, the ATI driver and the EDID are not reliable, the only way is to run comparison tests.

My projector does support unlimited RGB.
feelingblue is offline   Reply With Quote
Old 12th May 2015, 16:02   #43  |  Link
RenderGuy2
Registered User
 
Join Date: Feb 2015
Posts: 31
It would be interesting to test and confirm that a 10bit signal can pass from madVR/MPDN through the video card without being dithered/altered along the way.

The only way I can think of to test this would be to use an HDMI capture card. The more expensive ones can capture/record uncompressed 10bit RGB. Even some of the less expensive cards can capture 4:2:0 10bit YUV. One could play some 10bit patterns out to the capture card. By analyzing the captured footage, it should be easy to determine if dithering occurred along the way.
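
For the test pattern side, a minimal sketch, assuming Python with NumPy and OpenCV and a made-up file name, that writes a 10 bit grayscale ramp into a 16 bit PNG container (similar in spirit to the ramp baii links later in the thread). If the whole chain preserves 10 bit, the played-back ramp should show 1024 distinct steps.

Code:
import numpy as np
import cv2

# 1024 distinct 10 bit levels, 2 pixels wide each -> a 2048 x 256 px ramp
levels10 = np.repeat(np.arange(1024, dtype=np.uint16), 2)
ramp10 = np.tile(levels10, (256, 1))

# store the 10 bit values in a 16 bit container (0..1023 -> 0..65472)
ramp16 = (ramp10 << 6).astype(np.uint16)

# OpenCV writes single-channel uint16 arrays as 16 bit grayscale PNG
cv2.imwrite("ramp_10bit_in_16bit.png", ramp16)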


Anyone own a capture card capable of 10bit recording?

Unfortunately I do not.
RenderGuy2 is offline   Reply With Quote
Old 12th May 2015, 16:12   #44  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
Quote:
Originally Posted by RenderGuy2 View Post
It would be interesting to test and confirm that a 10bit signal can pass from madVR/MPDN through the video card without being dithered/altered along the way.

The only way I can think of to test this would be to use an HDMI capture card. The more expensive ones can capture/record uncompressed 10bit RGB. Even some of the less expensive cards can capture 4:2:0 10bit YUV. One could play some 10bit patterns out to the capture card. By analyzing the captured footage, it should be easy to determine if dithering occurred along the way.


Anyone own a capture card capable of 10bit recording?

Unfortunately I do not.
We don't even know whether an 8 bit picture isn't altered.
huhn is offline   Reply With Quote
Old 12th May 2015, 16:20   #45  |  Link
noee
Registered User
 
Join Date: Jan 2007
Posts: 530
Quote:
Originally Posted by huhn View Post
The EDID doesn't tell you whether your panel is 10 bit; it only tells you whether your TV accepts high bit depth input, and nearly all TVs accept 12 bit input.

AMD uses 10 bit and YCbCr by default if possible. If you select 8 bit in the AMD driver it will only output 8 bit, so yes, it should be at 10 bit.

First you should check whether your TV supports unlimited RGB; then you have to run some tests to see whether your TV can dither 10 bit input or display it natively.

If you don't fully understand how this works, you should simply stick with dithered 8 bit from madVR.

I'm just guessing that you are using AMD, because Nvidia and Intel don't have such an option in their drivers yet.
Do you know if this is true with the 13.12 driver that does not actually have the bit depth option?

I ask because I cannot see a difference in the test pattern when sending 10bit from madVR with the AMD HDMI_DisableDither either on or off, so it appears that it just sends the 10bit signal on to the monitor without futzing with it.
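
For reference, a minimal read-only sketch (Python/winreg) that looks for that HDMI_DisableDither value. The display adapter class key below is the standard one, but where a given Catalyst release actually stores or honours the value is an assumption, so treat this as a starting point only.

Code:
import winreg

# Standard display adapter device class; driver-specific values usually sit in
# the numbered subkeys (0000, 0001, ...) below it. Whether HDMI_DisableDither
# lives here for your driver version is exactly the open question.
CLASS_KEY = r"SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}"

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, CLASS_KEY) as cls:
    i = 0
    while True:
        try:
            sub = winreg.EnumKey(cls, i)
        except OSError:
            break
        i += 1
        try:
            with winreg.OpenKey(cls, sub) as adapter:
                value, _ = winreg.QueryValueEx(adapter, "HDMI_DisableDither")
                print(sub, "HDMI_DisableDither =", value)
        except OSError:
            pass   # subkey without the value, or no read access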
__________________
Win7Ult || RX560/4G || Ryzen 5

Last edited by noee; 12th May 2015 at 16:23.
noee is offline   Reply With Quote
Old 12th May 2015, 16:22   #46  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
Quote:
Originally Posted by noee View Post
Do you know if this is true with the 13.12 driver that does not actually have the bit depth option?
It doesn't have this option.

I guess AMD is just outputting 10 bit in this case, if possible; it's just a black box on this driver.
huhn is offline   Reply With Quote
Old 12th May 2015, 16:31   #47  |  Link
James Freeman
Registered User
 
Join Date: Sep 2013
Posts: 919
Quote:
Originally Posted by MS-DOS
FRC works by combining four colour frames as a sequence in time, resulting in perceived mixture. In basic terms, it involves flashing between two colour tones rapidly to give the impression of a third tone, not normally available in the palette.
The question is: how fast?
madVR's dithering is limited to the display refresh rate (or even slower, the movie frame rate?).
Do you think FRC is faster than the display refresh rate? If so, it has an advantage over madVR at lower bit depths, where you can still see the pattern.
The faster the change, the smoother and less noticeable the pattern is.
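
To make the quoted description concrete, a toy sketch in Python (not any vendor's actual FRC algorithm; real panels use spatio-temporal patterns rather than whole-frame flashing) of how alternating two adjacent 8 bit tones over a short frame cycle averages out to an in-between 10 bit level:

Code:
def frc_sequence(level10, frames=4):
    """Approximate a 10 bit level on an 8 bit panel over one frame cycle."""
    target8 = level10 / 4.0                  # ideal fractional 8 bit value
    base = int(target8)                      # lower of the two tones
    high = round((target8 - base) * frames)  # frames shown at base + 1
    return [base + 1] * high + [base] * (frames - high)

seq = frc_sequence(513)          # 10 bit level 513 -> 128.25 in 8 bit terms
print(seq, sum(seq) / len(seq))  # [129, 128, 128, 128] averages 128.25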
__________________
System: i7 3770K, GTX660, Win7 64bit, Panasonic ST60, Dell U2410.

Last edited by James Freeman; 12th May 2015 at 16:37.
James Freeman is offline   Reply With Quote
Old 12th May 2015, 17:09   #48  |  Link
baii
Registered User
 
Join Date: Dec 2011
Posts: 180
It boils down to the lack of real test patterns/tools for this sort of thing. It is not hard to tell whether a panel does 8 or 10bit, but to compare 8bit dithered vs native 10bit vs 10bit dithered? You'd probably need 3+ monitors attached to GPUs, and the only thing that can make the judgement is the human eye.

8bit dither -> any bit depth panel -> smooth gradient
10bit native -> 8 bit or lower panel -> banding
10bit native -> 10bit / 8bit+FRC panel -> smooth gradient
10bit dither -> any bit depth panel -> smooth gradient

I guess one could take a picture and then try to compute the SNR; it would make a nice little science project.

Last edited by baii; 12th May 2015 at 17:12.
baii is offline   Reply With Quote
Old 12th May 2015, 17:34   #49  |  Link
RenderGuy2
Registered User
 
Join Date: Feb 2015
Posts: 31
Quote:
Originally Posted by huhn View Post
We don't even know whether an 8 bit picture isn't altered.
True, this would be interesting to test as well, though it should be easy to determine whether 10 bit data was ever reduced to 8 bits. For example, if you sent a 10 bit gradient out to the capture card and then viewed a histogram of the captured gradient, I imagine you would see evenly spaced gaps/peaks in the histogram if it was reduced to 8 bits along the way. Anyway, if nobody has a capture card, I guess we'll never know.
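
A minimal sketch of that analysis, assuming Python with NumPy and OpenCV, a made-up file name, and that the capture card saved the ramp as a 16 bit grayscale frame: count the distinct 10 bit levels along one scanline of the captured ramp.

Code:
import numpy as np
import cv2

# hypothetical file name; any uint16 frame grab of the ramp works the same way
frame = cv2.imread("captured_ramp.png", cv2.IMREAD_UNCHANGED)

row = frame[frame.shape[0] // 2]   # one scanline across the ramp
levels10 = np.unique(row >> 6)     # fold 16 bit samples back to 10 bit steps

print("distinct 10 bit levels:", len(levels10))
# ~1024 levels            -> the 10 bit gradient survived the chain
# ~256 evenly spaced ones -> something truncated/rounded to 8 bit
# ~256 plus in-betweens   -> 8 bit with dithering added somewhere along the way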
RenderGuy2 is offline   Reply With Quote
Old 12th May 2015, 17:41   #50  |  Link
MS-DOS
Registered User
 
Join Date: Sep 2012
Posts: 77
Quote:
Originally Posted by huhn View Post
Yes, it does.

And going to FSE 10 bit mode takes a very long time.
A few more questions if you don't mind: What's your monitor? Is it connected with DisplayPort or HDMI? Does the color format setting in CCC stay at RGB when you switch the output bit depth there to 10 bit?

Quote:
Originally Posted by James Freeman View Post
The question is how fast?
I couldn't find any information regarding this. Let's just say that, for me, it's good/fast enough to prefer it over an additional pass of normal dithering.
MS-DOS is offline   Reply With Quote
Old 12th May 2015, 18:00   #51  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
Quote:
Originally Posted by MS-DOS View Post
A few more questions if you don't mind: What's your monitor? Is it connected with DisplayPort or HDMI? Does the color format setting in CCC stay at RGB when you switch the output bit depth there to 10 bit?
It's an old Philips TV from 2011, week 11 (that's what's in the identification, at least).
Should be the PFL 4603 12, maybe...
I use HDMI.

Changing the bit depth in the driver doesn't change it to YCbCr or RGB limited; it is set to 10 bit by default anyway.

Edit: it's a 42PFL4606H/12; I found this on the back of the screen.

Last edited by huhn; 12th May 2015 at 18:17.
huhn is offline   Reply With Quote
Old 12th May 2015, 18:06   #52  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
Quote:
Originally Posted by RenderGuy2 View Post
True, this would be interesting to test as well. Though it should be easy to determine if 10 bit data was ever reduced to 8 bits. For example, if you sent a 10 bit gradient out to the capture card and then viewed a histogram of the captured gradient, I imagine you would see evenly spaced gaps/peaks in the histogram if it was reduced to 8 bits along the way. Anyway, if nobody has a capture card, I guess we'll never know.
I tested this already with my screen.

I disabled the dithering with that registry "hack" and disabled madVR's dithering.

When I set the GPU to output 8 bit I clearly saw banding. When I set it to 10 bit I saw only a little banding, so it looks like my 8 bit screen dithered the 10 bit coming from the GPU.

It would be very strange if the GPU dithered 10 bit down to 8 bit when set to output 10 bit but didn't dither when set to 8 bit, so it is pretty clear that the registry "hack" disabled the GPU's dithering and my screen is doing it.
huhn is offline   Reply With Quote
Old 12th May 2015, 18:19   #53  |  Link
James Freeman
Registered User
 
Join Date: Sep 2013
Posts: 919
Good news: the Panasonic ST60 takes 10bit perfectly with Nvidia through HDMI if the digital color format is YCbCr444 Limited.
EDIT: It also takes RGB 10bit if it is set to Limited range 16-235, but only 8bit if Full range 0-255.
In short, it takes 8-10 bit (or more) as long as the range is limited.

ryrynz,
You've got the VT50, don't you?
Try the test again; the VT50 and ST60 are practically the same.
__________________
System: i7 3770K, GTX660, Win7 64bit, Panasonic ST60, Dell U2410.

Last edited by James Freeman; 12th May 2015 at 18:41.
James Freeman is offline   Reply With Quote
Old 12th May 2015, 18:21   #54  |  Link
MS-DOS
Registered User
 
Join Date: Sep 2012
Posts: 77
Quote:
Originally Posted by huhn View Post
It's an old Philips TV from 2011, week 11 (that's what's in the identification, at least).
Should be the PFL 4603 12, maybe...
I use HDMI.

Changing the bit depth in the driver doesn't change it to YCbCr or RGB limited; it is set to 10 bit by default anyway.
Thanks. Couldn't find anything regarding "PFL 4603" though...

Guess I should just install 14.12 and connect my U2212HM via DisplayPort someday to see what's going on there. Not that there will be any benefit for a 6 bit + A-FRC panel, but it's still interesting to see whether it can handle a 10 bit signal at all.
MS-DOS is offline   Reply With Quote
Old 12th May 2015, 18:24   #55  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
Quote:
Originally Posted by MS-DOS View Post
Thanks. Couldn't find anything regarding "PFL 4603" though...

Guess I should just install 14.12 and connect my U2212HM via DisplayPort someday to see what's going on there. Not that there will be any benefit for a 6 bit + A-FRC panel, but it's still interesting to see whether it can handle a 10 bit signal at all.
The name of the screen is 42PFL4606H/12.

I edited it into my post.
huhn is offline   Reply With Quote
Old 12th May 2015, 18:34   #56  |  Link
RenderGuy2
Registered User
 
Join Date: Feb 2015
Posts: 31
Quote:
Originally Posted by huhn View Post
I tested this already with my screen.

I disabled the dithering with that registry "hack" and disabled madVR's dithering.

When I set the GPU to output 8 bit I clearly saw banding. When I set it to 10 bit I saw only a little banding, so it looks like my 8 bit screen dithered the 10 bit coming from the GPU.

It would be very strange if the GPU dithered 10 bit down to 8 bit when set to output 10 bit but didn't dither when set to 8 bit, so it is pretty clear that the registry "hack" disabled the GPU's dithering and my screen is doing it.

Yes, it would be nice to confirm 100% that the GPU isn't performing any dithering. It sounds promising for AMD after the registry edit. Does anyone know if something similar needs to be/can be done for Nvidia?
RenderGuy2 is offline   Reply With Quote
Old 12th May 2015, 18:40   #57  |  Link
James Freeman
Registered User
 
Join Date: Sep 2013
Posts: 919
Quote:
Originally Posted by RenderGuy2 View Post
Does anyone know if something similar needs to be/can be done for Nvidia?
Nvidia does not dither.
__________________
System: i7 3770K, GTX660, Win7 64bit, Panasonic ST60, Dell U2410.
James Freeman is offline   Reply With Quote
Old 12th May 2015, 18:41   #58  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
Quote:
Originally Posted by RenderGuy2 View Post
Yes, it would be nice to confirm 100% that the GPU isn't performing any dithering. It sounds promising for AMD after the registry edit. Does anyone know if something similar needs to be/can be done for Nvidia?
I don't think this registry edit is even needed; to be honest, it's better not to do it. If the AMD driver gets 10 bit but can't output 10 bit to the screen, it will not dither down to 8 bit and the result is bad.
huhn is offline   Reply With Quote
Old 12th May 2015, 18:50   #59  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
I just tried my HD4400 with my TV.

madVR says the output is 10 bit, but the FSE change is instant and the picture looks like undithered 8 bit, so I guess the Intel GPU isn't outputting 10 bit at all.
huhn is offline   Reply With Quote
Old 12th May 2015, 21:47   #60  |  Link
fairchild
Registered User
 
Join Date: Sep 2010
Posts: 321
Quote:
Originally Posted by baii View Post
I use a 16bit greyscale ramp PNG (converted from a 10 bit test ramp .psd, which should be the same as the one from AMD) that was made to test OpenGL 30bit support in Photoshop.

It makes it much easier to spot whether the panel does 10 bit.

https://mega.co.nz/#!OYoQ2IID!0CzRLl...R70VyqpKROztK4

PS: enable RGB48 in LAV Filters.
I just tested my Sony 32EX400 LCD and it appears to work (with dithering) using the provided 10 bit test ramp. It's very easy to see, and the screen does indeed seem to change modes when going FSE.

I'm going to test my Panasonic 55VT60 plasma next and see the results of that.

Edit: Just tested the 55VT60 and it's the same: both screens appear to smooth out both gradient patterns that I use when going FSE (I also used the one in the OP). Using 13.12 AMD drivers on my 7870 while in RGB Limited.

Maybe someone can answer something for me though: after I have verified that 10-bit is actually working on my setup, do I have to leave the calibration options as they are, or can I go back to how it was set up before, i.e. saying that my display is calibrated to BT.709 with "disable GPU gamma ramps" unchecked?
__________________
MPC-HC/MPC-BE, Lav Filters, MadVR
CPU: AMD Ryzen 5 1600, Video: AMD Radeon RX Vega 56 -> TCL S405 55", Audio: Audio-Technica M50S

Last edited by fairchild; 12th May 2015 at 22:04.
fairchild is offline   Reply With Quote