Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion. Before you start posting please read the forum rules. By posting to this forum you agree to abide by the rules.
12th May 2015, 15:13 | #41 | Link | |
Registered User
Join Date: Oct 2012
Posts: 7,903
|
Quote:
AMD uses 10 bit by default if possible, and YCbCr. If you select 8 bit in the AMD driver it will only output 8 bit, so yes, it should be at 10 bit. First you should check whether your TV supports unlimited RGB, then you have to run some tests to see whether your TV can dither 10 bit input or display it natively. If you don't fully understand how this works, you should simply stick with dithered 8 bit from madVR. I'm just guessing you are using AMD, because Nvidia and Intel don't have such an option in their drivers yet. |
|
12th May 2015, 15:53 | #42 | Link |
Registered User
Join Date: Nov 2011
Posts: 48
|
Thanks
I don't have a TV but a projector (VPR). It works differently from a TV; there is no physical panel. The manufacturer declares support for 10bit, but if, as you said, the ATI driver and the EDID are not reliable, the only way is to run comparison tests. Yes, my VPR supports unlimited RGB. |
12th May 2015, 16:02 | #43 | Link |
Registered User
Join Date: Feb 2015
Posts: 31
|
It would be interesting to test and confirm that a 10bit signal can pass from madVR/MPDN through the video card without being dithered/altered along the way.
The only way I can think of to test this would be to use an HDMI capture card. The more expensive ones can capture/record uncompressed 10bit RGB. Even some of the less expensive cards can capture 4:2:0 10bit YUV. One could play some 10bit patterns out to the capture card. By analyzing the captured footage, it should be easy to determine if dithering occurred along the way. Anyone own a capture card capable of 10bit recording? Unfortunately I do not. |
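A purely illustrative sketch (in Python) of the pattern side of this test: generate a ramp that hits every 10-bit code value, ready to be rendered and captured. The function name and the 16-bit container (code value shifted into the high bits) are assumptions for the sketch, not any real tool or file format:

```python
# Build a 1024-step horizontal ramp covering every 10-bit code value.
# Carrying the 10-bit codes in 16-bit words (value << 6) is a common
# convention, but here it is just an illustrative choice.

WIDTH, HEIGHT = 1024, 64

def make_10bit_ramp():
    """One row containing all 1024 code values, repeated for HEIGHT rows."""
    row = [x << 6 for x in range(WIDTH)]   # 10-bit codes in 16-bit words
    return [row[:] for _ in range(HEIGHT)]

ramp = make_10bit_ramp()
unique_codes = {v >> 6 for v in ramp[0]}
print(len(unique_codes))  # 1024 distinct 10-bit levels
```

If anything between the renderer and the capture card squeezed this through 8 bits, only ~256 of those 1024 codes would survive in the capture.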
12th May 2015, 16:12 | #44 | Link | |
Registered User
Join Date: Oct 2012
Posts: 7,903
|
Quote:
|
|
12th May 2015, 16:20 | #45 | Link | |
Registered User
Join Date: Jan 2007
Posts: 530
|
Quote:
I ask because I cannot see a difference in the test pattern when sending 10bit from madVR with the AMD HDMI_DisableDither either on or off, so it appears that it just sends the 10bit signal on to the monitor without futzing with it.
__________________
Win7Ult || RX560/4G || Ryzen 5 Last edited by noee; 12th May 2015 at 16:23. |
|
12th May 2015, 16:31 | #47 | Link | |
Registered User
Join Date: Sep 2013
Posts: 919
|
Quote:
madVR's dithering is limited to the display refresh rate (or even slower, the movie frame rate?). Do you think FRC is faster than the display refresh rate? If so, it has an advantage over madVR at lower bit depths, where you can still see the pattern. The faster the change, the smoother and less noticeable the pattern is.
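The averaging idea behind FRC can be shown with a toy model (Python; the function name and the 4-frame cycle are illustrative, not how any real panel firmware works): alternating between two adjacent 8-bit levels makes the temporal average land on an intermediate 10-bit level.

```python
# Toy model of temporal dithering (FRC): a display alternates between two
# adjacent 8-bit levels so the eye's temporal average approximates a
# 10-bit level. A 4-frame cycle covers the 4 sub-levels between codes.

def frc_frames(target_10bit, n_frames):
    """Emit n_frames 8-bit values whose average approaches target/4."""
    base = target_10bit // 4           # lower 8-bit neighbour
    frac = target_10bit % 4            # show base+1 this often (out of 4)
    return [base + (1 if (i % 4) < frac else 0) for i in range(n_frames)]

frames = frc_frames(513, 60)           # 10-bit 513 sits between 128 and 129
print(sum(frames) / len(frames))       # 128.25, i.e. exactly 513 / 4
```

The faster the alternation relative to what the eye can resolve, the less the flicker between 128 and 129 is visible; only the average remains.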
__________________
System: i7 3770K, GTX660, Win7 64bit, Panasonic ST60, Dell U2410. Last edited by James Freeman; 12th May 2015 at 16:37. |
|
12th May 2015, 17:09 | #48 | Link |
Registered User
Join Date: Dec 2011
Posts: 180
|
It boils down to the lack of real test patterns/tools for this sort of thing. It is not hard to tell if a panel does 8 or 10bit,
but to compare 8bit dither vs native 10bit vs 10bit dither? You'd probably need 3+ monitors attached to GPUs, and the only thing that can make the judgement is the human eye:

8bit dither -> any bitdepth panel -> smooth gradient
10bit native -> 8bit or lower panel -> banding
10bit native -> 10bit / 8bit+frc panel -> smooth gradient
10bit dither -> any bitdepth panel -> smooth gradient

I guess one could take pictures and then try to get the SNR; it would be a nice scientific project. Last edited by baii; 12th May 2015 at 17:12. |
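The truncation-vs-dither rows of that table can be roughly simulated in a few lines (Python sketch; the random dither and the mean-error metric are illustrative stand-ins for whatever a real GPU or panel actually does):

```python
import random

# Simulate two rows of the table: a 10-bit gradient truncated to 8 bit
# (banding: 4 input levels collapse onto each output level) vs the same
# gradient randomly dithered down to 8 bit (average stays on target).
random.seed(0)
ramp10 = list(range(1024))                        # 10-bit gradient

truncated = [v >> 2 for v in ramp10]              # plain 8-bit truncation
dithered  = [min(255, int(v / 4 + random.random())) for v in ramp10]

err_trunc = sum(abs(v / 4 - t) for v, t in zip(ramp10, truncated)) / 1024
err_dith  = sum(v / 4 - d for v, d in zip(ramp10, dithered)) / 1024

print(len(set(truncated)))   # 256 coarse levels -> visible steps
print(err_trunc, err_dith)   # truncation is biased; dither averages out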
12th May 2015, 17:34 | #49 | Link |
Registered User
Join Date: Feb 2015
Posts: 31
|
True, this would be interesting to test as well. Though it should be easy to determine if 10 bit data was ever reduced to 8 bits. For example, if you sent a 10 bit gradient out to the capture card and then viewed a histogram of the captured gradient, I imagine you would see evenly spaced gaps/peaks in the histogram if it was reduced to 8 bits along the way. Anyway, if nobody has a capture card, I guess we'll never know.
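That histogram check can be sketched quickly (Python; `populated_bins` is a hypothetical helper, and real captured data would be noisier than this clean simulation):

```python
# Given captured 10-bit samples of a smooth gradient, a histogram over all
# 1024 bins reveals whether the signal was squeezed through 8 bits anywhere:
# only ~256 evenly spaced bins would be populated.

def populated_bins(samples_10bit):
    """Return the indices of non-empty bins in a 1024-bin histogram."""
    hist = [0] * 1024
    for s in samples_10bit:
        hist[s] += 1
    return [i for i, n in enumerate(hist) if n]

intact  = list(range(1024))                     # true 10-bit capture
crushed = [(v >> 2) << 2 for v in range(1024)]  # went through 8 bits

print(len(populated_bins(intact)), len(populated_bins(crushed)))  # 1024 256
```

On real footage you would also see the populated bins sit a constant 4 codes apart when the path dropped to 8 bits, which is the "evenly spaced gaps/peaks" signature described above.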
|
12th May 2015, 17:41 | #50 | Link |
Registered User
Join Date: Sep 2012
Posts: 77
|
A few more questions if you don't mind: What's your monitor? Is it connected with DisplayPort or HDMI? Does the color format setting in CCC stay at RGB when you switch the output bit depth there to 10 bit?
I couldn't find any information regarding this. Let's just say, for me it's good/fast enough to prefer it over an additional portion of normal dithering. |
12th May 2015, 18:00 | #51 | Link | |
Registered User
Join Date: Oct 2012
Posts: 7,903
|
Quote:
It should be the PFL 4603 12, maybe... I use HDMI. Changing the bit depth in the driver doesn't change it to YCbCr or RGB limited; it is set to 10 bit by default anyway. edit: it's a 42PFL4606H/12, I found this on the back of the screen. Last edited by huhn; 12th May 2015 at 18:17. |
|
12th May 2015, 18:06 | #52 | Link | |
Registered User
Join Date: Oct 2012
Posts: 7,903
|
Quote:
I disabled the dithering with that registry "hack" and disabled madVR dithering. When I set the GPU to output 8 bit I clearly saw banding. When I set it to 10 bit I saw only a little banding; it looks like my 8 bit screen dithered the 10 bit coming from the GPU. It would be very strange if the GPU dithered 10 bit down to 8 bit when set to output 10 bit but didn't dither when set to 8 bit, so it is pretty clear that the registry "hack" disabled the GPU's dithering and my screen is doing it. |
|
12th May 2015, 18:19 | #53 | Link |
Registered User
Join Date: Sep 2013
Posts: 919
|
Good news: the Panasonic ST60 takes 10bit perfectly with Nvidia through HDMI if the digital color format is YCbCr444 Limited.
EDIT: It also takes RGB 10bit if it is set to Limited range 16-235, but only 8bit if Full range 0-255. In short, it takes 8-10 bit (or more) as long as the range is limited. ryrynz, you've got the VT50, don't you? Try the test again; the VT50 and the ST60 are practically the same.
__________________
System: i7 3770K, GTX660, Win7 64bit, Panasonic ST60, Dell U2410. Last edited by James Freeman; 12th May 2015 at 18:41. |
12th May 2015, 18:21 | #54 | Link | |
Registered User
Join Date: Sep 2012
Posts: 77
|
Quote:
Guess I should just install 14.12 and connect my U2212HM via DisplayPort someday to see what's going on there. Not that there will be any benefit for a 6 bit + A-FRC panel, but it's still interesting whether it can handle a 10 bit signal at all. |
|
12th May 2015, 18:24 | #55 | Link | |
Registered User
Join Date: Oct 2012
Posts: 7,903
|
Quote:
I edited it into my post. |
|
12th May 2015, 18:34 | #56 | Link | |
Registered User
Join Date: Feb 2015
Posts: 31
|
Quote:
Yes, it would be nice to confirm 100% that the GPU isn't performing any dithering. Sounds promising for AMD after the registry edit. Does anyone know if something similar needs to be/can be done for Nvidia? |
|
12th May 2015, 21:47 | #60 | Link | |
Registered User
Join Date: Sep 2010
Posts: 321
|
Quote:
I'm going to test my 55VT60 Panasonic plasma next and see the results of that. Edit: Just tested my 55VT60 and same thing, both appear to smooth out both gradient patterns that I use when going FSE (I also used the one in the OP). Using 13.12 AMD drivers on my 7870 while in RGB Limited. Maybe someone can answer something for me though: after I have verified that 10-bit is actually working on my setup, do I have to leave the calibration options the same, or can I go back to how it was set up, which is saying that my display is calibrated to BT.709, with "disable GPU gamma ramps" unchecked?
__________________
MPC-HC/MPC-BE, Lav Filters, MadVR CPU: AMD Ryzen 5 1600, Video: AMD Radeon RX Vega 56 -> TCL S405 55", Audio: Audio-Technica M50S Last edited by fairchild; 12th May 2015 at 22:04. |
|