11th May 2015, 22:50   #8
luk008
Quote:
Originally Posted by MS-DOS
To everyone:

AMD cards dither the output by default, but you can disable it:

http://www.monitortests.com/forum/Th...d=3314#pid3314


HDMI_DisableDither is the one for HDMI. Reboot after changing it.

Only that can explain why I see a difference between 8-bit and 10-bit output on my 6-bit DELL U2212HM connected over DVI.
After disabling AMD dithering, I discovered that my TV is in fact 8-bit. With dithering enabled, I can see a smooth gradient at 10-bit output. So should I still use 10 bits, or just stay with 8?
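
For anyone trying the tweak quoted above, here is a minimal sketch of setting that value with Python's winreg module. The driver key path (the numbered "0000" subkey under the standard display-adapter class GUID) and the REG_BINARY data are my assumptions, not confirmed by the quote; check the linked monitortests thread for the exact key and data your driver expects, run it as Administrator, and reboot afterwards.

Code:
# Minimal sketch: set the quoted HDMI_DisableDither value via Python's winreg.
# Assumptions: the "0000" subkey under the standard display-adapter class GUID
# holds the active AMD driver settings, and a REG_BINARY value of 0x01 disables
# dithering on HDMI outputs. Verify against the linked thread before using.
import winreg

DRIVER_KEY = (r"SYSTEM\CurrentControlSet\Control\Class"
              r"\{4d36e968-e325-11ce-bfc1-08002be10318}\0000")

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, DRIVER_KEY, 0,
                    winreg.KEY_SET_VALUE) as key:
    winreg.SetValueEx(key, "HDMI_DisableDither", 0, winreg.REG_BINARY, b"\x01")

To undo it, delete the value with winreg.DeleteValue(key, "HDMI_DisableDither") and reboot again.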