Quote:
Originally Posted by MS-DOS
To everyone:
AMD cards dither the output by default, but you can disable it:
http://www.monitortests.com/forum/Th...d=3314#pid3314
HDMI_DisableDither for HDMI. Reboot after.
Only that can explain why I see a difference between 8-bit and 10-bit output on my 6-bit DELL U2212HM connected via DVI.
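Side note for anyone else who wants to try the tweak above: the linked thread has the exact details, but here is a rough Python sketch of how such a value could be written with the standard winreg module. The HDMI_DisableDither name comes from the quote; the key path, the 0000 subkey index, and the REG_BINARY 01 data are assumptions on my part, so verify them against the thread (and back up the registry) before running anything like this, and reboot afterwards as noted.

Code:
# Rough sketch only -- run from an elevated (administrator) Python, and
# double-check every detail against the linked monitortests.com thread first.
import winreg

# Display-adapter class key; the trailing "0000" index is an assumption and
# may be 0001, 0002, ... on systems with more than one adapter.
ADAPTER_KEY = (r"SYSTEM\CurrentControlSet\Control\Class"
               r"\{4d36e968-e325-11ce-bfc1-08002be10318}\0000")

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, ADAPTER_KEY, 0,
                    winreg.KEY_SET_VALUE) as key:
    # Value name from the quoted post; REG_BINARY 01 is my guess at the
    # data type/contents -- confirm in the linked thread.
    winreg.SetValueEx(key, "HDMI_DisableDither", 0, winreg.REG_BINARY, b"\x01")

print("Value written -- reboot so the driver picks it up.")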
After disabling AMD dithering, I discovered that my TV is in fact 8-bit: with 10-bit output, the gradient only looks smooth when dithering is enabled. So should I still use 10-bit output, or just stay with 8-bit?
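For reference, this is roughly the kind of test image I'm judging the banding on -- just a sketch using NumPy and Pillow (my own tool choice, nothing driver-related). It writes an 8-bit horizontal grey ramp; testing a real 10-bit gradient needs a higher-than-8-bit source image and a viewer that can actually output it, so treat this as the simple version.

Code:
# Generate a full-width 8-bit grey ramp to eyeball banding with (sketch only;
# needs numpy and Pillow installed).
import numpy as np
from PIL import Image

WIDTH, HEIGHT = 1920, 1080  # set to your screen resolution

# One row going 0..255 left to right, repeated down the screen.
row = np.linspace(0, 255, WIDTH).astype(np.uint8)
ramp = np.tile(row, (HEIGHT, 1))

Image.fromarray(ramp, mode="L").save("ramp_8bit.png")

# Viewed full screen with GPU dithering off: a 6-bit panel shows coarse bands
# (roughly every WIDTH/64 pixels), an 8-bit panel much finer ones. If the ramp
# only looks smooth with dithering enabled, the smoothness is coming from the
# GPU, not the panel.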