21st March 2018, 20:45   #49694
Warner306
Registered User
Join Date: Dec 2014
Posts: 1,127
Quote:
Originally Posted by huhn
So why should I bother sending 10-bit if I know for sure my TV is 8-bit + FRC (like nearly every TV out there, except what LG claims for its OLEDs, and there is no screen with more banding problems...)? Why should I send 10-bit if I know for sure it doesn't matter for image quality? So I send 10-bit just because the number is higher? Is 10-bit even better if it gets dithered again?

Seriously, why can't people simply use their eyes to judge it?

If you want to know what a GPU sends, it's easy with an AMD card: it always sends what you select, and by default that is 10-bit if the display supports it. With Nvidia it's not that easy; in the past the bit-depth option was ignored (generally not a totally bad idea, if you ask me) and the output depth was based on what was used for presentation.
I outlined why you should not use 10-bit. It covers all of that.
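
On the "is 10-bit even better if it gets dithered again?" point, here is a rough sketch of why a dithered 8-bit output holds up on a smooth gradient. This is only an illustration of the general idea, not madVR's actual dither code, and the gradient and noise choices are mine:

Code:
import numpy as np

rng = np.random.default_rng(0)

# A smooth horizontal gradient in 10-bit code values (0..1023).
gradient_10bit = np.linspace(0.0, 1023.0, 1920)
target_8bit = gradient_10bit / 4.0          # ideal (fractional) 8-bit levels

# Plain rounding to 8-bit: every ~4 neighbouring 10-bit codes collapse
# into the same 8-bit code, which is what shows up as banding.
plain = np.clip(np.round(target_8bit), 0, 255)

# Dithered 8-bit: add +/- half an 8-bit step of noise before rounding
# (madVR offers fancier ordered/error-diffusion dither; the idea is the same).
def dither_once():
    noise = rng.uniform(-0.5, 0.5, target_8bit.shape)
    return np.clip(np.round(target_8bit + noise), 0, 255)

# Average many dithered frames, roughly what the eye does over area/time.
avg = np.mean([dither_once() for _ in range(64)], axis=0)

print("mean abs error vs the ideal level (in 8-bit steps):")
print("  plain rounding  :", np.mean(np.abs(plain - target_8bit)))  # ~0.25, structured bands
print("  dither, averaged:", np.mean(np.abs(avg - target_8bit)))    # much smaller, just noise

Without dither the error shows up as structured bands; with dither it turns into fine noise that averages back to the original level, which is why a good 8-bit + dither output is hard to tell apart from 10-bit.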

And, 10-bit RGB > 8-bit RGB > 10-bit YCbCr 4:2:2 > 10-bit YCbCr 4:2:0. That is a direct quote from madshi. He didn't clarify the quality difference between each setting.
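
For some context on that ranking (my own back-of-the-envelope numbers, not something madshi posted): just counting the raw bits carried per pixel already puts the formats in the same order, because chroma subsampling throws away samples that a higher bit depth can't get back. And as far as I understand, madVR renders in RGB anyway, so any YCbCr output adds a conversion step on top.

Code:
def bits_per_pixel(bit_depth: int, subsampling: str) -> float:
    """Average bits per pixel over the link for RGB/YCbCr at a given subsampling."""
    # Chroma samples per luma sample for each scheme (RGB counts like 4:4:4).
    chroma_fraction = {"4:4:4": 1.0, "4:2:2": 0.5, "4:2:0": 0.25}[subsampling]
    # One full-resolution component plus two chroma components, each scaled.
    return bit_depth * (1 + 2 * chroma_fraction)

for label, depth, sub in [
    ("10-bit RGB",         10, "4:4:4"),
    (" 8-bit RGB",          8, "4:4:4"),
    ("10-bit YCbCr 4:2:2", 10, "4:2:2"),
    ("10-bit YCbCr 4:2:0", 10, "4:2:0"),
]:
    print(f"{label}: {bits_per_pixel(depth, sub):.0f} bpp")
# -> 30, 24, 20, 15 bpp, matching the order of the quote above.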

Last edited by Warner306; 21st March 2018 at 20:47.