25th September 2014, 02:17   #91
huhn
Registered User
Join Date: Oct 2012
Posts: 7,926
Quote:
Originally Posted by Zachs View Post
NVIDIA GPUs will output 12-bit via HDMI v1.3 and above if set to 16-bit in D3D10. So with the driver dithering 16-bit to 12, it is still better than 10-bit output. Can't remember where I read it from though (I'll update this post when I have time to dig up my history to see where I read this from).
Yeah, and you can force 12-bit output with AMD too. That's not the point.
The point is: is it actually used properly?

Thanks to MPDN we can test this now. 10-bit displays are going to be totally mainstream in the future.

We don't know what is happening in the GPU. Is 12 bit, dithered/rounded by the GPU, better for our 8-bit displays (which have to get this down to 8 bit too, with their own dithering/rounding) than untouched 8-bit RGB dithered by the renderer?
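To make the dithering-vs-rounding part of that question concrete, here is a rough sketch in Python. It has nothing to do with what the NVIDIA/AMD drivers or any TV actually do internally; it just quantizes a smooth ramp to 8 bit once by plain rounding and once with random dither added before rounding, so you can see the trade-off (banding vs. noise) that the last step in the chain gets to decide.

Code:
# Sketch only: not what any driver or TV actually does internally.
import numpy as np

rng = np.random.default_rng(0)

# smooth gradient with 16-bit resolution, stored as floats in 0..1
src = np.linspace(0.0, 1.0, 1 << 16)

def quantize(x, bits, dither=False):
    """Reduce x (floats in 0..1) to the given bit depth."""
    levels = (1 << bits) - 1
    noise = rng.uniform(-0.5, 0.5, x.shape) if dither else 0.0
    return np.clip(np.round(x * levels + noise), 0, levels) / levels

rounded  = quantize(src, 8)               # plain rounding: visible banding
dithered = quantize(src, 8, dither=True)  # dithered: banding traded for noise

print("mean abs error, rounded :", np.abs(rounded  - src).mean())
print("mean abs error, dithered:", np.abs(dithered - src).mean())
# a flat patch shows the point: rounding snaps every pixel to the same level,
# while the *average* of a dithered patch still tracks the source value
print("flat 0.5004 patch, rounded :", quantize(np.full(10000, 0.5004), 8).mean())
print("flat 0.5004 patch, dithered:", quantize(np.full(10000, 0.5004), 8, dither=True).mean())

Rounding has the smaller per-pixel error, but the dithered version is what hides banding. Which box in the chain makes that trade, and how well, is exactly what we can't see from outside.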

A lot of displays convert the input to 4:2:2 YCbCr. How do they do this?
Is the 12 bit dithered at all? Is it better to send 10 bit dithered from the renderer, because the display can use that directly?
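For anyone wondering what that conversion even involves, here is another sketch. It assumes BT.709 coefficients and naive averaging of horizontal chroma pairs; no claim that any GPU or TV does it this way. Whether the real hardware dithers, rounds or filters at this step is exactly the open question.

Code:
# Sketch of a naive 4:4:4 RGB -> 4:2:2 YCbCr conversion (BT.709 assumed).
import numpy as np

def rgb_to_ycbcr_422(rgb):
    """rgb: H x W x 3 floats in 0..1, W even.
    Returns (Y, Cb, Cr) with chroma at half horizontal resolution."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y  = 0.2126 * r + 0.7152 * g + 0.0722 * b   # BT.709 luma
    cb = (b - y) / 1.8556                        # scaled to roughly -0.5..0.5
    cr = (r - y) / 1.5748
    # naive 4:2:2 subsampling: average each horizontal pair of chroma samples
    cb422 = (cb[:, 0::2] + cb[:, 1::2]) / 2
    cr422 = (cr[:, 0::2] + cr[:, 1::2]) / 2
    return y, cb422, cr422

# example: a 2x4 test patch
patch = np.random.default_rng(1).random((2, 4, 3))
y, cb, cr = rgb_to_ycbcr_422(patch)
print(y.shape, cb.shape, cr.shape)   # (2, 4) (2, 2) (2, 2)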

I can send 12-bit 4:4:4 RGB to my TV right now, and it accepts it without downsampling to 4:2:2 YCbCr. The display has to turn that into 8-bit RGB, because this display is for sure not 12 bit and 99.9% not 10 bit. So why should the algorithm in my TV be better than your dithering?