10th March 2018, 00:57 | #49538
Asmodian
Quote:
Originally Posted by mclingo
Another oddity: if I put my gfx card in 8-bit 4:2:2 and madVR in 10-bit, it looks OK and HDR works.
When using 8-bit output from the gfx card, always set madVR to 8-bit. You do not want madVR dithering to 10-bit and then the GPU dithering that down to 8-bit; it is better to dither to 8-bit once.
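
A toy numeric sketch of that point (plain random dither on a grayscale ramp in Python/NumPy, not madVR's actual dithering algorithm; the function name and sample count are made up for illustration): quantizing straight to 8-bit accumulates less error than going through an intermediate 10-bit step.

Code:
import numpy as np

rng = np.random.default_rng(0)
signal = np.linspace(0.0, 1.0, 1_000_000)  # high-precision source ramp in [0, 1]

def dither_quantize(x, bits, rng):
    """Quantize x to the given bit depth with simple +/- half-LSB random dither."""
    levels = 2 ** bits - 1
    noise = rng.uniform(-0.5, 0.5, x.shape)
    return np.clip(np.round(x * levels + noise), 0, levels) / levels

once = dither_quantize(signal, 8, rng)                              # dither straight to 8-bit
twice = dither_quantize(dither_quantize(signal, 10, rng), 8, rng)   # 10-bit first, then 8-bit

print("RMS error, direct 8-bit dither:", np.sqrt(np.mean((once - signal) ** 2)))
print("RMS error, 10-bit then 8-bit  :", np.sqrt(np.mean((twice - signal) ** 2)))

On a run like this the two-step path comes out slightly worse, which is the whole argument for dithering only once, at the depth the display actually receives.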

Why not set the gfx card to 8-bit RGB? At least on Nvidia, HDR still works with 8-bit RGB output, and the quality would be better than 8-bit 4:2:2 with madVR set to 10-bit.

It sounds like your TV is bad at processing 10-bit input.
__________________
madVR options explained