Quote:
Originally Posted by mclingo
another oddity. If I put my gfx card in 8-bit 4:2:2 and madVR in 10-bit it looks ok and HDR works
When using 8-bit output from the gfx card, always set madVR to 8-bit as well. You do not want madVR dithering to 10-bit and the GPU then dithering that down to 8-bit; it is better to dither to 8-bit once.
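The "dither once" point can be illustrated with a quick sketch. This is a hypothetical pure-Python model (triangular-noise dither on random values, not madVR's actual algorithm), comparing the error from one dither pass straight to 8-bit against a 10-bit pass followed by an 8-bit pass:

```python
import random

random.seed(0)

def dither(values, bits):
    """Quantize floats in [0, 1] to an n-bit grid with TPDF dither."""
    levels = (1 << bits) - 1
    out = []
    for v in values:
        noise = random.random() - random.random()  # triangular, +/- 1 LSB
        q = round(v * levels + noise) / levels
        out.append(min(max(q, 0.0), 1.0))
    return out

def rms_error(a, b):
    return (sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)) ** 0.5

src = [random.random() for _ in range(100_000)]  # stand-in "high precision" signal

once = dither(src, 8)                # single dither straight to 8-bit
twice = dither(dither(src, 10), 8)   # dither to 10-bit, then again to 8-bit

print(f"single-step RMS error: {rms_error(src, once):.6f}")
print(f"two-step RMS error:    {rms_error(src, twice):.6f}")
```

In this toy model the two-step chain always ends up slightly noisier, because the second pass adds its own dither noise on top of the error the first pass already introduced. The difference is small, but there is no upside to dithering twice.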
Why not set the gfx card to RGB 8-bit instead? At least on Nvidia, HDR still works with 8-bit RGB output, and the quality would be better than leaving madVR set to 10-bit.
It sounds like your TV is bad at processing 10-bit input.