13th October 2017, 04:36  #46512
austinminton
Finally managed to dump a few of my UHDs, so I now have a bigger sample set of HDR files to play with in madVR. My main question is what the right output setting in the NVIDIA control panel is. I have been using RGB 4K@60 8-bit until now, but I wonder if that's wrong for playing back 10-bit files. When madVR switches to 4K@23, does it also switch the output to 10-bit? I have looked through all my TV/AVR settings and unfortunately can't find anything that confirms the bit depth and color format actually being received.

I have set 10-bit and above in madVR for the panel (Sony 75Z9D). Should I set the NVIDIA control panel to YCbCr 4:2:2 and 10-bit instead?

I also have a file that is 4K@60, 4:2:0, 10-bit HDR (yes, a 60 Hz UHD). This one really confuses me, since I know RGB 4K@60 10-bit won't fit over HDMI 2.0. So somewhere it must be getting converted to either 8-bit or lower chroma? It plays fine and looks very good to my eyes, but I 'need' to know if an unnecessary conversion is happening somewhere. The madVR OSD says P010, 10-bit, 4:2:0, but I'm not sure that's what the GPU is actually sending to the panel.
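To convince myself of what can and can't fit, I did the HDMI 2.0 bandwidth math in a quick Python script. The timing figures and the 600 MHz TMDS clock limit are just my own reading of the spec, so treat this as a sanity check rather than gospel:

Code:
# Rough check of what fits through HDMI 2.0 (600 MHz max TMDS clock).
# My assumptions: standard CTA-861 4K timing (4400x2250 total pixels
# including blanking), the TMDS clock scales with bit depth for
# RGB/4:4:4, 4:2:2 always travels in a 12-bit container at the 8-bit
# clock, and 4:2:0 packs two pixels per TMDS character (halving it).

HDMI20_MAX_TMDS_MHZ = 600.0
H_TOTAL, V_TOTAL = 4400, 2250  # 3840x2160 plus blanking (CTA-861)

def tmds_clock_mhz(refresh_hz, bits, chroma):
    clock = H_TOTAL * V_TOTAL * refresh_hz / 1e6  # 8-bit RGB baseline
    if chroma == "4:2:0":
        clock /= 2  # two pixels share one TMDS character
    if chroma != "4:2:2":  # 4:2:2 clock is independent of bit depth
        clock *= bits / 8
    return clock

modes = [(23.976, 10, "RGB"), (60, 8, "RGB"), (60, 10, "RGB"),
         (60, 10, "4:2:2"), (60, 10, "4:2:0")]
for refresh_hz, bits, chroma in modes:
    clk = tmds_clock_mhz(refresh_hz, bits, chroma)
    verdict = "fits" if clk <= HDMI20_MAX_TMDS_MHZ else "does NOT fit"
    print(f"4K@{refresh_hz:g} {chroma} {bits}-bit: {clk:.1f} MHz ({verdict})")

If those numbers are right, RGB at 4K@60 tops out at 8-bit (594 MHz), while 10-bit at 60 Hz only fits as 4:2:2 or 4:2:0, so the GPU has to be converting to one of those somewhere; at 4K@23 even RGB 10-bit fits comfortably.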

Card is an NVIDIA GTX 1080 (driver 385.41), latest madVR + LAV Filters. I am using FSE and madVR's automatic display mode switching for 4K@23, 4K@24 and 4K@60, with DXVA copy-back in LAV.
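In case it matters, the list I entered on madVR's display modes page looks like this (quoting the strings from memory, so double-check the exact syntax):

Code:
2160p23, 2160p24, 2160p60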

Also, I'd like to say that I think my PC with madVR produces a much better picture than my UHD player does. Thanks madshi.
