Old 12th February 2019, 09:44   #54679  |  Link
madjock
Registered User
Join Date: May 2018
Posts: 259
Quote:
Originally Posted by RXP View Post
I've just got my PC hooked up to my TV and am enjoying using madVR, especially on the JVC RS49.

I'd like to use MadVR's tone mapping with my LG C6. I use pixel shaders and pass the image through as HDR to benefit from the higher peak brightness of my TV in HDR.

My question is: given the Nvidia/Windows bug where bogus metadata of 1000/20 is sent, will LG's internal tone mapper throw off madVR's results? If I set the peak nits of my display to 600, LG's tone mapper should be bypassed, since madVR will report all content as 600 nits or below. But if the OS is saying it's 1000 nits, won't the TV be applying its own curve on top?
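The concern above can be sketched numerically. Below is a purely hypothetical soft-knee tone-map model (not LG's actual curve, whose parameters aren't public): if the TV believes the content peaks at its own display peak, it passes the signal through; if bogus metadata tells it the content peaks at 1000 nits, it compresses highlights that madVR had already mapped into range.

```python
def tv_tone_map(nits, content_peak, display_peak=600.0, knee_frac=0.75):
    """Illustrative soft-knee tone mapper (hypothetical model).

    Linear up to a knee, then compresses [knee, content_peak]
    into [knee, display_peak].
    """
    knee = knee_frac * display_peak
    if content_peak <= display_peak or nits <= knee:
        # Content already fits the display: pass through (clipped).
        return min(nits, display_peak)
    # Compress the highlight range into what the panel can show.
    t = (nits - knee) / (content_peak - knee)
    return knee + t * (display_peak - knee)

# madVR has mapped everything to <= 600 nits; metadata agrees -> untouched.
print(tv_tone_map(600.0, content_peak=600.0))   # 600.0

# Same pixel, but bogus 1000-nit metadata -> TV darkens the highlight.
print(tv_tone_map(600.0, content_peak=1000.0))  # ~490.9
```

In this toy model a 600-nit highlight that madVR intended to hit the panel's peak gets re-compressed to roughly 491 nits, which is exactly the "double tone mapping" the quoted post is worried about.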
You could use driver 385.28, which still works (I'm using it now). But I have to say this metadata issue is news to me; I've tried to find more information and have only turned up the odd quote.

Yet for some reason none of the guides on using madVR and the like say to use this or that driver for HDR (because of the metadata bug). So is it really a well-known problem, or is it just getting talked about again?

https://forums.geforce.com/default/t...data/?offset=2

Last edited by madjock; 12th February 2019 at 09:51.