blu3wh0  |  23rd September 2018, 23:02  |  #52673
Quote:
Originally Posted by madshi
It's too bad that your LG doesn't seem to have an option to use the same brightness for SDR?
Yup, I just went ahead and measured peak brightness for both SDR and HDR modes. I was completely surprised that maxing out OLED Light and Contrast still maintained a good default calibration, which could easily be refined like my normal SDR setup for 120 nits. Anyway, HDR measured around 750 nits calibrated, while SDR was only 413 nits pre-calibration. That difference is significant enough that I would choose HDR all the time. I also did a quick comparison between HDR converted to SDR in BT.2020 (which I'm surprised worked in SDR at all, although PC mode on my TV didn't like it, so I had to switch to the one I use for HDR) and dynamically tone-mapped HDR, and they were pretty darn close aside from the brightness. Seen that way, my TV's tone mapping is actually pretty decent.
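
For a rough sense of the gap (plain Python, using the numbers I measured above; the values are obviously specific to my panel and calibration, not general LG figures):

Code:
import math

# Measured peaks on my set (see above) plus my usual SDR calibration target.
hdr_peak_nits = 750.0
sdr_peak_nits = 413.0
sdr_target_nits = 120.0

# Express the differences in photographic stops (each stop doubles luminance).
print(f"HDR vs SDR peak:       {math.log2(hdr_peak_nits / sdr_peak_nits):.2f} stops")   # ~0.86
print(f"HDR vs 120-nit target: {math.log2(hdr_peak_nits / sdr_target_nits):.2f} stops") # ~2.64

So HDR mode buys me almost a full stop of headroom over my SDR mode's peak, and well over two and a half stops over my usual 120-nit target.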

Quote:
Originally Posted by madshi
It's already there! See "measure each frame's peak luminance".
Great! Unfortunately, it doesn't seem like it would do much to help when outputting in HDR, since it sounds like it'll fight with the TV's own dynamic tone-mapping. BTW, pixel-shader tone-mapped HDR output is broken in the latest version: the colors are too dark and red-tinted.
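
Just to illustrate what I understand "measure each frame's peak luminance" to boil down to conceptually, here's a rough numpy sketch (my own illustration, not madVR's actual code): scan each decoded frame for its brightest pixel and let that drive the tone-mapping curve instead of the static mastering metadata.

Code:
import numpy as np

# BT.2020 luma coefficients (standard values).
KR, KG, KB = 0.2627, 0.6780, 0.0593

def frame_peak_nits(rgb_linear: np.ndarray) -> float:
    """Peak luminance of one frame.

    rgb_linear: H x W x 3 array of linear-light RGB values in nits
    (i.e. already converted out of the PQ transfer function).
    """
    luminance = KR * rgb_linear[..., 0] + KG * rgb_linear[..., 1] + KB * rgb_linear[..., 2]
    return float(luminance.max())

# A dynamic tone-mapper would then compress each frame against its own
# measured peak rather than the stream's static mastering metadata.

Since the TV applies its own dynamic mapping on top of whatever it receives, the two would end up compressing the same highlights twice, which is why I don't expect it to help much when outputting HDR.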

Quote:
Originally Posted by madshi
Which display output bitdepth exactly?
The bit depth the video card outputs. I want to configure it so that if Nvidia outputs 12-bit, madVR is set to a 10-bit display bitdepth, and if Nvidia outputs 8-bit, such as at 60Hz, madVR sets the display to 8-bit. Right now I'm approximating it with refresh rate, but sometimes Nvidia screws up and outputs 8-bit at 23Hz.
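
What I'm after is basically this decision rule (hypothetical Python sketch; the function name is made up and madVR doesn't expose such a hook today, this is just the logic I want):

Code:
def madvr_display_bitdepth(gpu_output_bitdepth: int) -> int:
    """Pick the bitdepth madVR should dither to, based on what the
    video card is actually sending to the display.

    Hypothetical helper: today I approximate this with refresh-rate-based
    profile rules, which breaks when Nvidia unexpectedly drops to 8-bit at 23Hz.
    """
    if gpu_output_bitdepth >= 10:   # e.g. 12-bit at 23Hz
        return 10                   # feed the display 10-bit
    return 8                        # e.g. 8-bit at 60Hz

# madvr_display_bitdepth(12) -> 10
# madvr_display_bitdepth(8)  -> 8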