Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion.

Old 28th January 2020, 18:12   #58441  |  Link
mclingo
Registered User
 
Join Date: Aug 2016
Posts: 1,348
Hi, in beta 113's tone mapping, what is the difference between peak luminance and dynamic target nits? I'm wondering what the best settings are for my 400-nit OLED for both of these, to maximise small bright objects in HDR movies. I know they're not going to be super bright, but when I recently watched The Expanse in HDR on Amazon it looked way better than any movie I've watched through madVR. I know I can't do an A/B test for this material, but I'd like to know what the best settings should be.
__________________
LG OLED EF950-YAM RX-V685-RYZEN 3600 - 16GBRAM - WIN10 RX 5700 - https://www.videohelp.com/software/madVR/old-versions
Old 28th January 2020, 22:50   #58442  |  Link
SamuriHL
Registered User
 
 
Join Date: May 2004
Posts: 5,351
Your real display nits should be set to 400. Dynamic target is NOT what people seem to think it is. I would start at 75 and, if you find that too dark or not to your liking, drop it to 50. I have mine set to something like 150 right now. That particular setting is down to personal taste...
__________________
HTPC: Windows 11, AMD 5900X, RTX 3080, Pioneer Elite VSX-LX303, LG G2 77" OLED
Old 28th January 2020, 23:17   #58443  |  Link
mclingo
Registered User
 
Join Date: Aug 2016
Posts: 1,348
Thanks.

I tried 50, 75, 100, 150 and 400, and I genuinely could see no difference. You said if 75 is too dark to drop it to 50; does that mean the lower the value, the brighter the highlights? What exactly is this setting doing?

edit: OK, I see it now. I had to go to either extreme to really see the difference though; I compared 0 to 500, with 500 being much duller, more like SDR.

If you have an OLED, surely you want the brightest image possible, so surely you'd set this to 0?

edit 2: OK, so the lower it is, the more highlight detail you lose. There's a balance you have to find; I settled on 25, believe it or not.
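The tradeoff described above (a lower target brightens the picture but clips highlight detail) can be sketched with a toy curve. This is not madVR's actual tone-mapping math; the hard clip and linear squeeze in `tone_map` are illustrative assumptions only:

```python
def tone_map(scene_nits, target, display_peak=400.0):
    """Compress scene nits into the display's range (toy model).

    Everything up to `target` is squeezed linearly into
    [0, display_peak]; anything above `target` clips to peak white.
    """
    if scene_nits >= target:
        return display_peak                      # detail above target is lost
    return display_peak * (scene_nits / target)  # linear squeeze

for target in (100, 400, 1000):
    face = tone_map(250, target)   # a 250-nit face
    spark = tone_map(800, target)  # an 800-nit specular highlight
    print(f"target={target:4}: face={face:6.1f} nits, highlight={spark:6.1f} nits")
```

With target=100 both the face and the highlight clip to peak white (bright, but no highlight detail); with target=1000 the highlight keeps its detail but the whole image dims.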
__________________
LG OLED EF950-YAM RX-V685-RYZEN 3600 - 16GBRAM - WIN10 RX 5700 - https://www.videohelp.com/software/madVR/old-versions

Last edited by mclingo; 28th January 2020 at 23:41.
Old 29th January 2020, 00:35   #58444  |  Link
SamuriHL
Registered User
 
 
Join Date: May 2004
Posts: 5,351
Yeah, exactly what you said: you lose detail, so it's a matter of preference to find the balance you're looking for. Fun game, isn't it?
__________________
HTPC: Windows 11, AMD 5900X, RTX 3080, Pioneer Elite VSX-LX303, LG G2 77" OLED
Old 29th January 2020, 14:12   #58445  |  Link
mclingo
Registered User
 
Join Date: Aug 2016
Posts: 1,348
Indeed. I spent last night running through some 4K demos and movies with my new settings; it's never looked so good. To think I'd been languishing with dim HDR performance for so long, thinking my TV was pants. This doesn't say much for LG's own tone mapping, though, as madVR now looks loads better than passthrough HDR. Well happy with the results.
__________________
LG OLED EF950-YAM RX-V685-RYZEN 3600 - 16GBRAM - WIN10 RX 5700 - https://www.videohelp.com/software/madVR/old-versions
Old 29th January 2020, 18:05   #58446  |  Link
SamuriHL
Registered User
 
 
Join Date: May 2004
Posts: 5,351
LG's DTM is an acquired taste. One that, sadly, I never acquired. LOL. I VASTLY prefer madVR's tone mapping to LG's. We can't get rid of LG's entirely on our OLEDs, but we can limit its impact by tone mapping to 700 nits or less (or, in your case, 400). The work that's been done is tremendous.
__________________
HTPC: Windows 11, AMD 5900X, RTX 3080, Pioneer Elite VSX-LX303, LG G2 77" OLED
Old 31st January 2020, 06:34   #58447  |  Link
x7007
Registered User
 
Join Date: Apr 2013
Posts: 315
Does anyone have working HDR with an AMD 5700 XT, in movies, games, and YouTube/Chrome 4K HDR? I have a GTX 1080 now and there are no issues, but I will be changing GPU for an FPS improvement.

Someone said there is an issue with 5700 XT HDR while the Radeon VII (or something like that) works fine. I am waiting for big Navi; I hope there won't be issues.
Old 31st January 2020, 10:40   #58448  |  Link
WuNgUn
Registered User
 
Join Date: Dec 2019
Posts: 94
Big Navi isn't expected anytime soon, is it? I thought it was due around summertime?
Old 31st January 2020, 14:05   #58449  |  Link
mclingo
Registered User
 
Join Date: Aug 2016
Posts: 1,348
Quote:
Originally Posted by x7007 View Post
Anyone with AMD 5700XT with working HDR? in Movie Games and Youtube/Chrome 4K HDR? I have the 1080GTX now and there are no issues. but will I will change gpu for FPS improvement.

Because someone said there is an issue with 5700XT HDR and VEGA IIV or something works fine. I am waiting for the big Navi, I hope there won't be issues.

HDR shouldn't be working properly for anyone, in that it's not sending BT.2020 and saturation is low if the app uses AMD's API. The workaround is to turn on Windows HDR first; this seems to solve the issue for madVR playback, but I haven't tried it with games, as nobody has been able to give me an example of a game that uses AMD's private API for me to test.

I can't see big Navi not having this issue, but I guess it's 50/50. If you don't play 3D movies, I'd consider an NVIDIA card right now.
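The undersaturation described above is what you would expect when BT.2020-encoded video is shown with BT.709 primaries assumed. A rough numeric sketch (the matrix is the standard linear-light BT.709-to-BT.2020 conversion from ITU-R BT.2087; the display-interpretation step is simplified):

```python
# HDR video is encoded with BT.2020 primaries. If the GPU fails to flag
# BT.2020, the display assumes BT.709 and renders the wide-gamut values
# undersaturated. Linear-light conversion matrix per ITU-R BT.2087:
BT709_TO_BT2020 = [
    [0.6274, 0.3293, 0.0433],
    [0.0691, 0.9195, 0.0114],
    [0.0164, 0.0880, 0.8956],
]

def to_bt2020(rgb709):
    """Re-express a linear BT.709 RGB triple in BT.2020 primaries."""
    return [sum(m * c for m, c in zip(row, rgb709)) for row in BT709_TO_BT2020]

red_in_2020 = to_bt2020([1.0, 0.0, 0.0])   # pure BT.709 red
print(red_in_2020)  # ~[0.627, 0.069, 0.016]
# A display that wrongly interprets these BT.2020-coded values as BT.709
# shows this muted mix instead of a fully saturated red.
```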
__________________
LG OLED EF950-YAM RX-V685-RYZEN 3600 - 16GBRAM - WIN10 RX 5700 - https://www.videohelp.com/software/madVR/old-versions
Old 31st January 2020, 14:23   #58450  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
Maybe just wait for the GPU. It's not even clear if it will be a consumer card or a professional card, or if we will see one at all.

The 5700 XT eats 225 watts; a bigger card on the same node would be insane.
Old 1st February 2020, 11:27   #58451  |  Link
eddman
Registered User
 
Join Date: Dec 2009
Posts: 77
How do I mirror a video with madVR? I can do it with EVR Custom by pressing Alt+6, but it doesn't work with madVR.
Old 1st February 2020, 19:01   #58452  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,406
madVR does not support mirroring, but it does support rotation:
Ctrl+Shift+Alt+Right Arrow | clockwise
Ctrl+Shift+Alt+Left Arrow | counterclockwise

Why do you want to mirror the video?
__________________
madVR options explained
Old 1st February 2020, 20:12   #58453  |  Link
tp4tissue
Registered User
 
 
Join Date: May 2013
Posts: 706
Quote:
Originally Posted by SamuriHL View Post
LG's DTM is an acquired taste. One that sadly I never acquired. LOL I VASTLY prefer madvr's tone mapping to LG's. We can't get rid of LG's entirely on our OLED's, but, we can limit its impact by tone mapping to 700 nits or less (or in your case 400). The work that's been done is tremendous.
Don't worry, OLED is obsolete THIS YEAR.

Hisense LMCL (dual-layer LCD) will be available in the USA in Q3 2020.

Everyone said it was impossible, then I was like, naw dawg, 2 years tops.

And here we are only a year later.
__________________
Ghetto | 2500k 5Ghz
Old 1st February 2020, 21:25   #58454  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,406
Dual-layer LCD hardly makes OLED obsolete.

Self-emissive pixels are the future of displays; dual-layer LCD is a temporary stopgap. OLED is also a stopgap, but at least it is self-emissive. Didn't we see a dual-layer LCD at last year's CES? I don't think anyone said it was impossible, just not very bright, still not perfect blacks, and they have the same LCD issues with viewing angles. A 2-million-zone FALD is way better than a 1000-zone FALD, but it is still not an ideal display technology.
__________________
madVR options explained
Old 1st February 2020, 21:56   #58455  |  Link
SamuriHL
Registered User
 
 
Join Date: May 2004
Posts: 5,351
When we see a 4000-nit self-emissive display, then we can talk about technology becoming obsolete.
__________________
HTPC: Windows 11, AMD 5900X, RTX 3080, Pioneer Elite VSX-LX303, LG G2 77" OLED
Old 1st February 2020, 22:37   #58456  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,344
Quote:
Originally Posted by Asmodian View Post
Dual layer LCD hardly makes OLED obsolete.

Self emissive pixels is the future of displays, dual layer LCD is a temporary stopgap. OLED is also a stopgap but at least it is self emissive. Didn't we see a dual layer LCD at last year's CES? I don't think anyone said it was impossible, just not very bright, still not perfect blacks, and they have the same LCD issues with viewing angles. 2 million zone FALD is way better than 1000 zone FALD but it is still not an ideal display technology.
Personally, I would rather buy a dual-layer LCD than an OLED. OLED's downsides are just so annoying when they start hitting. Never mind that I regularly use my TV in average lighting conditions, where higher brightness helps.

Perfect blacks? Well, sure, nothing but self-emissive will ever reach 0, but blacks in the range of 0.00003 nits and a static contrast of 1,000,000:1 (1000:1 squared, due to two LCD filters) is quite substantial, and with a significantly higher peak brightness than OLED. And no burn-in or ABL.

Viewing angles are an old LCD problem and there is a lot of technology to work around them, so I'm not all that worried. And after all, how far off-center do viewers usually sit?

I do hope 2020 might see the first production TV with that tech, but it does not appear that the major TV makers are quite on board with it yet. Maybe they are afraid it would cannibalize their investments in other tech too much. (Only Hisense and Panasonic have openly confirmed that they are working on this technology.)
But that announced Hisense TV at least looks pretty good on paper, with the black values and contrast cited above, full DCI-P3 coverage, and 1000 nits peak brightness. Well, we'll see later this year whether it holds up or gets delayed.
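The squared-contrast arithmetic above can be checked back-of-envelope (the per-layer contrast and peak are the figures cited in the post; the black level here is simply peak divided by contrast, ignoring leakage and processing):

```python
# Two LCD layers in series multiply their contrast ratios:
panel_contrast = 1000          # assumed native contrast of one layer
stacked = panel_contrast ** 2  # 1000:1 squared -> 1,000,000:1

peak_nits = 1000               # peak brightness cited for the Hisense set
black_nits = peak_nits / stacked
print(stacked, black_nits)     # 1000000 0.001
```

Note that this simple division gives 0.001 nit at a 1000-nit peak; the 0.00003-nit figure quoted above would imply an even higher effective contrast.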
__________________
LAV Filters - open source ffmpeg based media splitter and decoders

Last edited by nevcairiel; 2nd February 2020 at 00:04.
Old 2nd February 2020, 00:34   #58457  |  Link
toki
Registered User
 
Join Date: Apr 2019
Posts: 69
OK, I'm back. I'm having trouble fine-tuning things 100% to my liking. I'm using sharpen edges at 0.6 and have tried multiple other combos, but I still can't find anything I'm completely satisfied with. What would your suggestions be? I am on a Sony X950G.
Old 2nd February 2020, 07:05   #58458  |  Link
ryrynz
Registered User
 
 
Join Date: Mar 2009
Posts: 3,646
Quote:
Originally Posted by tp4tissue View Post
Don't worry, OLED is obsolete THIS YEAR.
So far from the truth it's laughable. OLED is ramping up in a big way: more manufacturers, smaller screens, lower prices, lower power consumption.
Unless you have a FALD screen you cannot compete with this technology, and even then you're typically at a disadvantage in multiple areas.
However, the Samsung Q90R is very nice as far as FALD screens go, and the only comparable set to an OLED AFAIC; it's a shame it comes at a premium, being an 8K set.
QD-OLED vs LG OLED is the battle for premium coming soon. Samsung knows it needs self-emissive tech to truly compete.

Quote:
Originally Posted by toki View Post
ok, I'm back. I'm having trouble fine tuning to my liking 100%.
Sounds like you should be fine-tuning or calibrating your set; that's always the starting point, and this is not the place for that. Nice TV, though.

Last edited by ryrynz; 2nd February 2020 at 07:10.
Old 2nd February 2020, 09:41   #58459  |  Link
tp4tissue
Registered User
 
 
Join Date: May 2013
Posts: 706
Quote:
Originally Posted by nevcairiel View Post
Personally I would rather buy a dual-layer LCD then an OLED. OLEDs downsides are just so annoying when they start hitting. Nevermind that I regularly use my TV at average lighting conditions, where higher brightness helps.

Perfect blacks? Well sure, nothing but self-emissive will ever reach 0, but blacks in the range of 0.00003 nits and a static contrast of 1.000.000:1 (1000:1 squared due to two LCD filters) is quite substantial - and with a significantly higher peak brightness then OLED. And no burn-in or ABL.

Viewing Angles are an old LCD problem and there is a lot of technology to work around it, so I'm not all that worried. And afterall, how off-center do viewers usually sit.

I do hope that 2020 might see the first production TV with that tech, but it does not appear that the major TV makers are quite on board with that yet. Maybe they are afraid it would cannibalize their investments into other tech too much. (Only Hisense and Panasonic openly confirmed that they are working on this technology).
But that announced Hisense TV at least looks pretty good on paper, with the cited black values and contrast above, with full DCI-P3 coverage and 1000 nits peak brightness. Well, we'll see later this year if it holds up, or gets delayed.
Precisely. Add to that that OLED is being phased out for color grading.

Look at it this way: if they're no longer grading on OLED, what use is OLED? Why would you buy something the disc was not graded for?

Sony has phased out their OLED HDR grading monitor for Panasonic's LMCL. On top of that, Panasonic released their 50" MegaCon LMCL large grading monitor.

Why not OLED? Both its color and brightness are unstable, which makes it inherently unreliable. It will ALWAYS be limited by ABL.

Contrast is also relative: just as you don't NEED 10,000 nits to give a good impression of daylight, you also don't NEED 0 nits to give a good impression of black.

In fact, 0.01 nit is black enough for anything except a pitch-black room attempting to display another pitch-black room.

LMCL is the future. It is compatible with ALL backlight strobing technology for motion clarity.

It will NOT burn in.

It can hit the upcoming 4,000-nit target.

OLED will probably NEVER hit 4,000 nits, and even if it did: rapid burn-in.

LMCL will do 4,000 nits all day and won't break a sweat. It is even ready to do 10,000 nits today, if only people could afford it.

For OLED to catch up to even 2,000 nits, it would need a completely different chemistry. So they'd have to invent that, THEN build new factories, all the while LMCL is absorbing all the high-end dollars.

OLED will always be there for mobile, but in large formats it's very likely game over.

The downsides of OLED are too many; perfect black just isn't that important.

With LMCL, theoretically, using 2x VA panels with an X-Wide filter: 7000² / 3 ≈ 16.3 million:1 contrast ratio.

That would give ~0.0006-nit blacks at 10,000 nits. Your non-industrial colorimeter can't even reliably measure that low (~0.003).

In terms of consumer usage scenarios, more and more people are using their TVs as PC monitors or cast interfaces. These all have many static elements, something OLED simply cannot endure.
__________________
Ghetto | 2500k 5Ghz

Last edited by tp4tissue; 2nd February 2020 at 10:19.
Old 2nd February 2020, 10:55   #58460  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
The laws of thermodynamics... I guess let's just ignore them.

What is an LCD? It's not a diode; it doesn't produce light. It's made to "block" light (change its polarisation so the polarising filter blocks more of it). So by using two of them, you get more brightness? By making the light pass through more objects, which costs energy? I'm not saying it will massively lower light output, but it clearly won't increase it. And do I even have to explain the serious heat and power-consumption problems if you manage light output not with the backlight, but only with the LCD blocking light through a polarisation filter?

What produces the light in an LCD TV? The backlight, as everyone should know: CCFL, LED, or even OLED is a possibility, and this part matters most for getting more nits. What lets a dual-layer LCD use a different backlight, and stops a single-layer one from using the same backlight?

And just for fun: the micro LED from Sony is an OLED. BTW, you don't master content for one display technology, or for a similar or identical display; the whole point of calibration is to reproduce the same color on all device types.

Until the technology is out and proven, nothing changes. It's just marketing garbage, and you are falling for it.
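huhn's efficiency point (an LCD layer only attenuates light, so stacking two layers costs brightness) can be put in numbers. The transmission figure here is an assumed illustrative value, not a measured panel spec:

```python
# One fully-open LCD layer still absorbs light in its polarisers and
# filters. Assume it transmits a fraction t of the backlight:
t = 0.30                 # assumed single-layer transmission (illustrative)
single = t               # light reaching the viewer through one layer
dual = t * t             # two layers in series multiply the loss
boost = single / dual    # extra backlight power needed for the same peak
print(f"single={single:.2f}, dual={dual:.2f}, "
      f"backlight must be {boost:.2f}x brighter")
```

Whatever the real per-layer transmission is, the second layer always multiplies the loss, which is why a dual-layer set needs a substantially brighter (hotter, hungrier) backlight for the same peak output.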