Old 16th February 2019, 22:09   #54821  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 981
I also prefer the HDR images. Even at 100 target nits, the improvement in contrast is noticeable. Maybe the only complaint I'd have is that the gamma looks ever so slightly raised in the HDR shots, which you can notice in some of the color tones. But none of the images look overbright.
Old 16th February 2019, 22:34   #54822  |  Link
tp4tissue
Registered User
 
tp4tissue's Avatar
 
Join Date: May 2013
Posts: 128
Quote:
Originally Posted by svengun View Post
Hi SamuriHL, I also have a GTX 1060 and an LG OLED.

Do you use tone mapping? When I apply tone mapping, rendering times go from 30 ms to 55-60 ms, which makes it unwatchable.

Only when I lower a lot of settings (chroma upscaling from NGU High to Low, image upscaling to NGU Low and image downscaling to Spline) does it become somewhat acceptable (30-35 ms).

And strangely enough, I get the best results with 800 nits, while my 2015/16 OLED is rated at 540 nits.

If you do use tone mapping, would you mind sharing your settings?

You're upscaling 4K to 8K with NGU and then downscaling back to 4K? You shouldn't do this.

NGU only touches chroma upscaling when going 4K to 4K; luma is 1:1, so whatever NGU setting you pick for luma upscaling doesn't affect performance.


On my 1060, for 4K HDR, I have it on NGU Medium (chroma), with tone mapping to SDR + highlight recovery, and it runs at ~22-30 ms. Luma is 1:1, untouched.

My 1060 runs at 2088 MHz. It can go to 2112 MHz, but it doesn't always auto-boost that high because madVR is not a saturating load at 22-30 ms.


For 1920x1080 upscaling, the settings are NGU High chroma and NGU High luma.
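A quick way to see the 1:1 luma point, as a rough Python sketch (illustrative only, not anything madVR exposes; the function and its name are mine): with 4:2:0 content the chroma planes are stored at half the luma resolution, so when the output matches the source's luma resolution only chroma needs resampling and the luma upscaler never runs.

Code:
# Illustrative sketch only -- not madVR code. For 4:2:0 video the chroma planes
# are half the luma resolution, so a 4K -> 4K render only resamples chroma.
def planes_to_scale(src_w, src_h, out_w, out_h):
    """Return which planes actually get resampled for 4:2:0 content."""
    chroma = (src_w // 2, src_h // 2)          # 4:2:0 subsampling
    work = []
    if chroma != (out_w, out_h):
        work.append("chroma upscale")
    if (src_w, src_h) != (out_w, out_h):
        work.append("luma scale")
    return work or ["nothing (1:1 passthrough)"]

print(planes_to_scale(3840, 2160, 3840, 2160))  # ['chroma upscale'] -> luma stays 1:1
print(planes_to_scale(1920, 1080, 3840, 2160))  # ['chroma upscale', 'luma scale']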
__________________
Ghetto | 2500k 5Ghz

Last edited by tp4tissue; 16th February 2019 at 22:49.
Old 16th February 2019, 23:58   #54823  |  Link
tp4tissue
Registered User
 
tp4tissue's Avatar
 
Join Date: May 2013
Posts: 128
Quote:
Originally Posted by Warner306 View Post
I also prefer the HDR images. Even at 100 target nits, the improvement in contrast is noticeable. Maybe the only complaint I'd have is that the gamma looks ever so slightly raised in the HDR shots, which you can notice in some of the color tones. But none of the images look overbright.

The contrast is static, no? So whatever tone mapping we do at 100 nits is trading crush for highlights?
__________________
Ghetto | 2500k 5Ghz
Old Yesterday, 00:00   #54824  |  Link
tp4tissue
Registered User
 
tp4tissue's Avatar
 
Join Date: May 2013
Posts: 128
Quote:
Originally Posted by IceB View Post
I have read several threads comparing HDR to SDR / UHD 10-bit to FHD and finally decided to give it a go, as there is more and more UHD 10-bit HDR content.
After some learning curve with the latest build of madVR and testing with the "Mehanik HDR10 test patterns", I have tuned my settings and made a comparison of several frames from different versions (FHD / UHD 10-bit HDR) of the same great action movie, "Mission: Impossible - Fallout".
I have also tested other movies and my conclusion is unequivocal - I definitely prefer the UHD 10-bit HDR compressed to SDR by madVR over the FHD 8-bit version on my rig. I watch movies in a dedicated HT room, small but with all-black walls and ceiling, on a 96" CARADA screen with a good old JVC DLA-X30 that provides very good black levels. The lamp has around 2500 hours on it, but is still good for a 100-nit target in madVR. The PJ was calibrated a while ago with my old MONACO XR DTP94 colorimeter to a Rec. 709 gamma 2.2 curve, and since then I have been using the i1Pro spectro, but I'm waiting until I order a new lamp to get it recalibrated.

Since I have been looking for detailed A/B images on the web and could not find anything detailed and interesting (sorry if I missed something), I made a single-frame comparison for you and uploaded it to my website with a before/after view, including the madVR settings, running on a GTX 1070 Ti. Some frames show more obvious differences than others, but as mentioned, the UHD content compressed by madVR ends up with more clarity: more detail in the near-whites (low highlight recovery is applied), deeper and punchier colors, and much more detailed and contrasty darks, which adds depth to the picture. The textures are a bit sharper as well.

Enjoy and let me know what you think.

here

Hover the mouse over the screenshots.

Have you compared the LUT from the colorimeter alone against spectro + colorimeter?

Is it a huge difference? I'm hunting for a spectro at the moment.
__________________
Ghetto | 2500k 5Ghz
Old Yesterday, 01:10   #54825  |  Link
IceB
Registered User
 
Join Date: Aug 2011
Posts: 21
Quote:
Originally Posted by tp4tissue View Post
Have you compared the LUT from the colorimeter alone against spectro + colorimeter?

Is it a huge difference? I'm hunting for a spectro at the moment.
Sure there is - it's a different creature.
I got the spectro mostly for work. I'll give the PJ a go once I get my hands on the new lamp. Profiling/calibration is time-consuming.
Old Yesterday, 01:12   #54826  |  Link
IceB
Registered User
 
Join Date: Aug 2011
Posts: 21
Quote:
Originally Posted by Warner306 View Post
I also prefer the HDR images. Even at 100 target nits, the improvement in contrast is noticeable. Maybe the only complaint I'd have is that the gamma looks ever so slightly raised in the HDR shots, which you can notice in some of the color tones. But none of the images look overbright.
I can definitely live with that. The benefits are obvious.
Old Yesterday, 01:28   #54827  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 577
Quote:
Originally Posted by IceB View Post
since then I have been using the i1Pro spectro, but I'm waiting until I order a new lamp to get it recalibrated
Don't waste your money on an X-Rite "recalibration" of your i1Pro. It's not a re-calibration, it's a re-certification. They measure it, and they certify it if it's still within spec (which it should be, as these don't drift, unlike filter-based colorimeters). They don't correct or change anything. If it's out of spec, it's "defective" and they charge you more to fix it. But if it were defective, you would know it, no? So unless you suspect something is wrong (and are ready to pay more than the initial recert), I'd just skip it.

I wasted the money once, and when I saw that the numbers on the new certificate were exactly the same as on my original certificate (the one you get when you buy the thing), I asked them, and that's how I found out. So personally I won't do it again.

I guess it's worth doing if you do pro calibrations, but then people expect a better spectro than an i1Pro in that case.

I've had my i1Pro 2 for years and I don't believe it has drifted one bit. My projectors, on the other hand, especially after a few thousand hours...

Just my 2 cents, of course - it's your money.
__________________
MBP 13" 2018 Win10 Pro x64 b1809 MCE
i7 8559U@4.1Ghz 16Gb@3.8Ghz EVGA GTX 1080 Ti SC2 11Gb@2GHz 418.81 RGB Full 12bits
MPC-BE/LAV/MadVR/jRiver/MyMovies V5.25
Denon X8500H>Maestro>JVC RS2000
Old Yesterday, 01:29   #54828  |  Link
tp4tissue
Registered User
 
tp4tissue's Avatar
 
Join Date: May 2013
Posts: 128
Quote:
Originally Posted by IceB View Post
Sure there is - it's a different creature.
I got the spectro mostly for work. I'll give the PJ a go once I get my hands on the new lamp. Profiling/calibration is time-consuming.
I just meant by eye - did the spectro produce a significantly different image compared to the colorimeter alone?

I'm not talking about measurements, since it's already known that a spectro will improve delta E greatly.


Also, @Manni, perhaps you could chime in on my question as well.
__________________
Ghetto | 2500k 5Ghz

Last edited by tp4tissue; Yesterday at 01:31.
Old Yesterday, 11:34   #54829  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 5,252
It depends on your end device and how far off your white point is with a colorimeter.

Don't forget this is mostly about the white point, and an off white point doesn't matter that much as long as it is consistently off by the same margin, because your eyes adapt and correct for it - one of the reasons you may not notice the drift.

The i1D3 drifts, but not by much; it also depends on how it is stored.
Spectrometers drift like nobody's business in comparison.
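For anyone wondering how that white-point error is usually quantified, here is a minimal Python sketch with made-up values (not real meter readings), using the simple CIE76 delta E between the target white and a measured white. Calibration software typically uses dE2000, so treat this only as the idea.

Code:
# Minimal sketch with made-up numbers, not real meter data. A constant
# white-point offset like this is far less visible than a drifting one,
# because your eyes adapt to a stable error.
def delta_e76(lab1, lab2):
    """CIE76 colour difference: Euclidean distance in L*a*b*."""
    return sum((a - b) ** 2 for a, b in zip(lab1, lab2)) ** 0.5

target_white   = (100.0, 0.0, 0.0)   # reference white in L*a*b*
measured_white = (100.0, 1.2, -0.8)  # hypothetical slightly off reading

print(f"white point error: {delta_e76(target_white, measured_white):.2f} dE76")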
Old Yesterday, 13:33   #54830  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 981
Quote:
Originally Posted by tp4tissue View Post
The contrast is static, no? So whatever tone mapping we do at 100 nits is trading crush for highlights?
Yes, but you are rolling off the brightest information, which handles the highlights more gracefully than clipping would. Raising the target nits slightly would change the color saturation, but then the image would be darker relative to the SDR screenshots. So you are right that there is a tradeoff.
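To make the trade-off concrete, here is a toy roll-off in Python. It is emphatically not madVR's tone-mapping curve (the knee, peak and function are made up); it just illustrates the idea: pass the lower range through 1:1 and compress everything above a knee into the remaining headroom instead of hard-clipping it.

Code:
# Toy example only -- NOT madVR's algorithm. Below the knee the signal passes
# through unchanged; above it, highlights up to the source peak are squeezed
# into the headroom left under the 100-nit target instead of being clipped.
def tone_map(nits_in, target=100.0, peak=1000.0, knee=0.75):
    """Map HDR luminance (cd/m2) into a target-nit SDR range."""
    knee_nits = knee * target
    if nits_in <= knee_nits:
        return nits_in                        # linear 1:1 part
    x = (nits_in - knee_nits) / (peak - knee_nits)
    return knee_nits + (target - knee_nits) * 2.0 * x / (1.0 + x)

for n in (50, 75, 200, 500, 1000):
    print(f"{n:5d} nits in -> {tone_map(n):6.1f} nits out")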
Old Yesterday, 16:26   #54831  |  Link
alexnt
Registered User
 
Join Date: Jan 2017
Posts: 9
Hi, can someone explain to me why the CPU usage is so high?
Attachments Pending Approval
File Type: jpg Screen Shot 02-17-19 at 05.10 PM.jpg
Old Yesterday, 16:38   #54832  |  Link
madjock
Registered User
 
Join Date: May 2018
Posts: 144
Quote:
Originally Posted by alexnt View Post
Hi, can someone explain to me why the CPU usage is so high?
Upload the image to an external image host and post a link, as attachments take a while to be approved here.

You will also need to give more information on your setup and settings.
__________________
HTPC: Windows 10 - v1809, I5 3570k, Asus GTX 1050 OC 2GB,
Drivers : 385.28, Yamaha RX-V377, Philips 65PUS6703 - 65"
Old Yesterday, 16:54   #54833  |  Link
alexnt
Registered User
 
Join Date: Jan 2017
Posts: 9
Quote:
Originally Posted by madjock View Post
Upload the image to an external image host and post a link, as attachments take a while to be approved here.

You will also need to give more information on your setup and settings.
https://postimg.cc/JstcR1SP

i5 4690k stock
r9 290 4gb stock
16gb ram
settings in madvr?
Old Yesterday, 17:13   #54834  |  Link
tp4tissue
Registered User
 
tp4tissue's Avatar
 
Join Date: May 2013
Posts: 128
Quote:
Originally Posted by alexnt View Post
https://postimg.cc/JstcR1SP

i5 4690k stock
r9 290 4gb stock
16gb ram
settings in madvr?
That's not high CPU usage; the core is only running at a 35x multiplier instead of full speed, so the percentage just looks higher.

But if you enable D3D11 hardware acceleration in LAV Filters, it may bring that down further, assuming GPU acceleration works for the file you're playing.


If you delid and overclock to 4.8 GHz, it'll look even lower.
Keep in mind that if you play any 4K files with 64-bit MPC-HC CPU decoding, it will use an AVX code path, which may require lowering the memory frequency to keep 4.8 GHz stable.
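A back-of-the-envelope sketch of the multiplier point (the numbers are made up, not read from alexnt's screenshot): the same amount of decoding work shows up as a bigger percentage while the core sits at 35x than it would at a higher clock.

Code:
# Back-of-the-envelope illustration with made-up numbers. Utilisation reported
# while the core idles at a low multiplier overstates the load compared to
# what the same work would cost at a higher clock.
def usage_at_clock(reported_usage, reported_mult, new_mult):
    """Scale a CPU-usage percentage from one multiplier to another,
    assuming the amount of work per second stays the same."""
    return reported_usage * reported_mult / new_mult

# e.g. ~40% reported at 35x would look like this at a 48x (4.8 GHz) overclock:
print(f"{usage_at_clock(40, 35, 48):.1f}% at 48x")   # ~29.2%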
__________________
Ghetto | 2500k 5Ghz

Last edited by tp4tissue; Yesterday at 17:17.
Old Yesterday, 17:29   #54835  |  Link
alexnt
Registered User
 
Join Date: Jan 2017
Posts: 9
Quote:
Originally Posted by tp4tissue View Post
That's not high CPU usage; the core is only running at a 35x multiplier instead of full speed, so the percentage just looks higher.

But if you enable D3D11 hardware acceleration in LAV Filters, it may bring that down further, assuming GPU acceleration works for the file you're playing.


If you delid and overclock to 4.8 GHz, it'll look even lower.
Keep in mind that if you play any 4K files with 64-bit MPC-HC CPU decoding, it will use an AVX code path, which may require lowering the memory frequency to keep 4.8 GHz stable.
It goes from 35 to 38 and has some drops to around 20, and the render time is lower than I remember (it was around 18-20s).

https://postimg.cc/nXJ5cSvX
Settings in LAV Filters.

As I remember it, I used to have lower CPU usage and higher render times.
I had some time to watch a movie yesterday, and when I switched to fullscreen the monitor went black. It was fullscreen exclusive mode that had the problem (I had never had it before); I have used madVR for years with no problems.
So I uninstalled madVR and MPC before I figured it out.
Then I reinstalled and noticed those numbers.
Do you think it's OK?
Old Yesterday, 22:38   #54836  |  Link
KoD
Registered User
 
Join Date: Mar 2006
Posts: 545
Regarding the i1Pro re-certification: they measure 12 tiles or so to check whether the measurement deviations are still inside a certain dE range, and do another check to see whether repeated measurements of the white tile are within 0.1 dE. If they are, that's all. If they aren't, they can replace the light source in the spectro with a new one. The "Certificate of Performance" is only valid for one year.

I have a request for madshi regarding madVR: right now, madVR is able to pass the HDR metadata through to the display. I would like to request the ability to also send Dolby Vision metadata to the display. This kind of metadata can now be preserved in the HEVC stream when using x265 3.0.

LG provides MP4 video files for SDR / HDR / Dolby Vision at these links, which can be used for testing:
■ SDR - https://lgtca.box.com/v/SDR
■ HDR - https://lgtca.box.com/v/HDR
■ Dolby Vision - https://lgtca.box.com/v/DoVi

When playing the files from a USB stick connected to one of the TV's USB ports, the TV switches to HDR or Dolby Vision as expected. I imagine the signaling in the video stream is all that's needed for that. PC games can also switch on Dolby Vision, so sending the metadata from a graphics card should be possible as well.

As a side note: from my experiments, for Dolby Vision to work with Mass Effect Andromeda on the LG 65C8, the GPU needs to be configured to send RGB full range at 8 bits. Using YCbCr will not enable Dolby Vision - you'll get a pink, corrupted image.

These files are mentioned in the LG OLED 2019 AutoCal FAQ v2.0 PDF that Ted posted on AVSForum, as a way for users to make the TV switch into HDR or DV mode so that those modes can be calibrated.
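For reference, this is roughly the static metadata that HDR10 passthrough already forwards (SMPTE ST 2086 mastering-display info plus the content light levels); Dolby Vision adds dynamic per-scene metadata on top of it, which is what the request above is about. The sketch below is just a plain data container with field names of my own choosing and illustrative values, not anything from madVR or the LG files.

Code:
# Rough sketch of HDR10 static metadata; field names and values are
# illustrative only (BT.2020 primaries, D65 white, 1000-nit master).
from dataclasses import dataclass

@dataclass
class Hdr10StaticMetadata:
    red_primary: tuple    # CIE xy chromaticities
    green_primary: tuple
    blue_primary: tuple
    white_point: tuple
    max_mastering_luminance: float   # cd/m2
    min_mastering_luminance: float   # cd/m2
    max_cll: int                     # maximum content light level, cd/m2
    max_fall: int                    # maximum frame-average light level, cd/m2

example = Hdr10StaticMetadata(
    red_primary=(0.708, 0.292), green_primary=(0.170, 0.797),
    blue_primary=(0.131, 0.046), white_point=(0.3127, 0.3290),
    max_mastering_luminance=1000.0, min_mastering_luminance=0.0001,
    max_cll=1000, max_fall=400,
)
print(example)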
Old Today, 00:49   #54837  |  Link
tp4tissue
Registered User
 
tp4tissue's Avatar
 
Join Date: May 2013
Posts: 128
Quote:
Originally Posted by alexnt View Post
It goes from 35 to 38 and has some drops to around 20, and the render time is lower than I remember (it was around 18-20s).

https://postimg.cc/nXJ5cSvX
Settings in LAV Filters.

As I remember it, I used to have lower CPU usage and higher render times.
I had some time to watch a movie yesterday, and when I switched to fullscreen the monitor went black. It was fullscreen exclusive mode that had the problem (I had never had it before); I have used madVR for years with no problems.
So I uninstalled madVR and MPC before I figured it out.
Then I reinstalled and noticed those numbers.
Do you think it's OK?

I don't know what you saw before and after.

If DXVA shows as inactive while the movie is playing, that means GPU acceleration is not compatible with the file you're trying to play.

In that case it will fall back to CPU decoding. Different files have different compatibility.

But overall, it does not look high to me.

Again, you have other options, such as overclocking, if you need more oomph for hard-to-decode files (not this one, obviously).
__________________
Ghetto | 2500k 5Ghz
Old Today, 01:07   #54838  |  Link
el Filou
Registered User
 
el Filou's Avatar
 
Join Date: Oct 2016
Posts: 357
Quote:
Originally Posted by alexnt View Post
Hi, can someone explain to me why the CPU usage is so high?
https://postimg.cc/nXJ5cSvX
Settings in LAV Filters.
Check those LAV settings while playing a movie that gives high CPU load, and see which decoder is active. If it's avcodec, then hardware decoding isn't active, and that could explain it.
With hardware acceleration, CPU usage should be very low - definitely much lower than 66% at 3.5 GHz.
(Edit: do you have features like black bar detection enabled in madVR? That can use up some CPU, but I don't know how much exactly.)
__________________
HTPC: W10 1803, E7400, 1050 Ti, DVB-C, Denon 2310, Panasonic GT60 | Desktop: W10 1803, 4690K, HD 7870, Dell U2713HM | Laptop: Insider Slow, i5-2520M | MediaPortal 1/MPC-HC, LAV Filters, ReClock, madVR

Last edited by el Filou; Today at 01:16.
Old Today, 02:08   #54839  |  Link
Alexkral
Registered User
 
Join Date: Oct 2018
Posts: 20
Quote:
Originally Posted by Asmodian View Post
Probably not, unless you are using a definition of dynamic range that also works for contrast. What is the difference, exactly? You never really increase the dynamic range or contrast of the screen, obviously, but the image looks like it has more contrast.
I missed your post, but I think it's a good question. Dynamic range refers to the ratio between the maximum and minimum absolute luminance values, while contrast refers to the relative difference in luminance between two or more values. By reducing the dynamic range you are increasing the contrast, because the relative range doesn't change: you have the same space to represent a smaller range of absolute values, so the values are more separated from each other.

The distinction between absolute and relative is most of the time the key to understanding the process and its implications, but it is somewhat difficult because it is counterintuitive: you are thinking about a reduction and assume that absolute things don't change while relative things do.

Think of it as if we had both ranges superimposed and started to extend the absolute range until the new, lower absolute range fits inside the relative range. That's what happens when clipping. With tone mapping you instead compress all the values you would otherwise have lost into the top of the relative range, which limits the increase in contrast to the uncompressed values.
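A quick numeric toy example of the clipping-vs-compression distinction (the values and the compression function are made up for illustration, not madVR's): hard clipping collapses everything above the display peak into one value, while compressing the top of the range keeps the bright values separated, at the cost of some contrast in the upper mid-range.

Code:
# Toy numbers only. The "display" peaks at 100 nits; the source goes to 1000.
src = [0.05, 1, 10, 50, 100, 400, 1000]          # absolute luminance in the master
peak_out = 100.0

clipped = [min(v, peak_out) for v in src]        # hard clip: 100, 400 and 1000
                                                 # all land on 100 -> detail gone

def compress(v, knee=50.0, peak_in=1000.0, peak=peak_out):
    """Keep 0-50 nits untouched, squeeze 50-1000 linearly into 50-100."""
    if v <= knee:
        return v
    return knee + (peak - knee) * (v - knee) / (peak_in - knee)

compressed = [round(compress(v), 1) for v in src]

print("clip:    ", clipped)     # [0.05, 1, 10, 50, 100, 100, 100]
print("compress:", compressed)  # [0.1, 1, 10, 50, 52.6, 68.4, 100.0]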

Last edited by Alexkral; Today at 02:31.
Old Today, 06:39   #54840  |  Link
70MM
X Cinema Projectionist NZ
 
Join Date: Feb 2006
Location: Auckland NZ
Posts: 271
Does "don't render frames during fade in and fade out" do anything like add artifacts to the image?
If one doesn't need it because their rendering stats are low, is it better to untick it?