Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion.

Old 8th January 2019, 22:00   #54161  |  Link
NM20
Registered User
 
Join Date: Jan 2019
Posts: 13
O.k. I have read a fair bit and you guys (and gals) seem to know what you are talking about.

I have a 1080 Ti and am sending the image to a JVC projector that is 4K capable.

My questions are:

I am using 'tone map HDR using pixel shaders'. Would ticking 'measure each frame's peak luminance' be the correct thing to do?

Also, where do I check the DXVA scaling/decoding? Is it in LAV filters?

Thanks
Old 8th January 2019, 22:04   #54162  |  Link
SamuriHL
Registered User
 
 
Join Date: May 2004
Posts: 5,351
Quite true. That's where the compromises come into play if you're on a performance-limited card, however. Do you lean more toward the HDR processing (I am right now) or toward chroma upscaling? It really depends on what you've got under the hood and the content you're trying to watch. And that's also where profiles can come into play. Something I've still not delved into myself yet but really should soon. Because you can crank up chroma upscaling when you're e.g. scaling a Blu-ray to UHD resolution, and sacrifice chroma upscaling to save performance for HDR tone mapping when watching UHD content. That's just one example, but you get the idea.

At the moment I'm fighting with my 1060, AVR, and TV to get my picture to not be cut off. I tried activating scaling in the nVidia control panel but it kills HDR as soon as I enable it. So I'm stuck with native res that's now for whatever reason getting cut off on the TV. I really can't win. LOL
__________________
HTPC: Windows 11, AMD 5900X, RTX 3080, Pioneer Elite VSX-LX303, LG G2 77" OLED
Old 8th January 2019, 22:26   #54163  |  Link
ryrynz
Registered User
 
 
Join Date: Mar 2009
Posts: 3,646
Quote:
Originally Posted by SamuriHL View Post
However, whether you can tell the difference between NGU High AA and Bicubic 75 is an individual thing, and only you can decide with your eyes if you can see a difference.
Yeah, there will be scenes where I think I would notice a difference, but it's not a question of that; I know from my own investigations that NGU AA is superior to Bicubic 75 and I have the performance to use it. The card is there to be used, so why not eke everything I think is worth it out of it? Nothing wrong with Bicubic 75, though; it's one of the better lower-end choices, and nobody is going to knock you for using it.
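As an aside on Bicubic 75: the number is madVR's sharpness setting for the cubic kernel. Assuming it corresponds to the classic Keys cubic with a = -0.75 (an assumption on my part, worth double-checking against madVR's docs), a one-dimensional version looks like this sketch:

```python
def keys_cubic(x, a=-0.75):
    """Keys cubic interpolation kernel; a = -0.75 is assumed here to be
    what madVR labels 'Bicubic 75'."""
    x = abs(x)
    if x <= 1.0:
        return (a + 2.0) * x**3 - (a + 3.0) * x**2 + 1.0
    if x < 2.0:
        return a * x**3 - 5.0 * a * x**2 + 8.0 * a * x - 4.0 * a
    return 0.0

def bicubic_1d(p0, p1, p2, p3, t):
    """Interpolate between samples p1 and p2 at fractional offset t in [0, 1)
    using the 4 surrounding taps. Full bicubic applies this horizontally,
    then vertically."""
    return (p0 * keys_cubic(t + 1.0) + p1 * keys_cubic(t)
            + p2 * keys_cubic(t - 1.0) + p3 * keys_cubic(t - 2.0))
```

A more negative a gives a sharper (and more ring-prone) kernel, which is the trade-off the sharpness number exposes.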

Quote:
Originally Posted by SamuriHL View Post
It really depends on what you've got under the hood and the content you're trying to watch. And that's also where profiles can come into play.
Very much this. People need to be using profiles; they're so easy to set up.
For full HD content on my 1080 I do a 2x supersample with NGU Sharp and downscale with SSIM 2D. It's demanding as hell, but because I have no other sharpening on my set or in madVR, the resulting picture is very nice: the edges are sharp, the picture does not look sharpened at all, and the whole thing has a very high-res sort of pop to it.
For hand-drawn animated content I don't do this because the detail isn't there (most anime is actually captured at 720p, so much of it is upscaled anyway), and I've tested what I gain from supersampling it: next to nothing. You've got to know where to invest those resources. From the 750 Ti -> 960 -> 1060 6GB -> 1080 it's been a balancing act I think I've managed particularly well; not a lot has changed when it comes to non-HDR content upscaling/processing.
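Setting one of these profiles up is mostly a matter of attaching rules to a profile group in madVR. A sketch of what such rules can look like; the keywords and exact syntax here are from memory, so check them against the examples madVR's profile editor shows:

```
if (HDR) "UHD HDR"
else if (srcHeight <= 1080) "HD upscale"
else "UHD SDR"
```

The idea being that the "HD upscale" profile carries the expensive chroma/image upscaling while "UHD HDR" dials scaling back to leave headroom for tone mapping.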

Quote:
Originally Posted by NM20 View Post
O.k. I have read a fair bit and you guys (and gals) seem to know what you are talking about.

I am using 'tone map HDR using pixel shaders'. Would ticking 'measure each frame's peak luminance' be the correct thing to do?
It's something you can do for better highlights; there's nothing right or wrong about any of the options. If you can enable it without frame drops, then do it.
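To see what "measure each frame's peak luminance" buys you, here is a deliberately simplified per-frame tone mapping sketch (an extended-Reinhard curve chosen purely for illustration; madVR's actual curve is different):

```python
def tone_map_nits(nits, display_peak, frame_peak):
    """Toy per-frame tone mapping: compress highlights only when this
    frame's measured peak exceeds what the display can show.
    Illustration only; NOT the curve madVR actually uses."""
    if frame_peak <= display_peak:
        return nits  # frame fits the display, pass through untouched
    L = nits / display_peak          # luminance in display-relative units
    Lw = frame_peak / display_peak   # frame peak in the same units
    # extended Reinhard: ~linear near black, maps frame_peak -> display_peak
    Ld = L * (1.0 + L / (Lw * Lw)) / (1.0 + L)
    return Ld * display_peak

# With static metadata (say MaxCLL 4000 nits) every frame would be
# compressed; measuring per frame leaves a 500-nit frame untouched.
bright = tone_map_nits(4000.0, 600.0, 4000.0)  # ~600: mapped to display peak
dark = tone_map_nits(100.0, 600.0, 500.0)      # 100.0: no compression at all
```

That pass-through case is the win: dark scenes keep full brightness instead of being dimmed to make room for highlights that aren't in the frame.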

Quote:
Originally Posted by NM20 View Post
Also, where do I check the DXVA scaling/decoding? Is it in LAV filters?
Yes, those options are under 'Hardware decoder to use'.
You may wish to select the chroma DXVA options in 'Trade quality for performance' if you want the best performance from your card at the expense of image quality; evaluate for yourself.

Last edited by ryrynz; 8th January 2019 at 22:43.
Old 8th January 2019, 23:24   #54164  |  Link
NM20
Registered User
 
Join Date: Jan 2019
Posts: 13
Quote:
Originally Posted by ryrynz View Post
Yes, those options are under 'Hardware decoder to use'.
You may wish to select the chroma DXVA options in 'Trade quality for performance' if you want the best performance from your card at the expense of image quality; evaluate for yourself.
What option should I be picking in LAV with a 1080ti to get the best performance and the HDR tone mapping? CUVID? Native? Copy back?

Thanks for the help so far.
Old 9th January 2019, 00:14   #54165  |  Link
SamuriHL
Registered User
 
 
Join Date: May 2004
Posts: 5,351
Quote:
Originally Posted by ryrynz View Post
Yeah, there will be scenes where I think I would notice a difference, but it's not a question of that; I know from my own investigations that NGU AA is superior to Bicubic 75 and I have the performance to use it. The card is there to be used, so why not eke everything I think is worth it out of it? Nothing wrong with Bicubic 75, though; it's one of the better lower-end choices, and nobody is going to knock you for using it.
I only changed to bicubic75 in the latest test build because I'm preferring the tonemapping options right now as they are quite impressive. Eventually I'll tweak everything and settle on a happy medium somewhere. I agree wholeheartedly that NGU AA absolutely rocks and I can definitely see a difference depending on content. I'll get back to it. Then again, I'm looking at some rebuilding of machines this year at some point so I'll have the horsepower to crank everything back up. The 1060 was a compromise when I bought it. I really wanted a 1070 but the stupid crypto nonsense was going on and I couldn't find one local and online prices were ridiculous. Everything is a compromise.

Quote:
Originally Posted by ryrynz View Post
Very much this. People need to be using profiles; they're so easy to set up.
For full HD content on my 1080 I do a 2x supersample with NGU Sharp and downscale with SSIM 2D. It's demanding as hell, but because I have no other sharpening on my set or in madVR, the resulting picture is very nice: the edges are sharp, the picture does not look sharpened at all, and the whole thing has a very high-res sort of pop to it.
For hand-drawn animated content I don't do this because the detail isn't there (most anime is actually captured at 720p, so much of it is upscaled anyway), and I've tested what I gain from supersampling it: next to nothing. You've got to know where to invest those resources. From the 750 Ti -> 960 -> 1060 6GB -> 1080 it's been a balancing act I think I've managed particularly well; not a lot has changed when it comes to non-HDR content upscaling/processing.
Profiles are absolutely my next time investment. I just haven't gone there yet but it's definitely needed now. When I get back from vacation that's top of the list.
__________________
HTPC: Windows 11, AMD 5900X, RTX 3080, Pioneer Elite VSX-LX303, LG G2 77" OLED
Old 9th January 2019, 00:16   #54166  |  Link
SamuriHL
Registered User
 
 
Join Date: May 2004
Posts: 5,351
Quote:
Originally Posted by NM20 View Post
What option should I be picking in LAV with a 1080ti to get the best performance and the HDR tone mapping? CUVID? Native? Copy back?

Thanks for the help so far.
Unless you NEED copy back for whatever reason (black bar detection, deinterlacing, etc), I would personally recommend D3D11. It's what I've been using for a long time now and it works very well.
__________________
HTPC: Windows 11, AMD 5900X, RTX 3080, Pioneer Elite VSX-LX303, LG G2 77" OLED
Old 9th January 2019, 00:17   #54167  |  Link
ryrynz
Registered User
 
 
Join Date: Mar 2009
Posts: 3,646
This should really be asked in the LAV forum, but use D3D11 preferably, as SamuriHL said.
Old 9th January 2019, 00:17   #54168  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,127
Definitely D3D11 Native with pixel shader tone mapping.
Old 9th January 2019, 01:16   #54169  |  Link
tp4tissue
Registered User
 
 
Join Date: May 2013
Posts: 706
Quote:
Originally Posted by huhn View Post
If your screen can do BT.2020 natively it would do something; if it is close to DCI-P3 it should do nothing at all.

Who knows; literally whatever they want. HDR tone mapping doesn't have a spec where one single way is correct.
The difference should be in the image dynamics until it hits its max brightness; then it does whatever it wants.

As I said before, there is no clear answer.

So, everything pretty much works as expected now on my 1050 Ti and 1060. Thanks a lot, huhn.

However, my basement ATI 7870 XT isn't showing the measured screen nits, and the highlight restoration seems to have no effect. Is it not hooking into the old GPU's DirectCompute?
__________________
Ghetto | 2500k 5Ghz
Old 9th January 2019, 02:15   #54170  |  Link
el Filou
Registered User
 
 
Join Date: Oct 2016
Posts: 896
Quote:
Originally Posted by glc650 View Post
I just upgraded from a Gigabyte Radeon HD 7870 to a XFX Radeon RX 580 GTS and now I get dozens of presentation glitches every second when playing any video if the HDMI Scaling slider is set above 0% in the AMD Radeon Settings GUI.
Sounds like an AMD driver bug, then. As scaling for TV overscan compensation is one of the very last driver steps before sending out the picture, it may explain the presentation glitches.
Have you tried exclusive mode? Does your TV really not have any mode without overscan, even a hidden one in a service menu or something?
__________________
HTPC: Windows 10 22H2, MediaPortal 1, LAV Filters/ReClock/madVR. DVB-C TV, Panasonic GT60, Denon 2310, Core 2 Duo E7400 oc'd, GeForce 1050 Ti 536.40
Old 9th January 2019, 04:55   #54171  |  Link
glc650
Registered User
 
Join Date: May 2003
Posts: 77
Quote:
Originally Posted by el Filou View Post
Sounds like an AMD driver bug, then. As scaling for TV overscan compensation is one of the very last driver steps before sending out the picture, it may explain the presentation glitches.
Have you tried exclusive mode? Does your TV really not have any mode without overscan, even a hidden one in a service menu or something?
If it were an AMD bug, though, wouldn't it have shown up on my previous AMD-based card?

I tried exclusive mode but it made no difference.

My TV is pretty old (Mitsubishi LaserView, purchased in '08) and doesn't have many options. The video format options it does have are more for people dealing with black borders or 4:3 content. And I only have slight overscan on all four sides (one or two letters cut off on the madVR OSD). Not really noticeable when I'm watching something, but enough to make navigating Windows fun.
Old 9th January 2019, 05:23   #54172  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
Quote:
Originally Posted by tp4tissue View Post
However, my basement ATI 7870 XT isn't showing the measured screen nits, and the highlight restoration seems to have no effect. Is it not hooking into the old GPU's DirectCompute?
Without knowing your setup I can only guess.

@glc650

Look in the AMD driver for something similar to the "adjust desktop size and position" option that Nvidia has.

This option creates a smaller, shifted image not by scaling it down like underscan but by creating a new resolution, so everything stays bit-perfect on the PC side, which is preferable. It's worth a shot and may fix the presentation issue.
Which Windows version are you running, and can you post a screenshot of your OSD?
Forced overscan is sadly not unusual for older screens.
Old 9th January 2019, 10:00   #54173  |  Link
glc650
Registered User
 
Join Date: May 2003
Posts: 77
Quote:
Originally Posted by huhn View Post
@glc650

Look in the AMD driver for something similar to the "adjust desktop size and position" option that Nvidia has.

This option creates a smaller, shifted image not by scaling it down like underscan but by creating a new resolution, so everything stays bit-perfect on the PC side, which is preferable. It's worth a shot and may fix the presentation issue.
Which Windows version are you running, and can you post a screenshot of your OSD?
Forced overscan is sadly not unusual for older screens.
I'm not sure how to do this. There is a "Custom Resolutions" section in what is left of the Catalyst Control Center settings but I'm not sure what to input. I'm already running at 1920x1080 according to both Windows Display Properties and AMD CCC/Radeon settings.

edit: enabling the option in CCC made it available in Radeon Settings (see pic)
https://1drv.ms/u/s!AnsGKXR_EKR0hCYcEtxKSYFM0GeS
 

Last edited by glc650; 9th January 2019 at 10:10.
Old 9th January 2019, 13:40   #54174  |  Link
madjock
Registered User
 
Join Date: May 2018
Posts: 259
I seem to be going backwards some days with the HDR side.

Which nVidia drivers seem to be the most consistent for toggling HDR?

It seems it all depends on which way the wind is blowing. I used to have no issues, but now it toggles with no TV indication and I have to re-select the input on my TV for it to actually display. I have tried various drivers; some work, then stop. I think the hard part is that it's mostly the indications on my TV I am having issues with, as it does appear to switch to HDR: when I re-select the TV input and the HDR signal indicator comes on the screen, nothing changes apart from the feedback I get from the TV on colour space.
Old 9th January 2019, 14:00   #54175  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
Quote:
Originally Posted by glc650 View Post
I'm not sure how to do this. There is a "Custom Resolutions" section in what is left of the Catalyst Control Center settings but I'm not sure what to input. I'm already running at 1920x1080 according to both Windows Display Properties and AMD CCC/Radeon settings.

edit: enabling the option in CCC made it available in Radeon Settings (see pic)
https://1drv.ms/u/s!AnsGKXR_EKR0hCYcEtxKSYFM0GeS
I don't use AMD cards right now; I don't even know if they have something like this. It would just be a little more disappointing than usual if they don't. A custom resolution may work too.

You didn't say which Windows version this is, but you seem to have two displays, and that can create a lot of presentation glitches.
Old 9th January 2019, 14:42   #54176  |  Link
Klaus1189
Registered User
 
Join Date: Feb 2015
Location: Bavaria
Posts: 1,666
Quote:
Originally Posted by madjock View Post
I seem to be going backwards some days with the HDR side.

Which nVidia drivers seem to be the most consistent for toggling HDR?

It seems it all depends on which way the wind is blowing. I used to have no issues, but now it toggles with no TV indication and I have to re-select the input on my TV for it to actually display. I have tried various drivers; some work, then stop. I think the hard part is that it's mostly the indications on my TV I am having issues with, as it does appear to switch to HDR: when I re-select the TV input and the HDR signal indicator comes on the screen, nothing changes apart from the feedback I get from the TV on colour space.
Try 18.11.2 and do a complete uninstall of the currently installed driver first.
This is the first time an AMD driver has given me trouble; on the Nvidia side I have already had driver trouble more than five times.

@all: Do you think a dedicated thread for AMD/Intel/Nvidia driver versions and their issues would be a good idea?
Then you wouldn't have to dig up one specific post in this monster thread.
Old 9th January 2019, 14:48   #54177  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,127
If you want to maintain a thread on driver versions, go ahead. You would likely get too few reports to verify whether they hold true for everyone.
Old 9th January 2019, 14:54   #54178  |  Link
madjock
Registered User
 
Join Date: May 2018
Posts: 259
Quote:
Originally Posted by Warner306 View Post
If you wanted to maintain a thread on driver versions, go ahead. You would likely get few reports to verify if the reports are true or false for everyone.
Especially when I was asking about nVidia drivers.
Old 9th January 2019, 16:08   #54179  |  Link
chros
Registered User
 
 
Join Date: Mar 2002
Posts: 2,323
4K HDR pixel shader, gamma settings

I remember that madshi said the 4K HDR pixel shader internally uses 2.2 gamma. Is there any "hard" rule about which gamma (2.2/2.4/BT.1886/etc.) should be used on the actual device? Or is it like with SDR content?

Btw, here's the current state of madVR in this regard: image comparison
__________________
Ryzen 5 2600,Asus Prime b450-Plus,16GB,MSI GTX 1060 Gaming X 6GB(v398.18),Win10 LTSC 1809,MPC-BEx64+LAV+MadVR,Yamaha RX-A870,LG OLED77G2(2160p@23/24/25/29/30/50/59/60Hz) | madvr config
Old 9th January 2019, 16:21   #54180  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
The brightness levels of the non-compressed parts match at gamma 2.2.

So if you want the correct brightness levels, you set "this display is already calibrated" to your screen's response; unlike with SDR, madVR will automatically change your "gamma" for HDR sources to 2.2.

As I said before, I'm not a fan of this inconsistent behaviour between HDR and SDR.
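To make the gamma point concrete: with pure power curves (BT.1886 on a display with a perfect zero black level reduces to a pure 2.4 power law), the same video level lands at a different brightness depending on which gamma the display tracks, which is why the 2.2 assumption matters. A small sketch:

```python
def power_gamma(v, g):
    """Relative output luminance of a normalized video level v (0..1)
    under a pure power-law gamma g."""
    return v ** g

# Mid-grey comes out brighter on a gamma 2.2 display than on a 2.4 one,
# so if the tone mapping assumes 2.2 while the display tracks 2.4,
# the image ends up darker than intended.
mid_22 = power_gamma(0.5, 2.2)  # ~0.218
mid_24 = power_gamma(0.5, 2.4)  # ~0.189
```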