Old 24th January 2019, 18:28   #54441  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 5,959
So your card gets slower with a more aggressive performance setting?
Old 24th January 2019, 18:45   #54442  |  Link
chros
Registered User
 
Join Date: Mar 2002
Posts: 1,502
Quote:
Originally Posted by huhn View Post
So your card gets slower with a more aggressive performance setting?
For whatever reason, with the Adaptive setting the card often dropped its performance state and quickly raised it back (monitored with nvidiainspector), hence it produced a bunch of dropped frames. Why? That's a good question, because it worked fine before with the same setting.
__________________
Ryzen 5 2600,Asus Prime b450-Plus,16GB,MSI GTX 1060 Gaming X 6GB(v385.28),Win10 LTSB 1607,MPC-BEx64+LAV+MadVR,Yamaha RX-A870,LG OLED65B8(2160p@23/24/25/29/30/50/59/60Hz)
Old 24th January 2019, 18:46   #54443  |  Link
madjock
Registered User
 
Join Date: May 2018
Posts: 223
Quote:
Originally Posted by Warner306 View Post
Try using these HDR10 black clipping patterns to test black clipping.
These are not HDR? I think they are just 8-10 bit panel checkers?

Old 24th January 2019, 18:57   #54444  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 3,699
Quote:
Originally Posted by Alexkral View Post
At least on my system (Windows 7, GTX 1080) I get frame drops with DXVA and D3D11 when playing 4K and up @ 60 fps, while CUVID runs smoothly.
I had totally forgotten to mention setting Nvidia's power management mode to adaptive (in 3D settings). Optimal power often causes issues.

Quote:
Originally Posted by chros View Post
That's what I thought as well, until 2 weeks ago when I had to set Optimal back in the NCP because I had massive drops with Adaptive. Now it seems to work fine. (Setup is in my signature.)
I would redo that test... because setting optimal power definitely causes the GPU to clock lower more often. Did you have temperature issues or something?
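
If you want to log what the card is actually doing while you retest, a quick script like this works (just a wrapper around nvidia-smi, assuming it is on your PATH); it prints the performance state and clocks once per second:

Code:
# Poll the performance state, clocks and GPU load once a second.
# Requires nvidia-smi on the PATH; stop with Ctrl+C.
import subprocess

subprocess.run([
    "nvidia-smi",
    "--query-gpu=timestamp,pstate,clocks.sm,clocks.mem,utilization.gpu",
    "--format=csv",
    "-l", "1",
])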
__________________
madVR options explained
Old 24th January 2019, 19:33   #54445  |  Link
Soxbrother
Registered User
 
Join Date: Jun 2017
Posts: 9
Quote:
Originally Posted by tp4tissue View Post
The HD 4400 benches 567 on PassMark.

You will have to turn off many of madVR's features in order to scale 1920x1080 to 3840x2160.

madVR will still give you better color accuracy, but for higher quality filters like Lanczos there is simply not enough processing power.

If you scale 1080p to 1080p, that is chroma scaling only, 1:1 luma. The 4400 will do this perfectly fine. You can enable Jinc or Lanczos for chroma.
What about image doubling, would that require less juice than upscaling?

Quote:
Originally Posted by tp4tissue View Post
I believe you also need a DisplayPort to HDMI 2.0 adapter, because the 4400's native HDMI is probably not HDMI 2.0, which is required for 3840x2160 @ 60 Hz.
So you can convert DisplayPort 1.2 to HDMI 2.0 or 2.0a?

If this works, what about my Dolby Atmos audio?
Would it work through such an adapter?

Quote:
Originally Posted by tp4tissue View Post
Also, run CPU decoding in LAV and do not use DXVA, because you want to save as much GPU time as possible for madVR.

Letting the CPU handle the decoding step is fine.
Quote:
Originally Posted by chros View Post
This is not how GPU hardware acceleration works: decoding runs in a completely separate pipeline. Just use GPU hardware acceleration all the time if you can.
Thanks guys.

At the moment I believe the screen is set to 24 Hz.
I can go up to 30 Hz.
Is higher better?
Should I set it to 30 Hz?

Would 60 Hz with the adapter give me better quality and smoother playback?

Thanks in advance.
Old 24th January 2019, 20:53   #54446  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 3,699
Quote:
Originally Posted by Soxbrother View Post
What about image doubling, would that require less juice than upscaling?
No, it needs more.

Quote:
Originally Posted by Soxbrother View Post
So you can convert DisplayPort 1.2 to HDMI 2.0 or 2.0a?
This would need to be an active adapter, because DP and HDMI don't use the same signaling. They are expensive, and I expect that most (all?) don't support Atmos.

Quote:
Originally Posted by Soxbrother View Post
At the moment I believe the screen is set to 24 Hz.
I can go up to 30 Hz.
Is higher better?
Should I set it to 30 Hz?

Would 60 Hz with the adapter give me better quality and smoother playback?
This is somewhat complicated and depends on the source frame rate. If you are using smooth motion, the higher the better (I would say 60 Hz is the minimum for decent smooth motion), but even better than smooth motion is using a refresh rate that matches the source frame rate. That means you need to tune your refresh rate so you never get dropped or repeated frames. This can be tricky, but madVR has a custom resolution tool that helps tune the refresh rate.
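
As a rough back-of-the-envelope illustration (my own arithmetic, not anything madVR reports), you can estimate how often a dropped or repeated frame occurs from the mismatch between the source frame rate and the refresh rate:

Code:
# Rough estimate of the time between dropped/repeated frames when the
# refresh rate is not an exact multiple of the source frame rate.
def seconds_per_glitch(fps, refresh_hz):
    cadence = round(refresh_hz / fps * 2) / 2  # avg refreshes per frame (2.5 = 3:2)
    effective_fps = refresh_hz / cadence       # rate the display consumes frames
    mismatch = abs(effective_fps - fps)
    return float("inf") if mismatch < 1e-9 else 1.0 / mismatch

print(seconds_per_glitch(23.976, 24.000))  # ~41.7 s between glitches
print(seconds_per_glitch(23.976, 60.000))  # ~41.7 s (3:2 cadence)
print(seconds_per_glitch(23.976, 59.940))  # inf: an exact 2.5x match

The custom resolution tool is essentially about pushing that interval from seconds into hours.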
__________________
madVR options explained
Old 24th January 2019, 23:28   #54447  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,126
Quote:
Originally Posted by madjock View Post
These are not HDR? I think they are just 8-10 bit panel checkers?
They are from a free HDR10 test pattern set; the OSD should confirm it. Black clipping is not really HDR, because the peak of the pattern mentioned is less than 1 nit, but that is intended.
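
For reference, you can work out where a given nit level lands in the signal with the SMPTE ST 2084 (PQ) inverse EOTF; a quick sketch (my own arithmetic, not taken from the pattern set):

Code:
# SMPTE ST 2084 (PQ) inverse EOTF: nits -> normalized signal -> 10 bit code.
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_encode(nits):
    y = (nits / 10000) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2  # 0..1 signal level

v = pq_encode(1.0)          # ~0.15 for a 1 nit pattern peak
print(round(64 + v * 876))  # ~194 out of 1023 in limited-range 10 bit

So a 1 nit peak sits around code 194 in a 10 bit limited-range signal, well above video black at code 64.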
Old 25th January 2019, 01:02   #54448  |  Link
Alexkral
Registered User
 
Join Date: Oct 2018
Posts: 131
Quote:
Originally Posted by huhn View Post
So go to the NCP and change the power setting to adaptive or maximum performance and try again.
Thanks huhn, I think this helped a bit but didn't fix it. Anyway, I think I found the problem. I noticed that this only happened with SVP and very high bitrate videos (and with VR), so I downloaded a high bitrate 4K 60 fps video and got frame drops with both CUVID and DXVA2. Then I blocked ffdshow raw and this fixed it for both modes. So the problem is having ffdshow raw in the chain, but obviously it's needed for SVP. I also noticed that you can select D3D11 but it doesn't work on Windows 7; I'll have to check what I'm missing.

@nevcairiel

Yes, I discovered this recently, but anyway I'm currently not using madVR's HDR to SDR because SVP drops the needed metadata. I managed to find some alternatives, however, still using both SVP and madVR, but without dynamic tone mapping (though I have some ideas for that too). I wouldn't say that static tone mapping is much worse than dynamic; not so long ago it was the only thing available and everybody was more than happy with it. Anyway, I'm not sure how madVR does dynamic tone mapping with BT.2390, because only changing the mastering display white level gives me different results, and I don't see other configurable parameters for this.
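
For what it's worth, the static core of BT.2390 is just a hermite roll-off applied in normalized PQ space. A minimal sketch of the EETF knee (my reading of the ITU report, not madVR's actual code):

Code:
# BT.2390 EETF highlight roll-off in PQ space, with the source peak
# normalized to 1.0 and max_lum the target peak on the same scale.
# Static curve only; a dynamic version would vary this per scene.
def bt2390_eetf(x, max_lum):
    ks = 1.5 * max_lum - 0.5  # knee start
    if x < ks:
        return x              # below the knee: unchanged
    t = (x - ks) / (1 - ks)   # hermite spline above the knee
    return ((2 * t**3 - 3 * t**2 + 1) * ks
            + (t**3 - 2 * t**2 + t) * (1 - ks)
            + (-2 * t**3 + 3 * t**2) * max_lum)

That would explain why the mastering display white level is the only knob that changes the result: it sets the normalization, and a dynamic implementation presumably re-derives the peak per scene instead of trusting the metadata.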
Old 25th January 2019, 02:13   #54449  |  Link
Dogway
Registered User
 
Join Date: Nov 2009
Posts: 1,009
Just realised that NGU causes quite a lot of coil whine, even when using it only for chroma. I bought a 1070 on Black Friday and didn't notice until I enabled NGU for my monitor the other day. What puzzled me is that I do GPU-intensive work and never had coil whine until this. I did a comparison rendering on the GPU with Redshift, and the GPU-Z report is the same as with NGU. With further testing I found that the trigger is Memory Controller Load: at 25% it makes the buzzing noise, below or above it doesn't. The problem is that NGU very high (above 25%) heats my card like a toaster (I run it in a mATX case).
Old 25th January 2019, 03:30   #54450  |  Link
tp4tissue
Registered User
 
Join Date: May 2013
Posts: 408
Quote:
Originally Posted by Dogway View Post
Just realised that NGU causes quite a lot of coil whine, even when using it only for chroma. I bought a 1070 on Black Friday and didn't notice until I enabled NGU for my monitor the other day. What puzzled me is that I do GPU-intensive work and never had coil whine until this. I did a comparison rendering on the GPU with Redshift, and the GPU-Z report is the same as with NGU. With further testing I found that the trigger is Memory Controller Load: at 25% it makes the buzzing noise, below or above it doesn't. The problem is that NGU very high (above 25%) heats my card like a toaster (I run it in a mATX case).
That's how you know it's Lookin' Guud
__________________
Ghetto | 2500k 5Ghz
Old 25th January 2019, 03:31   #54451  |  Link
tp4tissue
Registered User
 
Join Date: May 2013
Posts: 408
Guys, for HDR 10 bit on an Nvidia 1060, does setting 4:2:2 10 bit / 12 bit in the NVIDIA control panel make a difference? Or does it automatically switch to the correct output setting when NVHDR is engaged?


Also, does setting 10 bit in the madVR control panel cause issues for 8 bit outputs?
__________________
Ghetto | 2500k 5Ghz

Old 25th January 2019, 05:34   #54452  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 3,699
madVR's dithering is better than the GPU's, and you don't want to dither twice if you can avoid it, so it is better to set madVR to the same bit depth as the GPU (or lower).

For HDR I much prefer sending the display 8 bit RGB over 10 bit 4:2:2. madVR will not change the GPU output at all.
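
A toy example of why dithering twice hurts (plain random dither here, nothing like madVR's actual algorithm): quantizing once adds one dose of noise, and re-quantizing an already-dithered signal adds a second:

Code:
# Compare dithering 16 bit straight to 8 bit vs. going 16 -> 10 -> 8.
import random

def dither_down(v, in_bits, out_bits):
    shift = in_bits - out_bits
    q = (v + random.randrange(1 << shift)) >> shift
    return min(q, (1 << out_bits) - 1)

def to16(v, bits):  # rescale back to 16 bit for comparison
    return v * 65535 // ((1 << bits) - 1)

src, n = 40000, 50000
once = sum(abs(to16(dither_down(src, 16, 8), 8) - src) for _ in range(n)) / n
twice = sum(abs(to16(dither_down(to16(dither_down(src, 16, 10), 10), 16, 8), 8) - src)
            for _ in range(n)) / n
print(once, twice)  # the double-dithered error comes out clearly larger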
__________________
madVR options explained
Old 25th January 2019, 05:55   #54453  |  Link
tp4tissue
Registered User
 
Join Date: May 2013
Posts: 408
Quote:
Originally Posted by Asmodian View Post
madVR's dithering is better than the GPU's, and you don't want to dither twice if you can avoid it, so it is better to set madVR to the same bit depth as the GPU (or lower).

For HDR I much prefer sending the display 8 bit RGB over 10 bit 4:2:2. madVR will not change the GPU output at all.
Does that mean that if I have 4:2:2 12 bit selected in the Nvidia control panel, madVR will be sending 4:2:2 12 bit?

I'm confused as to how this works, because it's supposed to output limited range, but the desktop looks exactly the same as full range.

And in the madVR control panel, if I set the display to limited range, it's actually too bright.

So how does it all line up?
__________________
Ghetto | 2500k 5Ghz
Old 25th January 2019, 08:00   #54454  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 5,959
This is the third time I've told you: madVR always outputs RGB to the GPU driver, at either 10 or 8 bit, no matter what. Whether the GPU can send that format is nothing for madVR to be bothered with.

The GPU driver always expects its RGB input to be full range, and YCbCr is always limited range; only RGB has two ranges.

So yes, the GPU driver is doing the conversion, not madVR.
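
The range squeeze itself is trivial; something like this is all the driver does internally (illustration only, not driver code):

Code:
# Full-range 8 bit RGB (0-255) squeezed to limited range (16-235),
# as the GPU driver does internally for a limited/YCbCr output.
def full_to_limited(v):
    return 16 + round(v * 219 / 255)

print(full_to_limited(0), full_to_limited(255))  # 16 235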
Old 25th January 2019, 08:06   #54455  |  Link
tp4tissue
Registered User
 
Join Date: May 2013
Posts: 408
Quote:
Originally Posted by huhn View Post
This is the third time I've told you: madVR always outputs RGB to the GPU driver, at either 10 or 8 bit, no matter what. Whether the GPU can send that format is nothing for madVR to be bothered with.

The GPU driver always expects its RGB input to be full range, and YCbCr is always limited range; only RGB has two ranges.

So yes, the GPU driver is doing the conversion, not madVR.

Huhn, I still do not understand this entire chain clearly.

Is there a diagram of where the conversions happen end to end?

It's not so much that I don't understand what you're saying, but I can't seem to confirm the difference in the end result visually.

For example, if I set the Nvidia CP to RGB 8 bit, full range, then when NVHDR kicks in my display only tells me it's HDR10, and madVR's top line says RGB 8 bit.

But how could the RGB be 10 bit if it's set to 8?

How do I know for sure what the GPU is outputting?
__________________
Ghetto | 2500k 5Ghz

Old 25th January 2019, 08:21   #54456  |  Link
madjock
Registered User
 
Join Date: May 2018
Posts: 223
Quote:
Originally Posted by tp4tissue View Post
Huhn, I still do not understand this entire chain clearly.
https://forum.kodi.tv/showthread.php?tid=259188

Lots of info here.
Old 25th January 2019, 09:11   #54457  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 3,699
Quote:
Originally Posted by tp4tissue View Post
Is there a diagram of where the conversions happen end to end?
There are three places where conversions can happen before the display, and they do not interact. The display can be doing conversions internally too, but we cannot control those.

1) LAV Video can do a conversion if the renderer does not support the source format. When using madVR this should never be needed, and it is not ideal.
2) madVR will output RGB at whatever bitdepth is set.
3) The GPU driver will do a conversion to whatever you set in its control panel.

madVR will never change what it outputs based on what the GPU is set to, and the GPU will never change its output based on what madVR sends it. All of these steps simply convert any input to their configured output format, without a lot of smarts involved. For example, the GPU is happy to convert madVR's 8 bit output to 10 bit. You know for sure what the GPU is outputting based on what it is set to in its drivers.

Also, I can send my TV 8 bit RGB HDR and it will say HDR10; that is just a brand label for the metadata standard, and it works with 8 bit RGB too.
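
As a toy model of that chain (these are not real LAV/madVR/driver APIs, just an illustration of each stage converting blindly to its own setting):

Code:
# Toy model: each stage only looks at its own configured output.
def lav_decode(source):               # 1) ideally passes the native format through
    return {"fmt": "YCbCr", "bits": source["bits"]}

def madvr_render(frame, madvr_bits):  # 2) always RGB, dithered to the set depth
    return {"fmt": "RGB", "bits": madvr_bits}

def gpu_send(frame, panel_setting):   # 3) converts to the control-panel setting
    return panel_setting              # regardless of what madVR handed over

frame = lav_decode({"bits": 10})      # e.g. a 10 bit HEVC source
frame = madvr_render(frame, 8)        # madVR set to 8 bit output
print(gpu_send(frame, "RGB 10 bit"))  # the GPU happily pads 8 bit up to 10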
__________________
madVR options explained
Old 25th January 2019, 09:15   #54458  |  Link
madjock
Registered User
 
Join Date: May 2018
Posts: 223
Quote:
Originally Posted by Warner306 View Post
They are from a free HDR10 test pattern set; the OSD should confirm it. Black clipping is not really HDR, because the peak of the pattern mentioned is less than 1 nit, but that is intended.
Yes, but there are no numbers on this?

I did try the HDR set you mention, but I could not get the clipping ones to work, or rather could not get my TV to adjust once in HDR mode. I'm unsure whether that was me, or whether adjusting for black crush is broken on my TV in HDR mode, which I have read is common.
Old 25th January 2019, 10:41   #54459  |  Link
HillieSan
Registered User
 
Join Date: Sep 2016
Posts: 125
Quote:
Originally Posted by Alexkral View Post
Thanks huhn, I think this helped a bit but didn't fix it. Anyway, I think I found the problem. I noticed that this only happened with SVP and very high bitrate videos (and with VR), so I downloaded a high bitrate 4K 60 fps video and got frame drops with both CUVID and DXVA2. Then I blocked ffdshow raw and this fixed it for both modes. So the problem is having ffdshow raw in the chain, but obviously it's needed for SVP. I also noticed that you can select D3D11 but it doesn't work on Windows 7; I'll have to check what I'm missing.

@nevcairiel

Yes, I discovered this recently, but anyway I'm currently not using madVR's HDR to SDR because SVP drops the needed metadata. I managed to find some alternatives, however, still using both SVP and madVR, but without dynamic tone mapping (though I have some ideas for that too). I wouldn't say that static tone mapping is much worse than dynamic; not so long ago it was the only thing available and everybody was more than happy with it. Anyway, I'm not sure how madVR does dynamic tone mapping with BT.2390, because only changing the mastering display white level gives me different results, and I don't see other configurable parameters for this.
Stop using SVP, and set the right refresh rate for your display and your graphics card. That is my experience, and it works well.
Old 25th January 2019, 11:24   #54460  |  Link
el Filou
Registered User
 
Join Date: Oct 2016
Posts: 546
Quote:
Originally Posted by chros View Post
For whatever reason, with the Adaptive setting the card often dropped its performance state and quickly raised it back (monitored with nvidiainspector), hence it produced a bunch of dropped frames. Why? That's a good question, because it worked fine before with the same setting.
I can say that's not my experience; on my system, Adaptive always sets the clocks high enough that GPU load never maxes out and frames never drop, and it keeps the clocks stable.
I see you are using 385.28, so I presume you have specific reasons for that, but maybe another driver version would help with this?
Also, it may be an option to use the High Performance power setting if you have a dedicated HTPC that doesn't see long idle times.
__________________
HTPC: Windows 10 1809, MediaPortal 1, LAV Filters, ReClock, madVR. DVB-C TV, Panasonic GT60, 6.0 speakers Denon 2310, Core 2 Duo E7400, GeForce 1050 Ti