Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion.

Old 25th January 2019, 03:31   #54441  |  Link
tp4tissue
Registered User
 
tp4tissue's Avatar
 
Join Date: May 2013
Posts: 706
Guys, for HDR 10-bit on an Nvidia 1060, does setting 4:2:2 10-bit/12-bit in the Nvidia control panel make a difference? Or does it automatically switch to the correct output setting when NVHDR is engaged?


And also, does setting 10-bit in the madVR control panel cause issues for 8-bit outputs?
__________________
Ghetto | 2500k 5Ghz

Last edited by tp4tissue; 25th January 2019 at 03:47.
tp4tissue is offline   Reply With Quote
Old 25th January 2019, 05:34   #54442  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,406
madVR's dithering is better than the GPU's, and you don't want to double dither if you can avoid it, so it is better to set madVR to the same bitdepth as the GPU (or lower).

For HDR I much prefer sending a display 8-bit RGB over 10-bit 4:2:2. madVR will not change the GPU output at all.
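To illustrate the point about dithering (a toy sketch only, not madVR's actual algorithm, which uses much smarter error diffusion): plain truncation from 10-bit to 8-bit loses any level that falls between two 8-bit steps, while adding sub-step random noise before truncating preserves it on average:

```python
import random

def truncate(x, shift):
    # Plain truncation: every pixel of a flat region lands on the same
    # 8-bit step, so the in-between level is lost (visible as banding).
    return x >> shift

def dither(x, shift):
    # Add random noise smaller than one quantization step first; pixels
    # jitter by one step, but the average level is preserved.
    return (x + random.randrange(1 << shift)) >> shift

# A flat 10-bit level that sits exactly between two 8-bit steps:
pixels = [514] * 100_000          # 514 / 4 = 128.5

plain = [truncate(p, 2) for p in pixels]
dith = [dither(p, 2) for p in pixels]

print(sum(plain) / len(plain))    # 128.0 -- the .5 is gone
print(sum(dith) / len(dith))      # ~128.5 -- preserved on average
```

Dithering twice (madVR's pass, then the GPU's) just stacks noise, which is why matching madVR's bit depth to the GPU's avoids a second pass.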
__________________
madVR options explained
Asmodian is offline   Reply With Quote
Old 25th January 2019, 05:55   #54443  |  Link
tp4tissue
Registered User
 
tp4tissue's Avatar
 
Join Date: May 2013
Posts: 706
Quote:
Originally Posted by Asmodian View Post
madVR's dithering is better than the GPU's, and you don't want to double dither if you can avoid it, so it is better to set madVR to the same bitdepth as the GPU (or lower).

For HDR I much prefer sending a display 8-bit RGB over 10-bit 4:2:2. madVR will not change the GPU output at all.
Does that mean that if I have 4:2:2 12-bit selected in the Nvidia control panel,

madVR will be sending 4:2:2 12-bit?


I'm confused as to how this works, because it's supposed to output limited range, but the desktop looks exactly the same as full range.

And in the madVR control panel, if I set the display to limited range, it's actually too bright.

So how does it all line up?
__________________
Ghetto | 2500k 5Ghz
tp4tissue is offline   Reply With Quote
Old 25th January 2019, 08:00   #54444  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
this is the 3rd time i've told you: madVR will always output RGB to the GPU driver, at either 10 or 8 bit, no matter what. whether the GPU can send it is nothing for madVR to be bothered with.

the GPU driver always expects the input RGB to be full range.
and sending YCbCr is always limited range; only RGB has two ranges.

so yes, the GPU driver is doing the conversion, not madVR.
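For reference, the standard 8-bit full-to-limited range scaling (a minimal sketch of the well-known 16-235 mapping, not the driver's actual code):

```python
def full_to_limited(v):
    # 8-bit full-range RGB (0-255) -> limited "video" range (16-235)
    return 16 + round(v * 219 / 255)

print(full_to_limited(0))    # 16  (black)
print(full_to_limited(255))  # 235 (white)
```

This scaling is why interpreting a signal with the wrong range assumption shows up as washed-out blacks or a picture that is too dark or too bright overall.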
huhn is offline   Reply With Quote
Old 25th January 2019, 08:06   #54445  |  Link
tp4tissue
Registered User
 
tp4tissue's Avatar
 
Join Date: May 2013
Posts: 706
Quote:
Originally Posted by huhn View Post
this is the 3rd time i've told you: madVR will always output RGB to the GPU driver, at either 10 or 8 bit, no matter what. whether the GPU can send it is nothing for madVR to be bothered with.

the GPU driver always expects the input RGB to be full range.
and sending YCbCr is always limited range; only RGB has two ranges.

so yes, the GPU driver is doing the conversion, not madVR.

Huhn, I still do not understand this entire chain clearly.

Is there a diagram of where the conversions happen, end to end?


It's not so much that I don't understand what you're saying; I just can't seem to confirm the difference in the end result visually.


For example, if I set the Nvidia CP to RGB 8-bit, full range, then when NVHDR kicks in, my display only tells me it's HDR10, and madVR's top line says RGB 8-bit.

But how could the RGB be 10-bit if it's set to 8?


How do I know for sure what the GPU is outputting?
__________________
Ghetto | 2500k 5Ghz

Last edited by tp4tissue; 25th January 2019 at 08:29.
tp4tissue is offline   Reply With Quote
Old 25th January 2019, 08:21   #54446  |  Link
madjock
Registered User
 
Join Date: May 2018
Posts: 259
Quote:
Originally Posted by tp4tissue View Post
Huhn, I still do not understand this entire chain clearly.
https://forum.kodi.tv/showthread.php?tid=259188

Lots of info here.
madjock is offline   Reply With Quote
Old 25th January 2019, 09:11   #54447  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,406
Quote:
Originally Posted by tp4tissue View Post
Is there a diagram for where the conversions happen end to end.
There are three places where conversions can happen before the display and they do not interact. The display can be doing conversions internally too, but we cannot control these.

1) LAV Video can do a conversion if the renderer does not support the source format. When using madVR this should never be needed, and it is not ideal.
2) madVR will output RGB at whatever bitdepth is set.
3) The GPU driver will do a conversion to whatever you set in its control panel.

madVR will never change what it outputs based on what the GPU is set to and the GPU will never change its output based on what madVR sends it. All these steps simply convert any input to their configured output format without a lot of smarts involved. For example the GPU is happy to convert madVR's 8 bit output to 10 bit. You know for sure what the GPU is outputting based on what it is set to in its drivers.

Also, I can send my TV 8-bit RGB HDR and it will say HDR10; that is just a brand label for the metadata standard, and it works with 8-bit RGB too.
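The independence of the stages above can be caricatured in a few lines of Python (purely illustrative; these function names are made up):

```python
# Each stage converts whatever it receives into its *own* configured output,
# with no knowledge of the other stages' settings.

def madvr_output(configured_bits):
    # madVR always hands the GPU driver RGB at its configured bit depth
    return ("RGB", configured_bits)

def gpu_output(received, fmt, bits):
    # The driver converts whatever madVR sent to the control-panel setting
    return (fmt, bits)

# madVR set to 10-bit, Nvidia control panel set to RGB 8-bit:
print(gpu_output(madvr_output(10), "RGB", 8))  # ('RGB', 8) reaches the TV
```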
__________________
madVR options explained
Asmodian is offline   Reply With Quote
Old 25th January 2019, 09:15   #54448  |  Link
madjock
Registered User
 
Join Date: May 2018
Posts: 259
Quote:
Originally Posted by Warner306 View Post
They are from a free HDR10 test pattern set. The OSD should confirm. Black clipping is not really HDR because the peak of the one mentioned is less than 1 nit, but that is intended.
Yes, but there are no numbers on this?

I did try the HDR set you mention, but I could not get the clipping ones to work, or rather could not get my TV to adjust once in HDR mode. I'm unsure if this was me, or whether adjusting for black crush is broken on my TV in HDR mode, which I read is common.
madjock is offline   Reply With Quote
Old 25th January 2019, 10:41   #54449  |  Link
HillieSan
Registered User
 
Join Date: Sep 2016
Posts: 176
Quote:
Originally Posted by Alexkral View Post
Thanks huhn, I think this helped a bit but didn't fix it. Anyway, I think I found the problem. I noticed that this only happened with SVP and very high bitrate videos (and with VR), so I downloaded a high-bitrate 4K 60fps video and got frame drops with both CUVID and DXVA2. Then I blocked ffdshow raw and this fixed it for both modes. So the problem is having ffdshow raw in the chain, but obviously it's needed for SVP. Also, I noticed that you can select D3D11 but it doesn't work on Windows 7; I'll have to check what I'm missing.

@nevcairiel

Yes, I discovered this recently, but anyway I'm currently not using madVR's HDR to SDR because SVP drops the needed metadata. I managed to find some alternatives, however, still using both SVP and madVR, but without dynamic tonemapping (though I have some ideas for this too). I wouldn't say that normal tonemapping is much worse than dynamic; not so long ago it was the only thing available and everybody was more than happy with it. Anyway, I'm not very sure how madVR is doing dynamic tonemapping with BT.2390, because only changing the mastering display white level gives me different results, and I don't see other configurable parameters for this.
Stop using SVP and set the right refresh rate for your display and your graphics card. This is my experience and it works well.
HillieSan is offline   Reply With Quote
Old 25th January 2019, 11:24   #54450  |  Link
el Filou
Registered User
 
el Filou's Avatar
 
Join Date: Oct 2016
Posts: 896
Quote:
Originally Posted by chros View Post
For whatever reason, with the Adaptive setting it often dropped the performance state and quickly raised it back (monitored with nvidiainspector), hence it produced a bunch of dropped frames. Why? That's a good question, because it worked before with the same setting.
I can say that's not the experience I have; on my system, Adaptive always sets the clocks high enough that the GPU load never gets too high and no frames are dropped, and it keeps the clocks stable.
I see you are using 385.28, so I presume you have specific reasons for that, but maybe another driver version would help with this?
Also, the High Performance power setting may be an option if you have a dedicated HTPC that doesn't see long idle times.
__________________
HTPC: Windows 10 22H2, MediaPortal 1, LAV Filters/ReClock/madVR. DVB-C TV, Panasonic GT60, Denon 2310, Core 2 Duo E7400 oc'd, GeForce 1050 Ti 536.40
el Filou is offline   Reply With Quote
Old 25th January 2019, 13:37   #54451  |  Link
tp4tissue
Registered User
 
tp4tissue's Avatar
 
Join Date: May 2013
Posts: 706
Quote:
Originally Posted by Asmodian View Post
There are three places where conversions can happen before the display and they do not interact. The display can be doing conversions internally too, but we cannot control these.

1) LAV Video can do a conversion if the renderer does not support the source format. When using madVR this should never be needed, and it is not ideal.
2) madVR will output RGB at whatever bitdepth is set.
3) The GPU driver will do a conversion to whatever you set in its control panel.

madVR will never change what it outputs based on what the GPU is set to and the GPU will never change its output based on what madVR sends it. All these steps simply convert any input to their configured output format without a lot of smarts involved. For example the GPU is happy to convert madVR's 8 bit output to 10 bit. You know for sure what the GPU is outputting based on what it is set to in its drivers.

Also, I can send my TV 8 bit RGB HDR and it will say HDR10, that is just a brand label for the metadata standard, it works with 8 bit RGB too.
So if the Nvidia CP is on 8-bit RGB, the TV is popped into NVHDR, and madVR is set to 10-bit,

what am I seeing? Still 8-bit HDR?
__________________
Ghetto | 2500k 5Ghz
tp4tissue is offline   Reply With Quote
Old 25th January 2019, 13:40   #54452  |  Link
iSeries
Registered User
 
Join Date: Jan 2009
Posts: 625
Quote:
Originally Posted by tp4tissue View Post
So if the NvidiaCP is on 8 bit rgb, and the TV is popped into NVHDR, MADVR set to 10 bit.

What am I seeing, still 8 bit HDR ?
8-bit. Whatever the GPU output format is set to is what your TV is receiving (the first line in madVR's stats will tell you).
iSeries is offline   Reply With Quote
Old 25th January 2019, 13:41   #54453  |  Link
madjock
Registered User
 
Join Date: May 2018
Posts: 259
Quote:
Originally Posted by tp4tissue View Post
So if the NvidiaCP is on 8 bit rgb, and the TV is popped into NVHDR, MADVR set to 10 bit.

What am I seeing, still 8 bit HDR ?
Yes. You can get Nvidia to 12-bit either easily or with difficulty depending on which drivers you use; whether that does anything you can see is another matter.
madjock is offline   Reply With Quote
Old 25th January 2019, 13:52   #54454  |  Link
iSeries
Registered User
 
Join Date: Jan 2009
Posts: 625
Quote:
Originally Posted by madjock View Post
whether that does anything you can see is another matter.
Indeed, and it may actually do more harm than good anyway; on my TV (an LG C8) there is definitely more banding at 12-bit than at 8-bit.
iSeries is offline   Reply With Quote
Old 25th January 2019, 15:54   #54455  |  Link
tp4tissue
Registered User
 
tp4tissue's Avatar
 
Join Date: May 2013
Posts: 706
Quote:
Originally Posted by iSeries View Post
8-bit. Whatever the GPU output format is set to is what your TV is receiving (the first line in madVR's stats will tell you).
Quote:
Originally Posted by madjock View Post
Yes. You can get nVidia to 12Bit either easy or hard depending what drivers you use, whether that does anything you can see is another matter.
Quote:
Originally Posted by iSeries View Post
Indeed, and may actually do more harm than good anyway - definitely for my TV there is more banding at 12bit than 8bit (LG C8).


OK guys, thanks, I understand now. My TV has been lying to me.


Which driver are you all using for 12-bit? (Nvidia)


Does 10-bit work on all drivers? (Nvidia)
__________________
Ghetto | 2500k 5Ghz

Last edited by tp4tissue; 25th January 2019 at 15:57.
tp4tissue is offline   Reply With Quote
Old 25th January 2019, 16:57   #54456  |  Link
iSeries
Registered User
 
Join Date: Jan 2009
Posts: 625
Quote:
Originally Posted by tp4tissue View Post
OK guys, thanks, I understand now. My TV has been lying to me.


Which driver are you all using for 12-bit? (Nvidia)


Does 10-bit work on all drivers? (Nvidia)
385.28 probably works most easily if you are using a custom resolution. With later drivers (not sure when it started), madVR can only create 8-bit custom resolutions, which means needing CRU to create a 12-bit one. I found 416.35 to work great if not using a custom resolution (or if creating an 8-bit custom res with madVR, or a 12-bit custom res with CRU). I believe the latest driver will only kick into HDR if madVR sends 10-bit, regardless of whether the GPU driver is set to 12-bit or 8-bit.

You should look at and carefully compare 12-bit vs 8-bit. Many TVs will show colour banding with 12-bit input which isn't there when given 8-bit.

Last edited by iSeries; 25th January 2019 at 17:40.
iSeries is offline   Reply With Quote
Old 25th January 2019, 17:24   #54457  |  Link
Alexkral
Registered User
 
Join Date: Oct 2018
Posts: 319
Quote:
Originally Posted by HillieSan View Post
Stop using SVP and set the right frequency of your display and your graphics card. This is my experience and it works well.
The fact is that I really like the enhanced temporal resolution. It feels a bit weird at first, but once you get used to it you really appreciate the improvement (though I admit that for some content it will always look too weird). I have even modified the base AviSynth script to get rid of most artifacts.
For HDR, the problem is that you don't know where to start. Obviously it's not very reasonable to ask madshi to support tonemapping for input that doesn't identify itself as HDR, and ffdshow is no longer being developed, so there's no easy solution. I'm now testing DmitriRender and it seems better than SVP in both performance and visual quality. Unfortunately it doesn't support HDR metadata passthrough either, but at least the developer says he's working on it.
Alexkral is offline   Reply With Quote
Old 25th January 2019, 17:41   #54458  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
just frame match and let the TV do it.

doing 5/1 instead of 5/2 interpolation is massively better anyway, even though i would use neither.
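A toy sketch of the cadence difference behind this advice, assuming the usual reading (24 fps on a 60 Hz display needs uneven 3:2 frame repeats, while a frame-matched 120 Hz display holds every frame equally long):

```python
def repeat_pattern(frames, pattern):
    # Show frame i for pattern[i % len(pattern)] refresh cycles
    out = []
    for i, f in enumerate(frames):
        out.extend([f] * pattern[i % len(pattern)])
    return out

frames = [0, 1, 2, 3]
print(repeat_pattern(frames, [3, 2]))  # [0, 0, 0, 1, 1, 2, 2, 2, 3, 3] -- uneven hold times (judder)
print(repeat_pattern(frames, [5]))     # every frame held for 5 cycles -- even
```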
huhn is offline   Reply With Quote
Old 25th January 2019, 17:48   #54459  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,127
Quote:
Originally Posted by Dogway View Post
Just realised that NGU causes quite a lot of coil whine, even when using it only for chroma. I bought a 1070 on Black Friday and didn't notice until I enabled NGU for my monitor the other day. What puzzled me is that I do GPU-intensive work and never had coil whine until this. I did a comparison rendering on the GPU with Redshift, and the GPU-Z report is the same as with NGU. With further testing I found that the trigger was Memory Controller Load: at 25% it makes the buzzing noise, below or above it doesn't. The problem is that NGU very high (above 25%) heats my card like a toaster (running it in a mATX case).
Coil whine is a hardware defect of the card, so NGU can exacerbate it, but not cause it. Using very high can push some GPUs into overdrive.
Warner306 is offline   Reply With Quote
Old 25th January 2019, 17:51   #54460  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,127
Quote:
Originally Posted by madjock View Post
Yes, but there are no numbers on this?

I did try the HDR set you mention, but I could not get the clipping ones to work, or rather could not get my TV to adjust once in HDR mode. I'm unsure if this was me, or whether adjusting for black crush is broken on my TV in HDR mode, which I read is common.
The numbers are at the top and bottom of the pattern. That pattern is really only for those with 8-bit displays. It is not a great pattern, but it is the only 8-bit HDR10 black-clipping pattern I could find. I was able to calibrate black clipping using it.

It is mostly useful for selecting the output SDR gamma curve for HDR -> SDR. With most HDR displays, it is not advisable to change the brightness or contrast controls, because doing so can offset the display's tone mapping, which is based on the default settings for both.
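For context on the nit values in those patterns: HDR10 encodes brightness with the SMPTE ST 2084 (PQ) curve, so code values map to absolute luminance. A minimal sketch of the PQ EOTF using the standard constants (not madVR's code):

```python
# SMPTE ST 2084 (PQ) EOTF constants
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_eotf(signal):
    """Map a normalized PQ signal (0.0-1.0) to luminance in nits."""
    p = signal ** (1 / M2)
    return 10000 * (max(p - C1, 0) / (C2 - C3 * p)) ** (1 / M1)

print(pq_eotf(0.0))   # 0.0 nits (black)
print(pq_eotf(1.0))   # 10000.0 nits (PQ peak)
```

This is why a "less than 1 nit" black-clipping pattern uses only the very bottom of the PQ code range.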
Warner306 is offline   Reply With Quote