Old 29th October 2015, 17:34   #33961  |  Link
Sorrigotti
Registered User
 
Join Date: Apr 2015
Posts: 10
Quote:
Originally Posted by Warner306 View Post
The post on the previous page describes this issue. madVR has a dummy profile because of the option you have selected in DSPlayer.

You have three options when using Kodi DSPlayer (only two are important):

Video -> DSPlayer -> Manage settings with Kodi

Never: The Kodi madVR GUI is disabled and all madVR settings are loaded exclusively from the madVR control panel.

Load and Save with DSPlayer database: Enabling this option allows basic configuration of madVR from within Kodi. These settings are accessed during video playback by selecting Video Settings. Video settings are set on a per-video basis but can be saved as a global profile for all videos. DSPlayer will create its own DSPlayer Profile Group in the madVR control panel when this setting is enabled. This is a dummy profile intended to separate DSPlayer settings from existing madVR profiles, and it is not meant for external configuration. Internal settings tables are saved by Kodi; no settings are saved in the madVR control panel.

Load from madVR active profile: Upon playback, the appropriate profile is loaded from madVR for the selected video. Changes made from within Kodi will change the same value in the madVR control panel. No dummy profile is created. These changes only affect the active madVR profile, and profile rules cannot be created or saved. This is the best choice for those who have set up profiles in madVR. Any changes made to the active profile are saved externally by madVR. External settings are always loaded in place of internal Kodi tables.
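For reference, the profiles that option loads are selected by rules you write yourself in the madVR control panel's profile group editor. A rule looks something like this (quoted from memory of the rule editor, so double-check the exact keyword names there):

Code:
if (srcWidth <= 1280) and (srcHeight <= 720) "720p"
else "1080p"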
Quote:
Originally Posted by Razoola View Post
@sorrigotti, listen to what Warner306 and Asmodian have just told you and change the Kodi setting to either 'Never' or 'Load from madVR active profile'; I suggest the second option for you.

That is 100% the cause of your issue, given you said the suggestions worked for MPC-HC.

Warner306, Razoola and Asmodian, thanks for the help!

Problem solved.

I discovered yesterday morning, in the XBMC Brazil forum, that the madVR configs can be changed via the Kodi interface.

I came here to explain how, and then saw Warner306's post detailing the same procedure.

Once again, thank you all for the attention and help!
Old 29th October 2015, 18:59   #33962  |  Link
XMonarchY
Guest
 
Posts: n/a
Could someone please remind me which DXVA setting I should use in LAV Video? Copy-Back? I used to use CUVID until I found out it can cause problems and errors...
Old 29th October 2015, 19:16   #33963  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
copyback or none.
Old 29th October 2015, 22:17   #33964  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,127
Can anyone comment on the usefulness of 10-bit output with current content? I have a 10-bit display but no 10-bit sources.

I am finding this topic confusing and would like some clarification. I am told the remapping of an 8-bit source to 10-bits will lead to an inaccurate color gamut and no gain in image quality.
Old 29th October 2015, 22:54   #33965  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,406
Quote:
Originally Posted by Warner306 View Post
Can anyone comment on the usefulness of 10-bit output with current content? I have a 10-bit display but no 10-bit sources.

I am finding this topic confusing and would like some clarification. I am told the remapping of an 8-bit source to 10-bits will lead to an inaccurate color gamut and no gain in image quality.
On a normal display (contrast < 5000:1) using a normal gamut (BT.709, BT.601, etc.), 10-bit offers basically no visual improvement over dithered 8-bit. Even displays with higher contrast or wider gamuts will not benefit significantly. The real benefit of going to 10-bit is lower dither noise.

8- or 10-bit output has nothing to do with the accuracy of the color gamut. The gamut conversion is calculated in 16-bit and dithered to 8 or 10 bits at the final output step. Slightly more accuracy is actually lost going to 8-bit, but the difference is extremely minor and, thanks to the dithering, not even measurable.

Even if I did have a 10-bit display I would still use 8-bit output, because I like using overlay more than fullscreen exclusive (FSE) mode.
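To make the "calculated in 16-bit, dithered at the output" point concrete, here is a toy Python sketch (my own illustration, not madVR's actual code). A flat gray whose true level falls between two output steps is pushed onto the wrong step by plain rounding, while TPDF dither preserves the average level; going from 8-bit to 10-bit mainly just lowers the dither noise:

Code:
import numpy as np

# toy illustration only, not madVR code
def quantize(x, bits, dither):
    # quantize floats in 0..1 to the given bit depth, optionally adding
    # +/-1 LSB triangular (TPDF) noise before rounding
    levels = 2**bits - 1
    if dither:
        rng = np.random.default_rng(0)
        x = x + (rng.random(x.shape) - rng.random(x.shape)) / levels
    return np.clip(np.round(x * levels), 0, levels) / levels

patch = np.full(100000, 0.50037)  # a gray sitting between two 8-bit steps
for bits in (8, 10):
    for dither in (False, True):
        out = quantize(patch, bits, dither)
        print("%2d-bit %-8s avg %.5f  noise (std) %.5f"
              % (bits, "dithered" if dither else "rounded",
                 out.mean(), out.std()))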
__________________
madVR options explained
Old 29th October 2015, 23:47   #33966  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,127
Quote:
Originally Posted by Asmodian View Post
On a normal display (contrast < 5000:1) using a normal gamut (BT.709, BT.601, etc.), 10-bit offers basically no visual improvement over dithered 8-bit. Even displays with higher contrast or wider gamuts will not benefit significantly. The real benefit of going to 10-bit is lower dither noise.

8- or 10-bit output has nothing to do with the accuracy of the color gamut. The gamut conversion is calculated in 16-bit and dithered to 8 or 10 bits at the final output step. Slightly more accuracy is actually lost going to 8-bit, but the difference is extremely minor and, thanks to the dithering, not even measurable.

Even if I did have a 10-bit display I would still use 8-bit output, because I like using overlay more than fullscreen exclusive (FSE) mode.
I see. Can you comment on this article:

http://www.soundandvision.com/conten...7bcb6kc7d2f.97

It claims TVs are currently incapable of displaying Rec. 2020. I thought 10-bit color and Rec. 2020 were one and the same, but it could be that 10-bit output is only one part of the specification, as stated by Wikipedia.
Old 29th October 2015, 23:53   #33967  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,342
10-bit output and BT.2020 are not strictly related, although for BT.2020 to work properly, you need at least 10-bit. But you can have BT.709 and use 10-bit as well.
__________________
LAV Filters - open source ffmpeg based media splitter and decoders
Old 30th October 2015, 07:23   #33968  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,406
Quote:
Originally Posted by Warner306 View Post
It claims TVs are currently incapable of displaying Rec. 2020. I thought 10-bit color and Rec. 2020 were one and the same, but it could be that 10-bit output is only one part of the specification, as stated by Wikipedia.
Rec. 2020 requires 10-bit for decent quality (no banding), but 10-bit doesn't mean Rec. 2020 at all. Rec. 2020's gamut is defined by its primary colors, not the bit depth: 100% green in Rec. 2020 is a much more saturated green than 100% green in BT.709. This means that Rec. 2020's green primary stimulates the red and blue receptors in the eye less than BT.709's does.

Quote:
Originally Posted by nevcairiel View Post
10-bit output and BT.2020 are not strictly related, although for BT.2020 to work properly, you need at least 10-bit. But you can have BT.709 and use 10-bit as well.
They aren't related at all, are they? 10-bit simply uses 10 bits to represent 0 (code value 0) to 1 (code value 1023). With 10-bit, 1 (1023) represents full saturation of a primary color in the source or target gamut, exactly as 1 (255) does with 8-bit.

You could output to Rec. 2020's gamut using 8-bit (or even 1-bit); the only reason 10-bit is required by the spec is that banding would be very bad with only 256 steps, given the huge dynamic range. Isn't even 10-bit insufficient to avoid banding with Rec. 2020's increased luma range, though, with the specs suggesting/requiring 12-bit for that?
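The same point as a trivial sketch (full-range code values for simplicity, ignoring limited-range video levels): the bit depth only sets how many steps sit between 0 and 1, while what "1.0 green" physically means comes from the tagged primaries:

Code:
# toy sketch: full-range code values, limited-range video levels ignored
def code_value(v, bits):
    # map a normalized component (0..1) to an integer code value
    return round(v * (2**bits - 1))

# "fully saturated green" is 1.0 in BT.709 and in Rec. 2020 alike; only
# the tagged primaries say how saturated that green actually is
print(code_value(1.0, 8), code_value(1.0, 10))  # 255 1023
print(code_value(0.5, 8), code_value(0.5, 10))  # 128 512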
__________________
madVR options explained
Old 30th October 2015, 08:45   #33969  |  Link
Razoola
Registered User
 
Join Date: May 2007
Posts: 454
As I understand it, using 10-bit with madVR as things currently stand means you do not have to use madVR's anti-banding algorithms. Although my panel supports 10-bit, my receiver does not, so to be honest I have not investigated it myself. Using 10-bit has nothing to do with the new colorspace spec (BT.2020) currently in madVR.
Old 30th October 2015, 12:00   #33970  |  Link
6233638
Registered User
 
Join Date: Apr 2009
Posts: 1,019
Quote:
Originally Posted by Razoola View Post
As I understand it, using 10-bit with madVR as things currently stand means you do not have to use madVR's anti-banding algorithms. Although my panel supports 10-bit, my receiver does not, so to be honest I have not investigated it myself. Using 10-bit has nothing to do with the new colorspace spec (BT.2020) currently in madVR.
No - content should still benefit from debanding, since it is encoded with 256 steps (8-bit) and debanding can increase that to 1024 steps (10-bit) or more.
Using a 10-bit output just means that the dither is less visible.
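As a toy illustration of the idea (my own sketch, nothing like madVR's actual debanding algorithm): interpolating across the flat bands at high precision recreates the in-between levels, which then actually fit in a 10-bit output:

Code:
import numpy as np

# toy sketch, not madVR's debanding algorithm
banded = np.repeat(np.arange(16), 64) / 15.0  # 16 flat bands, should be a ramp
smooth = np.convolve(banded, np.ones(65) / 65, mode="same")  # low-pass in float
print(len(np.unique(np.round(banded * 1023))),   # 16 distinct 10-bit levels
      len(np.unique(np.round(smooth * 1023))))   # hundreds after debanding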
Old 30th October 2015, 12:34   #33971  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
SDR BT.2020 should be fine with 8-bit too.

BT.709 kind of works with 7-bit, and even 6-bit, if dithered properly.

So luma doesn't lose anything compared to BT.709, and the PQ gamma should even help and make it better. The chroma channels should survive with 8-bit too; only green is a lot bigger.

HDR BT.2020 is of course a whole different story.
Old 30th October 2015, 12:40   #33972  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,342
Quote:
Originally Posted by huhn View Post
SDR BT.2020 should be fine with 8-bit too.
Not really, no. Full precision would require even 12 bits, and 10-bit dithered is probably going to be just fine. But 8-bit will easily band or require strong dithering noise.

Unfortunately, full testing will require much better screens than we have today. High-end screens can barely reach the P3 color space, and are still far from full BT.2020 coverage.
Not to mention that the BT.2020 specification calls for 10 or 12 bits.

HDR cannot really be compared directly, since it uses entirely different gamma curves, but HDR will definitely not work properly at lower bit depths, that's for sure.
__________________
LAV Filters - open source ffmpeg based media splitter and decoders

Last edited by nevcairiel; 30th October 2015 at 12:47.
Old 30th October 2015, 13:15   #33973  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
The 12-bit option in BT.2020 comes from HDR.

With 100 cd/m² in mind, the required number of steps doesn't really increase for BT.2020 with 8-bit compared to BT.709 with 7-bit.

Only green got a lot bigger, by about 100%, so one more bit is needed to keep the same number of steps between each color. Blue and red are only ~5-15% further away from the white point.

So getting the same quality as 7-bit BT.709 shouldn't be a problem with 8-bit.
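A quick back-of-the-envelope check in Python (the distance ratios are the rough figures from above, not measured values): if a primary sits k times further from the white point, keeping the same step spacing costs about log2(k) extra bits:

Code:
import math

# k = rough ratio of primary-to-white-point distance vs BT.709 (ballpark)
for name, k in (("green", 2.00), ("red", 1.15), ("blue", 1.05)):
    print("%-5s ratio %.2f -> %.2f extra bits" % (name, k, math.log2(k)))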
Old 30th October 2015, 13:40   #33974  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,342
There are people much smarter than you and me who clearly mandate 10-bit for BT.2020. You are free to use it in 8-bit if you want, but I certainly won't.
It is correct that BT.2020 strictly needs one more bit than BT.709, but arguably 8-bit was not quite enough for BT.709 either. Visual tests by experts have confirmed that 11-bit is needed to cover the full visual spectrum of BT.2020.

But it's all theoretical at this point still; consumer displays with such a wide gamut are still far out. Hopefully next year's TVs will at least get closer.
__________________
LAV Filters - open source ffmpeg based media splitter and decoders

Last edited by nevcairiel; 30th October 2015 at 13:45.
Old 30th October 2015, 19:07   #33975  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,406
Quote:
Originally Posted by nevcairiel View Post
There are people much smarter than you and me who clearly mandate 10-bit for BT.2020. You are free to use it in 8-bit if you want, but I certainly won't.
It is correct that BT.2020 strictly needs one more bit than BT.709, but arguably 8-bit was not quite enough for BT.709 either. Visual tests by experts have confirmed that 11-bit is needed to cover the full visual spectrum of BT.2020.

But its all theoretical at this point still, consumer displays with such a high gamut are still far out. Hopefully next years TVs will at least get closer.
That is all assuming no dithering though, isn't it? Given how good dithering to 4- or 5-bit looks with BT.709, I can imagine madVR doing 4K Rec. 2020 quite well with only 8-bit. Not optimal, but still happily watchable.
__________________
madVR options explained
Old 30th October 2015, 19:54   #33976  |  Link
Arm3nian
Registered User
 
Join Date: Jul 2014
Location: Las Vegas
Posts: 177
Is 4K Blu-ray supposed to use Rec. 2020? I think so.
But what's the point if there is no display with that gamut? Plus, how would you make such a high-gamut display without making everything else that is standardized look like crap? You would need dynamic LUTs or something.

Last edited by Arm3nian; 30th October 2015 at 20:11.
Old 30th October 2015, 19:58   #33977  |  Link
e-t172
Registered User
 
Join Date: Jan 2008
Posts: 589
Quote:
Originally Posted by Asmodian View Post
I can imagine madVR doing 4K Rec. 2020 quite well with only 8-bit.
I don't think it works that way. The dithering needs to be done in the source itself, which is typically not the case (I'm guessing for compression efficiency reasons). If you play an undithered 8-bit file and add dithering in the rendering chain, it will make each 8-bit step more accurate, but it won't magically recover the missing steps from the source.
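A quick Python sketch of that point (a toy ramp, not real video): quantizing to 8-bit without dither bakes the banding into the "file", and dithering afterwards in the render chain does not bring the signal any closer to the original:

Code:
import numpy as np

# toy ramp standing in for undithered 8-bit source content
src = np.linspace(0.0, 1.0, 4096)    # the ideal high-precision gradient
banded8 = np.round(src * 255) / 255  # undithered 8-bit "file": banding baked in

rng = np.random.default_rng(0)
noise = (rng.random(src.size) - rng.random(src.size)) / 1023  # TPDF at 10-bit
out10 = np.round((banded8 + noise) * 1023) / 1023  # render-time dither to 10-bit

print("max error of the 8-bit file:   %.5f" % np.abs(banded8 - src).max())
print("max error after 10-bit dither: %.5f" % np.abs(out10 - src).max())
# the error does not shrink: steps lost at encode time stay lost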

Last edited by e-t172; 30th October 2015 at 21:46.
Old 30th October 2015, 20:24   #33978  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,342
Quote:
Originally Posted by Arm3nian View Post
Is 4K Blu-ray supposed to use Rec. 2020? I think so.
But what's the point if there is no display with that gamut? Plus, how would you make such a high-gamut display without making everything else that is standardized look like crap? You would need dynamic LUTs or something.
Future Blu-rays may not use the entire gamut either, since movies right now are mostly mastered in DCI-P3 anyway, and in 2-3 years there will be displays which can support that.
And since the new gamut is larger, you can still show any old content within it, and it'll look like it always did - assuming the renderer knows what it's doing, and hopefully madVR does.
__________________
LAV Filters - open source ffmpeg based media splitter and decoders

Last edited by nevcairiel; 30th October 2015 at 20:48.
Old 30th October 2015, 23:08   #33979  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,127
I'm now clear on bit depths. Thanks for the discussion.
Old 31st October 2015, 04:48   #33980  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,406
Quote:
Originally Posted by e-t172 View Post
I don't think it works that way. The dithering needs to be done in the source itself, which is typically not the case (I'm guessing for compression efficiency reasons). If you play an undithered 8-bit file and add dithering in the rendering chain, it will make each 8-bit step more accurate, but it won't magically recover the missing steps from the source.
You are correct. I should have said 8-bit output; that is what I meant (like <8-bit BT.709 today). The source absolutely must be 10-bit or higher to store useful Rec. 2020 video; dithering in the source would not survive HEVC encoding well and is inefficient. The specs for the source format and the display format do not need to be the same, at least when using a video renderer with sophisticated dithering like madVR.
__________________
madVR options explained

Last edited by Asmodian; 31st October 2015 at 04:51.