Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion.

11th October 2017, 16:58   #46461
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,340
Quote:
Originally Posted by bitterman
I'm not really familiar with the term "Hybrid decoding", but I assume you mean that with Kepler cards it's best to just untick HEVC in LAV video?
Yes. From LAV's side it's unfortunately not really possible to know whether it's a full hardware decoder or a hybrid decoder; the driver exposes them the same way.
__________________
LAV Filters - open source ffmpeg based media splitter and decoders
11th October 2017, 17:00   #46462
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,406
Quote:
Originally Posted by videobruce
I did a search for "Light Alloy" in this thread and only found one hit. They (LA) claim it works with this software, but Light Alloy Player is not listed on madVR's home page. There are at least two issues with it, but they may be LA's software's fault. Anyone else here use LA?
You did not find the madVR home page; that page is not madshi's. Sorry, I'm not familiar with LA.

Quote:
Originally Posted by Razoola
Can someone please explain what happens when Windows (Win7) only lets you set 12-bit HDMI output in the Nvidia driver control panel, while madVR only has settings of 10-bit or greater and D3D11 10-bit?
Your GPU appends two zero bits to the data from madVR to convert 10-bit to 12-bit. This is lossless.
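A minimal sketch of why this padding is lossless (illustrative Python, not madVR or driver code): appending two zero bits is just a left shift by 2, and dropping them again recovers the original value exactly.

```python
def pad_10_to_12(sample: int) -> int:
    """Widen a 10-bit sample to 12 bits by appending two zero bits."""
    return sample << 2

def drop_to_10(sample: int) -> int:
    """Drop the two low bits to recover the original 10-bit value."""
    return sample >> 2

# Every one of the 1024 possible 10-bit code values survives the
# round trip unchanged, so the 10-bit -> 12-bit step loses nothing.
assert all(drop_to_10(pad_10_to_12(v)) == v for v in range(1024))
```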
__________________
madVR options explained

Last edited by Asmodian; 11th October 2017 at 17:03.
11th October 2017, 17:05   #46463
ashlar42
Registered User
 
Join Date: Jun 2007
Posts: 652
Quote:
Originally Posted by Asmodian
Your GPU appends two zero bits to the data from madVR to convert 10-bit to 12-bit. This is lossless.
Nice info. This has been a doubt of mine for quite some time. Is this officially sourced information?
11th October 2017, 17:15   #46464
aufkrawall
Registered User
 
Join Date: Dec 2011
Posts: 1,812
No one knows for sure until it is measured, but it would be absolutely stupid if it weren't lossless, since 10-bit is a subset of 12-bit: no 10-bit value needs to be changed in order to be represented in 12-bit.
11th October 2017, 17:55   #46465
x7007
Registered User
 
Join Date: Apr 2013
Posts: 315
Can anyone say whether the default CPU queue is 16 and the GPU queue 8? Is that the best way to set them, always 2x each other? And what happens if we want a 128 CPU queue when the GPU queue can only be 24?
11th October 2017, 18:05   #46466
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
Leave them at their defaults unless you have real problems.
11th October 2017, 18:18   #46467
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,406
Quote:
Originally Posted by x7007
Can anyone say whether the default CPU queue is 16 and the GPU queue 8? Is that the best way to set them, always 2x each other? And what happens if we want a 128 CPU queue when the GPU queue can only be 24?
Set the CPU queue a bit higher than the GPU queue; keeping a 2x ratio is not important at all.

Last edited by Asmodian; 11th October 2017 at 18:27.
11th October 2017, 18:26   #46468
Razoola
Registered User
 
Join Date: May 2007
Posts: 454
Is there an official statement from Nvidia confirming this 10-bit / 12-bit handling? I don't think I have ever seen it confirmed anywhere.

It is really annoying that when a TV supports both 10-bit and 12-bit, the Nvidia control panel will not offer 10-bit as an option.

Given that madVR is now adding custom resolutions, does anyone know of some way madVR could fool the Nvidia drivers into forcing 10-bit output over HDMI? Maybe by somehow hiding the fact that the TV supports 12-bit?

I guess a developer may have a better chance of getting info from Nvidia than the average joe.

Last edited by Razoola; 11th October 2017 at 18:34.
11th October 2017, 18:48   #46469
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,340
Quote:
Originally Posted by Razoola
Is there an official statement from Nvidia confirming this 10-bit / 12-bit handling? I don't think I have ever seen it confirmed anywhere.

It is really annoying that when a TV supports both 10-bit and 12-bit, the Nvidia control panel will not offer 10-bit as an option.

Given that madVR is now adding custom resolutions, does anyone know of some way madVR could fool the Nvidia drivers into forcing 10-bit output over HDMI? Maybe by somehow hiding the fact that the TV supports 12-bit?

I guess a developer may have a better chance of getting info from Nvidia than the average joe.
It's not something worth bothering about, though.

With bit depth it's always quite simple: as long as your data fits into it, you never need to worry, and that goes for both audio and video. 10-bit fits into 12-bit, so it's all fine.
11th October 2017, 19:29   #46470
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,406
Quote:
Originally Posted by Razoola
Is there an official statement from Nvidia confirming this 10-bit / 12-bit handling? I don't think I have ever seen it confirmed anywhere.
We really do not need this "confirmed"; there is only one way to convert 10-bit to 12-bit, and it is very straightforward.
11th October 2017, 20:13   #46471
Razoola
Registered User
 
Join Date: May 2007
Posts: 454
Quote:
Originally Posted by Asmodian
We really do not need this "confirmed"; there is only one way to convert 10-bit to 12-bit, and it is very straightforward.
You would think so, but you never know, given the way Nvidia works and some of the bugs in their drivers that they never bother to fix. It would also be really nice to be able to force a 10-bit mode and then compare it side by side against 12-bit to be sure.
11th October 2017, 21:07   #46472
el Filou
Registered User
 
 
Join Date: Oct 2016
Posts: 896
It seems GeForce cards support 10-bit on DisplayPort. Buy yourself a 10-bit monitor and test.
__________________
HTPC: Windows 10 22H2, MediaPortal 1, LAV Filters/ReClock/madVR. DVB-C TV, Panasonic GT60, Denon 2310, Core 2 Duo E7400 oc'd, GeForce 1050 Ti 536.40
11th October 2017, 21:55   #46473
tp4tissue
Registered User
 
 
Join Date: May 2013
Posts: 706
Hi guys.

I just borrowed a GPU that can play back 4K remuxes.

Now, the color conversion...

Is there a way to set this to a reference setting?

What's happening is, I'm playing with the nits setting in HDR-to-SDR conversion, along with luminance compression on vs. off, and I'm really not sure how to eyeball this.

I can only compare against the 1080p remuxes I have of the same movies, but the difference in mastering makes the task somewhat confounded.

How are you guys setting the HDR-to-SDR conversion, just eyeballing it?

So far, luminance compression OFF with nits set at 270 seems as close to the 1080p remux as I can get using these settings alone.
__________________
Ghetto | 2500k 5Ghz

Last edited by tp4tissue; 11th October 2017 at 22:06.
11th October 2017, 22:45   #46474
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,406
You do not want to match the SDR masters; they are simply too different.

I used to use the madVR defaults with the nits set by eye. Deciding how to calibrate for HDR is still tricky, and mastering doesn't seem to match what standards we have.
12th October 2017, 00:19   #46475
leeperry
Kid for Today
 
Join Date: Aug 2004
Posts: 3,477
OMG, Santa-madshi is back at it, and he's quite a bit early too.

I've been running tests with the new settings on 1/4 DVD and 1/4 noisy 1080p footage @1080p, and so far:
- random noise doesn't help at all
- compression artifacts works miracles, but 1 isn't quite enough and 2 is too soft, so I'd appreciate more granularity here please
- compression artifacts chroma doesn't improve picture clarity when using quad NGU high + both chroma & luma SR + SSIM 2D 100% LL AB25% on my 4:2:2 Sammy TV
- because those new settings sit upfront in the "processing" tree, we apparently can't make profiles for them; hopefully I can leave "compression artifacts" at 1 and call it a day?

12th October 2017, 00:36   #46476
tp4tissue
Registered User
 
 
Join Date: May 2013
Posts: 706
Quote:
Originally Posted by Asmodian
You do not want to match the SDR masters; they are simply too different.

I used to use the madVR defaults with the nits set by eye. Deciding how to calibrate for HDR is still tricky, and mastering doesn't seem to match what standards we have.

I think compressing the highlights might be the wrong approach.

It's like saying that if you had a high-pitched sound outside what your speaker can reproduce, you'd bring that data down and play it back at the highest frequency the speaker CAN produce.

Everything should just be truncated: if it's too dark or too bright, just cut it off at 0 or 255.
12th October 2017, 01:17   #46477
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,406
Highlights in HDR are not "too bright"; they are the changes in brightness inside very bright objects. The ability to keep detail in deep shadows and very bright lights is one of the benefits of HDR, and I prefer to keep as much of that detail as possible.

Flat white for everything bright looks like bad camera work where the highlights have been blown out. Just watch the SDR master if you don't want the extra detail in deep shadows or highlights; it will look better.

The frequency analogy doesn't work: would you want all higher-frequency sounds played back at your speaker's maximum frequency? Clamping a video signal does not have the same visual impact as discarding high frequencies. Volume would be a better analogy, and dynamic range compression sounds better with sigmoidal compression than with simply clamping everything above or below a certain level.
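The difference between truncating and a sigmoidal (soft-knee) roll-off can be sketched like this; the function names, the 100-nit peak, and the knee point are illustrative assumptions, not madVR's actual curve:

```python
import math

def hard_clip(nits: float, peak: float = 100.0) -> float:
    """Truncate: everything above the display peak becomes flat white."""
    return min(nits, peak)

def soft_compress(nits: float, peak: float = 100.0, knee: float = 0.75) -> float:
    """Roll off highlights smoothly above a knee point instead of clipping.

    Values below the knee pass through unchanged; above it they are
    compressed asymptotically toward the peak, so gradation inside
    bright areas is preserved rather than flattened.
    """
    k = knee * peak          # luminance where compression starts
    if nits <= k:
        return nits
    excess = nits - k        # how far the input exceeds the knee
    headroom = peak - k      # output range left above the knee
    return k + headroom * (1.0 - math.exp(-excess / headroom))
```

Hard clipping maps 150 nits and 500 nits to the same flat white, while the soft curve keeps them distinct below the peak, which is why detail inside bright highlights survives.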
12th October 2017, 01:41   #46478
tp4tissue
Registered User
 
 
Join Date: May 2013
Posts: 706
Quote:
Originally Posted by Asmodian
Highlights in HDR are not "too bright"; they are the changes in brightness inside very bright objects. The ability to keep detail in deep shadows and very bright lights is one of the benefits of HDR, and I prefer to keep as much of that detail as possible.

Flat white for everything bright looks like bad camera work where the highlights have been blown out. Just watch the SDR master if you don't want the extra detail in deep shadows or highlights; it will look better.

The frequency analogy doesn't work: would you want all higher-frequency sounds played back at your speaker's maximum frequency? Clamping a video signal does not have the same visual impact as discarding high frequencies. Volume would be a better analogy, and dynamic range compression sounds better with sigmoidal compression than with simply clamping everything above or below a certain level.


You're right, I don't know what I'm talking about.

Without compression, it just blows out.
12th October 2017, 01:54   #46479
ryrynz
Registered User
 
 
Join Date: Mar 2009
Posts: 3,646
Quote:
Originally Posted by leeperry
hopefully I can leave "compression artifacts" at 1 and call it a day?
Yeah, we're all pretty keen on that; once the levels get tweaked I think we'll be golden. Hopefully it'll end up being one of those boxes you tick first.
12th October 2017, 04:34   #46480
leeperry
Kid for Today
 
Join Date: Aug 2004
Posts: 3,477
Quote:
Originally Posted by ryrynz
Yeah, we're all pretty keen on that, once the levels get tweaked I think we'll be golden
I beg to differ: after more thorough testing, even 1 appears to smooth out heavy cinema grain. That said, more granularity (0.x / 1.x steps) and the ability to adjust it in profiles would be full of win.