11th October 2017, 16:58 | #46461 | Link |
Registered Developer
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,348
|
Yes. From LAV's side it's unfortunately not really possible to know if it's a full hardware decoder or a hybrid decoder; the driver exposes them the same way.
__________________
LAV Filters - open source ffmpeg based media splitter and decoders |
11th October 2017, 17:00 | #46462 | Link | |
Registered User
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,407
|
Quote:
Your GPU adds two 0 bits to the data from madVR to convert 10 to 12-bit. This is lossless.
__________________
madVR options explained Last edited by Asmodian; 11th October 2017 at 17:03. |
|
11th October 2017, 18:18 | #46467 | Link |
Registered User
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,407
|
CPU a bit higher than GPU; a 2x difference is not important at all.
__________________
madVR options explained Last edited by Asmodian; 11th October 2017 at 18:27. |
11th October 2017, 18:26 | #46468 | Link |
Registered User
Join Date: May 2007
Posts: 454
|
Is there an official statement from nvidia confirming this 10-bit / 12-bit handling? I don't think I have ever seen it confirmed anywhere.
It is really annoying that when a TV supports both 10-bit and 12-bit, the nvidia control panel will not give 10-bit as an option. Given madVR is now adding custom resolutions, does anyone know if there is some way madVR can hack/fool the nvidia drivers to force 10-bit output over HDMI? Maybe this could be done by somehow hiding the fact that the TV supports 12-bit? I guess a developer may have a better chance of getting info from nvidia than the average Joe. Last edited by Razoola; 11th October 2017 at 18:34. |
11th October 2017, 18:48 | #46469 | Link | |
Registered Developer
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,348
|
Quote:
With bit depth it's always quite simple: as long as your data fits into it, you never need to worry, and that goes for both audio and video. 10-bit fits into 12-bit, so it's all fine.
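The zero-padding described above can be sketched in a few lines of Python (the function names are just for illustration):

```python
# Convert a 10-bit code value to 12-bit by appending two zero bits
# (a left shift by 2). The original value is recovered exactly by
# shifting back, so the conversion is lossless.

def ten_to_twelve(v10: int) -> int:
    assert 0 <= v10 < 1 << 10
    return v10 << 2  # append two 0 bits

def twelve_to_ten(v12: int) -> int:
    return v12 >> 2  # drop the two padding bits

# Round-trip every possible 10-bit value to show nothing is lost.
assert all(twelve_to_ten(ten_to_twelve(v)) == v for v in range(1024))
```

Since every 10-bit value round-trips exactly, there is nothing the driver could get "wrong" here as long as it really does pad with zeros.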
__________________
LAV Filters - open source ffmpeg based media splitter and decoders |
|
11th October 2017, 19:29 | #46470 | Link |
Registered User
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,407
|
We really do not need this "confirmed"; there is only one way to convert 10-bit to 12-bit, and it is very straightforward.
__________________
madVR options explained |
11th October 2017, 20:13 | #46471 | Link |
Registered User
Join Date: May 2007
Posts: 454
|
You would think so, but you never know, given the way nvidia works and some of the bugs in their drivers they never bother to fix. It would be really nice to be able to force a 10-bit mode and then compare side by side against 12-bit to be sure.
|
11th October 2017, 21:07 | #46472 | Link |
Registered User
Join Date: Oct 2016
Posts: 896
|
It seems GeForce cards support 10-bit on DisplayPort. Buy yourself a 10-bit monitor and test.
__________________
HTPC: Windows 10 22H2, MediaPortal 1, LAV Filters/ReClock/madVR. DVB-C TV, Panasonic GT60, Denon 2310, Core 2 Duo E7400 oc'd, GeForce 1050 Ti 536.40 |
11th October 2017, 21:55 | #46473 | Link |
Registered User
Join Date: May 2013
Posts: 714
|
Hi guys.
Just borrowed a GPU that can play back 4K remuxes. Now, the color conversion: is there a way to set this to a reference setting? What's happening is, I'm playing with the nits setting in HDR to SDR, along with luminance compression on vs. off, and I'm really not sure how to eyeball this. I can only compare it to the 1080p remuxes I have of the same movie, but the difference in mastering makes the task kind of confounded. How are you guys setting the HDR to SDR, just eyeballing it? So far, with luminance compression OFF and nits set at 270, it seems as close to the 1080p remux as I can get using these settings alone.
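For context on what those nits numbers refer to: HDR10 stores pixel values using the PQ transfer function (SMPTE ST 2084), which maps code values directly to absolute luminance in nits. A minimal decode, using the constants from the spec (this is only the transfer function, not madVR's tone-mapping logic):

```python
# SMPTE ST 2084 (PQ) EOTF: maps a normalized HDR code value in [0, 1]
# to absolute luminance in nits (cd/m^2). Constants are from the spec.
m1 = 2610 / 16384
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_to_nits(n: float) -> float:
    """Decode a PQ-encoded value n in [0, 1] to absolute nits."""
    p = n ** (1 / m2)
    num = max(p - c1, 0.0)
    den = c2 - c3 * p
    return (num / den) ** (1 / m1) * 10000.0

# Code value 1.0 corresponds to the 10,000-nit PQ ceiling;
# a mid-scale code value of 0.5 lands around 92 nits.
```

The "nits" slider in an HDR-to-SDR converter is essentially choosing where on this curve to place SDR peak white.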
__________________
Ghetto | 2500k 5Ghz Last edited by tp4tissue; 11th October 2017 at 22:06. |
11th October 2017, 22:45 | #46474 | Link |
Registered User
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,407
|
You do not want to match the SDR masters; they are simply too different.
I used to use the madVR defaults with the nits set by eye. Deciding how to calibrate for HDR is still tricky, and mastering doesn't seem to match the standards we have.
__________________
madVR options explained |
12th October 2017, 00:19 | #46475 | Link |
Kid for Today
Join Date: Aug 2004
Posts: 3,477
|
OMG, santa-madshi is back at it, and he's quite a bit early too.
I've been running tests with the new settings on 1/4 DVD and 1/4 noisy 1080p footage @1080p, and so far:
- random noise doesn't help at all
- compression artifacts does miracles, but 1 isn't quite enough and 2 is too soft, so I'd appreciate more granularity here please
- compression artifacts chroma doesn't improve picture clarity when using quad NGU high + both chroma & luma SR + SSIM 2D 100% LL AB25% on my 4:2:2 Sammy TV
- because those new settings are upfront in the "processing" tree, we apparently can't make profiles for them; hopefully I can leave "compression artifacts" at 1 and call it a day? |
12th October 2017, 00:36 | #46476 | Link | |
Registered User
Join Date: May 2013
Posts: 714
|
Quote:
I think compressing the highlights might be the wrong approach. That's like saying that if you had a high-pitched noise outside of what your speaker can reproduce, you'd bring that data down and play it back as the highest frequency the speaker CAN produce. Everything should just be truncated: if it's too dark or too bright, just cut it off at 0 or 255.
__________________
Ghetto | 2500k 5Ghz |
|
12th October 2017, 01:17 | #46477 | Link |
Registered User
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,407
|
Highlights in HDR are not "too bright"; they are the changes in brightness inside very bright objects. The ability to keep detail in deep shadows and very bright lights is one of the benefits of HDR, and I prefer to keep as much of that detail as possible.
Flat white for everything bright looks like bad camera work where the highlights have been blown out. Just watch the SDR master if you don't want the extra detail in deep shadows or highlights; it will look better. The frequency analogy doesn't work: would you want all sounds above your speaker's maximum frequency played at that maximum? Clamping a video signal does not have the same visual impact as discarding high frequencies. Volume would be a better analogy, and dynamic range compression sounds better with sigmoidal compression, not simply clamping everything above or below a certain volume level.
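The difference between clamping and sigmoidal compression can be illustrated with a toy curve (this is only a sketch of the general idea, not madVR's actual tone-mapping math; the knee value is an arbitrary choice):

```python
import math

def hard_clip(x: float) -> float:
    """Clamp everything above 1.0: all highlight detail above the
    limit collapses to the same flat white."""
    return min(x, 1.0)

def soft_compress(x: float, knee: float = 0.8) -> float:
    """Toy soft-knee compressor: pass values below the knee through
    unchanged, then roll highlights off smoothly toward 1.0 so the
    relative differences between bright levels survive."""
    if x <= knee:
        return x
    # Map the overshoot through a saturating curve instead of cutting it.
    overshoot = x - knee
    return knee + (1.0 - knee) * (1.0 - math.exp(-overshoot / (1.0 - knee)))

# Two highlight levels that hard clipping makes indistinguishable:
a, b = 1.2, 1.5
# hard_clip(a) == hard_clip(b) == 1.0, while soft_compress keeps
# them apart (both land below 1.0, with b still brighter than a).
```

Hard clipping throws away the ordering between bright details; the soft knee keeps it, which is why the highlights don't turn into a flat white patch.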
__________________
madVR options explained |
12th October 2017, 01:41 | #46478 | Link | |
Registered User
Join Date: May 2013
Posts: 714
|
Quote:
You're right.. I don't know what I'm talking about.. Without compression, it just blows out
__________________
Ghetto | 2500k 5Ghz |
|
Tags |
direct compute, dithering, error diffusion, madvr, ngu, nnedi3, quality, renderer, scaling, uhd upscaling, upsampling |
|
|