Old 21st March 2018, 13:49   #49661  |  Link
NoTechi
Registered User
 
Join Date: Mar 2018
Location: Germany
Posts: 93
Hi all,

Since I am using a JVC 7900, I am interested in madVR's dynamic HDR.
My question is how much GPU power a system for madVR would need if I just go for dynamic HDR.

I am struggling to decide whether I should wait for the upcoming Hades Canyon NUC or build an HTPC from scratch.
I know that with an HTPC built from scratch I could go for all the optimizations in madVR, but the system would be louder and more expensive.
The new NUC is reported to have performance similar to an Nvidia 1060, and if it could handle dynamic HDR my requirement would be met.

NoTechi
Old 21st March 2018, 15:19   #49662  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,127
Quote:
Originally Posted by NoTechi View Post
Hi all,

Since I am using a JVC 7900, I am interested in madVR's dynamic HDR.
My question is how much GPU power a system for madVR would need if I just go for dynamic HDR.

I am struggling to decide whether I should wait for the upcoming Hades Canyon NUC or build an HTPC from scratch.
I know that with an HTPC built from scratch I could go for all the optimizations in madVR, but the system would be louder and more expensive.
The new NUC is reported to have performance similar to an Nvidia 1060, and if it could handle dynamic HDR my requirement would be met.

NoTechi
Intel doesn't support dynamic HDR switching (yet) in madVR. You need an Nvidia or AMD GPU. Otherwise, you will have to manually toggle the HDR switch when watching HDR files.
Old 21st March 2018, 16:42   #49663  |  Link
NoTechi
Registered User
 
Join Date: Mar 2018
Location: Germany
Posts: 93
Quote:
Originally Posted by Warner306 View Post
Intel doesn't support dynamic HDR switching (yet) in madVR. You need an Nvidia or AMD GPU. Otherwise, you will have to manually toggle the HDR switch when watching HDR files.
Warner,

the upcoming NUC Hades Canyon has an AMD GPU (Radeon RX Vega M GH) paired with an Intel CPU.
The 7900 projector switches to its HDR preset as long as the HDR flag is set.
However, my question was about the GPU power required for dynamic HDR while watching a movie. My understanding was that madVR analyzes the currently playing video and adjusts e.g. gamma to get the best HDR settings for the current movie scene.

NoTechi
Old 21st March 2018, 16:50   #49664  |  Link
brazen1
Registered User
 
Join Date: Oct 2017
Posts: 331
Hi Manni. Thank you. TBH, this whole 4K60p exercise was just a test bed, and I've learned plenty from you guys as always. I've yet to run across a real-world example of 4K60p other than those test files, as my rips are 4K23p, but in the event they manifest, this knowledge will help me deal with them, and if it's OK with you I'll hit you up for profile codes when/if applicable. Actually, I think there was that Billy Lynn title, but I don't own it. I assume these profiles for dithering are based on resolution and would be created in the Display connected to AVR tab?

In the meantime, I assume leaving madVR set to dither at 10-bit or higher for 4K23p, including (540p through 1080p 8-bit at resolution from 23Hz to 60Hz) etc., when using RGB 4:4:4 remains the correct config, or should I be using additional profiles since 1080p etc. are 8-bit? In short, leave madVR at 10-bit or higher for everything except 4K above 30Hz? Furthermore, I'm thinking the GPU is not dithering ahead of madVR, and madVR will dither these down in these examples? Still foggy in this area. Btw, did the new driver present any problems? Considering what I'm learning recently, there's no reason I shouldn't be using one of the newer drivers, if not the latest.
__________________
HOW TO-Kodi 2D-3D-UHD (4k) HDR Guide Internal & External Players
W11 Pro 23H2 GTX960-4GB RGB 4:4:4 @Matched Refresh Rates 8,10,12bit
KODI 21 MPC-HC/BE 82" Q90R Denon S720W
Old 21st March 2018, 17:16   #49665  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 942
Quote:
Originally Posted by brazen1 View Post
In short, leave madVR at 10-bit or higher for everything except 4K above 30Hz?
That.

I haven't spent enough time with the drivers to note any new issues. I was only able to confirm that they were working fine here after checking the usual possible issues.
__________________
Win11 Pro x64 b23H2
Ryzen 5950X@4.5Ghz 32Gb@3600 Zotac 3090 24Gb 551.33
madVR/LAV/jRiver/MyMovies/CMC
Denon X8500HA>HD Fury VRRoom>TCL 55C805K
Old 21st March 2018, 18:24   #49666  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,127
Quote:
Originally Posted by brazen1 View Post
Hi Manni. Thank you. TBH, this whole 4K60p exercise was just a test bed, and I've learned plenty from you guys as always. I've yet to run across a real-world example of 4K60p other than those test files, as my rips are 4K23p, but in the event they manifest, this knowledge will help me deal with them, and if it's OK with you I'll hit you up for profile codes when/if applicable. Actually, I think there was that Billy Lynn title, but I don't own it. I assume these profiles for dithering are based on resolution and would be created in the Display connected to AVR tab?

In the meantime, I assume leaving madVR set to dither at 10-bit or higher for 4K23p, including (540p through 1080p 8-bit at resolution from 23Hz to 60Hz) etc., when using RGB 4:4:4 remains the correct config, or should I be using additional profiles since 1080p etc. are 8-bit? In short, leave madVR at 10-bit or higher for everything except 4K above 30Hz? Furthermore, I'm thinking the GPU is not dithering ahead of madVR, and madVR will dither these down in these examples? Still foggy in this area. Btw, did the new driver present any problems? Considering what I'm learning recently, there's no reason I shouldn't be using one of the newer drivers, if not the latest.
Remember, madVR processing starts at a bit depth higher than 10 bits to avoid color-conversion errors. It simply dithers the result down to the output bit depth you set. So this choice is irrelevant to the colors themselves: you are not changing the color space, just the number of steps between each color. Outputting an 8-bit source at 10 bits means less dithering is added, creating less noise in the image. The gradient gets smoother as the bit depth is increased. So think of the image in terms of a gradient with fixed top and bottom values. The choice of bit depth affects what is in between the top and bottom values. More steps = a smoother, less noisy image.
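To see the point in miniature (a toy sketch, not madVR's actual algorithm; madVR's dithering is far more sophisticated): quantizing a smooth ramp to a given bit depth fixes the number of steps, and dithering trades the visible banding for fine noise whose amplitude shrinks as the output bit depth rises.

Code:
import numpy as np

# A smooth ramp at high precision, standing in for madVR's internal image
ramp = np.linspace(0.0, 1.0, 3840)

def quantize(x, bits, dither):
    # Quantize [0,1] values to 2**bits levels, optionally adding
    # +/- half a step of random noise first (simple, unshaped dithering)
    levels = 2 ** bits - 1
    if dither:
        x = x + (np.random.rand(x.size) - 0.5) / levels
    return np.round(np.clip(x, 0.0, 1.0) * levels) / levels

for bits in (8, 10):
    hard = quantize(ramp, bits, dither=False)
    soft = quantize(ramp, bits, dither=True)
    print(bits, "bits:", len(np.unique(hard)), "steps without dither,",
          "mean dither noise %.1e" % np.abs(soft - ramp).mean())

The 10-bit output has four times as many steps and roughly a quarter of the dither noise, which is exactly the "smoother, less noisy" trade described above.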
Old 21st March 2018, 18:32   #49667  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,127
Quote:
Originally Posted by NoTechi View Post
Warner,

the upcoming NUC Hades Canyon has an AMD GPU (Radeon RX Vega M GH) paired with an Intel CPU.
The 7900 projector switches to its HDR preset as long as the HDR flag is set.
However, my question was about the GPU power required for dynamic HDR while watching a movie. My understanding was that madVR analyzes the currently playing video and adjusts e.g. gamma to get the best HDR settings for the current movie scene.

NoTechi
HDR -> SDR conversion requires GPU power. HDR passthrough does not (or maybe it takes a little; I don't know, but not as much). A setting of passthrough lets the display decide how the content is mapped rather than madVR. So you can't use madVR to improve HDR presentation on an HDR-compatible display. That is up to the format used and the quality of the display.

So any GPU with at least 4GB of VRAM and HEVC decoding will do. GPUs with greater power will be more capable of using madVR processing features such as artifact removal and image upscaling. madVR is very good at the image upscaling of 1080p Blu-rays to 4K. So consider this feature when buying a GPU for madVR. A GTX 1060, at minimum, is required to push madVR to higher settings. But a 1050 Ti will allow for basic madVR settings and no limitations on features. It all depends on how much money you want to spend. GPU prices are terrible right now. So there is no hurry to upgrade to 4K.
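To give a feel for where the GPU power goes in HDR -> SDR conversion (a rough sketch; the PQ constants are from SMPTE ST 2084, but the simple Reinhard-style curve is only a stand-in for madVR's much more elaborate tone mapping): every pixel of every frame has to be decoded from the PQ curve and compressed to the display's peak brightness in high-precision math.

Code:
import numpy as np

# SMPTE ST 2084 (PQ) EOTF constants
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_to_nits(e):
    # Decode a PQ-encoded signal in [0,1] to absolute luminance in nits
    ep = np.power(e, 1 / M2)
    return 10000 * np.power(np.maximum(ep - C1, 0) / (C2 - C3 * ep), 1 / M1)

def tone_map(nits, peak=120.0):
    # Toy Reinhard-style compression to a projector-like peak brightness
    x = nits / peak
    return peak * x / (1 + x)

frame = np.random.rand(2160, 3840).astype(np.float32)  # one 4K frame of PQ values
sdr = tone_map(pq_to_nits(frame))  # ~8 million pixels, 24-60 times per second

Passthrough skips all of this per-pixel shader work (the display does it instead), which is why it needs so much less GPU.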

Last edited by Warner306; 21st March 2018 at 18:36.
Old 21st March 2018, 19:07   #49668  |  Link
brazen1
Registered User
 
Join Date: Oct 2017
Posts: 331
Thanks Warner for the further details...
I'm sorry guys. I just can't get my head wrapped around all of this. Here's what I'm struggling to understand:

Installed the new driver. Set RGB 4:4:4 at my native 2160p 8-bit 60Hz. Then I switched to 2160p 12-bit at 23Hz and 24Hz, then set it back to 2160p 8-bit 60Hz. Next I played a 2160p 23Hz HDR 10-bit title, no FSE. Looking in the NCP during playback, it shows 8-bit at 23Hz, as if it ignored my earlier setting of 12-bit for 23Hz. My display does not show detailed info, so I check the info from my Denon AVR. It shows RGB 4:4:4 8-bit. To me this doesn't look correct, which is why I'm asking you guys. So, during playback I select 12-bit in the NCP. I go back to the AVR info and it now shows RGB 4:4:4 12-bit. I know the title is 10-bit, so the AVR info means nothing, I guess? True? Neither does the bit depth setting in the NCP? True? And madVR does not report anything beyond what the GPU is sending it? True? So how do I know if my display is outputting 8-bit or taking advantage of the higher 10-bit depth of an HDR title? Sorry I am so naïve!

To make understanding more difficult, after a reboot that 12-bit setting no longer appears in the NCP or on my AVR, even though I manually changed it during playback before rebooting. It's back to 8-bit as if I never set it.
__________________
HOW TO-Kodi 2D-3D-UHD (4k) HDR Guide Internal & External Players
W11 Pro 23H2 GTX960-4GB RGB 4:4:4 @Matched Refresh Rates 8,10,12bit
KODI 21 MPC-HC/BE 82" Q90R Denon S720W

Last edited by brazen1; 21st March 2018 at 19:20.
Old 21st March 2018, 19:09   #49669  |  Link
NoTechi
Registered User
 
Join Date: Mar 2018
Location: Germany
Posts: 93
Quote:
Originally Posted by Warner306 View Post
HDR -> SDR conversion requires GPU power. HDR passthrough does not (or maybe it takes a little; I don't know, but not as much). A setting of passthrough lets the display decide how the content is mapped rather than madVR. So you can't use madVR to improve HDR presentation on an HDR-compatible display. That is up to the format used and the quality of the display.

So any GPU with at least 4GB of VRAM and HEVC decoding will do. GPUs with greater power will be more capable of using madVR processing features such as artifact removal and image upscaling. madVR is very good at the image upscaling of 1080p Blu-rays to 4K. So consider this feature when buying a GPU for madVR. A GTX 1060, at minimum, is required to push madVR to higher settings. But a 1050 Ti will allow for basic madVR settings and no limitations on features. It all depends on how much money you want to spend. GPU prices are terrible right now. So there is no hurry to upgrade to 4K.
Warner, many thanks, I'm starting to understand now!

So it looks like those who are using a projector and playing HDR content convert it to SDR BUT let madVR improve the picture, including dynamic adjustments depending on the movie scene. Most likely they remove the HDR flag so the projector stays in a non-HDR but BT.2020 setting instead of auto-switching to HDR. Especially on projectors with limited lumen output compared to TVs, this "fake HDR/pimped SDR" might bring better HDR-like results than HDR passthrough or not using madVR at all.
If I get you right, this conversion to SDR plus the madVR improvements will need lots of power, and this new NUC might come to its limits there.

Thanks for the clarification, Warner, and yes, GPU prices are insane atm :/

NoTechi
Old 21st March 2018, 19:17   #49670  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,127
Quote:
Originally Posted by NoTechi View Post
Warner, many thanks, I'm starting to understand now!

So it looks like those who are using a projector and playing HDR content convert it to SDR BUT let madVR improve the picture, including dynamic adjustments depending on the movie scene. Most likely they remove the HDR flag so the projector stays in a non-HDR but BT.2020 setting instead of auto-switching to HDR. Especially on projectors with limited lumen output compared to TVs, this "fake HDR/pimped SDR" might bring better HDR-like results than HDR passthrough or not using madVR at all.
If I get you right, this conversion to SDR plus the madVR improvements will need lots of power, and this new NUC might come to its limits there.

Thanks for the clarification, Warner, and yes, GPU prices are insane atm :/

NoTechi
I think passthrough is the higher-quality method. Your display knows itself best, so it should be calibrated to maximize HDR content. Every display is designed to map HDR using its own methods; there is no universal algorithm.

Last edited by Warner306; 21st March 2018 at 19:35.
Old 21st March 2018, 19:31   #49671  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,127
Quote:
Originally Posted by brazen1 View Post
Thanks Warner for the further details...
I'm sorry guys. I just can't get my head wrapped around all of this. Here's what I'm struggling to understand:

Installed the new driver. Set RGB 4:4:4 at my native 2160p 8-bit 60Hz. Then I switched to 2160p 12-bit at 23Hz and 24Hz, then set it back to 2160p 8-bit 60Hz. Next I played a 2160p 23Hz HDR 10-bit title, no FSE. Looking in the NCP during playback, it shows 8-bit at 23Hz, as if it ignored my earlier setting of 12-bit for 23Hz. My display does not show detailed info, so I check the info from my Denon AVR. It shows RGB 4:4:4 8-bit. To me this doesn't look correct, which is why I'm asking you guys. So, during playback I select 12-bit in the NCP. I go back to the AVR info and it now shows RGB 4:4:4 12-bit. I know the title is 10-bit, so the AVR info means nothing, I guess? True? Neither does the bit depth setting in the NCP? True? And madVR does not report anything beyond what the GPU is sending it? True? So how do I know if my display is outputting 8-bit or taking advantage of the higher 10-bit depth of an HDR title? Sorry I am so naïve!

To make understanding more difficult, after a reboot that 12-bit setting no longer appears in the NCP or on my AVR, even though I manually changed it during playback before rebooting. It's back to 8-bit as if I never set it.
I'm not technical enough to answer all of your questions, but I can start. The first scenario, where your AVR reports 8-bit, sounds like a driver error if you selected 12-bit in the NCP. This would be confirmed by the fact that you were able to correct it during playback by changing the bit depth in the NCP. Did this change stick?

Second, you are not taking full advantage of the 10 bits of the source, but it could be output at 8 bits with dithering without most users noticing much of a difference. The color space is not clipped; it is all about smoothing gradients, and high-quality dithering makes various bit depths look smooth. But, of course, you want 10-bit output if your display can support it. Just remember, madVR processes everything at a very high bit depth (16 bits), higher than the highest output bit depth (10 bits). Errors will not occur when going to any bit depth below madVR's processing depth.

As far as the GPU output is concerned, I don't know what Nvidia sends to the display. I thought it passed through 10-bit, but it might actually be upconverted to 12 bits. That is beyond my technical acumen.
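A small sketch of why the 16-bit intermediate plus dithering matters (my illustration, not actual madVR internals): two adjacent dark source levels collapse into the same 8-bit value if the math is quantized directly, but remain distinguishable as different spatial averages once dithered.

Code:
import numpy as np

rng = np.random.default_rng(0)
gamma = 2.2  # any processing step works; a gamma tweak is just an easy example

for level in (10, 11):  # two adjacent dark 8-bit input levels
    exact = np.full(10000, level / 255.0) ** gamma   # high-precision result
    naive = np.floor(exact * 255).astype(np.uint8)   # both levels land on 0: banding
    dith = np.floor(exact * 255 + rng.random(exact.size)).astype(np.uint8)
    print(level, "-> naive:", naive[0], " dithered average:", round(dith.mean(), 3))

Both levels truncate to the same 8-bit value, but their dithered averages differ (about 0.20 vs 0.25), so the distinction survives to the eye.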
Old 21st March 2018, 19:36   #49672  |  Link
NoTechi
Registered User
 
Join Date: Mar 2018
Location: Germany
Posts: 93
Quote:
Originally Posted by Warner306 View Post
I think passthrough is the higher-quality method. Your display knows itself best, so it should be calibrated to maximize HDR content.
My projector is calibrated and HDR looks great ... but you know there is always room for improvement, and playing with new techie gadgets is fun as well
There are some discussions going on atm on projector boards about which method is best for the HDR effect, and using madVR is one of them.

NoTechi
Old 21st March 2018, 19:37   #49673  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,127
And the madVR OSD shows what madVR receives from the source and what conversions madVR performs. madVR sits between the source and the GPU, so it has no idea what the GPU does to the image after the hand-off.
Old 21st March 2018, 19:38   #49674  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,127
Quote:
Originally Posted by NoTechi View Post
My projector is calibrated and HDR looks great ... but you know there is always room for improvement, and playing with new techie gadgets is fun as well
There are some discussions going on atm on projector boards about which method is best for the HDR effect, and using madVR is one of them.

NoTechi
As I said in my edit to the first post, every display is designed to map HDR using its own methods, taking into account the limitations of its output. There is no universal algorithm.
Old 21st March 2018, 19:49   #49675  |  Link
brazen1
Registered User
 
Join Date: Oct 2017
Posts: 331
Yes, the change stuck until I rebooted. Yes, my display supports 10-bit. Using my old driver, these settings would apply when I opened a title, first time, every time, and they would stick after a reboot. As I understand it now, none of this sticking matters? Sending 8-bit from the GPU to madVR, evidently to prevent the GPU from dithering instead of madVR, is correct, if I understood the replies correctly. I don't understand how madVR can dither the 8-bit it is receiving up to 10-bit when playing 2160p 23Hz HDR 10-bit? Neither can my AVR or the NCP? What am I not understanding here?
__________________
HOW TO-Kodi 2D-3D-UHD (4k) HDR Guide Internal & External Players
W11 Pro 23H2 GTX960-4GB RGB 4:4:4 @Matched Refresh Rates 8,10,12bit
KODI 21 MPC-HC/BE 82" Q90R Denon S720W

Last edited by brazen1; 21st March 2018 at 20:16.
Old 21st March 2018, 20:04   #49676  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,890
https://forum.doom9.org/showthread.p...18#post1271418
Old 21st March 2018, 20:09   #49677  |  Link
brazen1
Registered User
 
Join Date: Oct 2017
Posts: 331
Are we in agreement that the newer drivers do not retain a 12-bit setting after a reboot and revert to 8-bit? If not, has anyone established why it sticks for some and not for others? If so, are you all playing 10-bit sources at 8-bit even though your hardware is all 10-bit compatible?

huhn, is there something specific I should concentrate on there? If you're simply pointing me to page one, well.......
__________________
HOW TO-Kodi 2D-3D-UHD (4k) HDR Guide Internal & External Players
W11 Pro 23H2 GTX960-4GB RGB 4:4:4 @Matched Refresh Rates 8,10,12bit
KODI 21 MPC-HC/BE 82" Q90R Denon S720W

Last edited by brazen1; 21st March 2018 at 20:15.
Old 21st March 2018, 20:13   #49678  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,127
Quote:
Originally Posted by brazen1 View Post
Yes, the change stuck until I rebooted. Yes, my display supports 10-bit. Using my old driver, these settings would apply when I opened a title, first time, every time, and they would stick after a reboot. As I understand it now, none of this sticking matters? Sending 8-bit from the GPU to madVR, evidently to prevent the GPU from dithering instead of madVR, is correct, if I understood the replies correctly. I don't understand how madVR can dither the 8-bit it is receiving up to 10-bit when playing 2160p HDR 10-bit? Neither can my AVR or the NCP? What am I not understanding here?
I don't know if this is exact or not, but...

The source starts as 10-bit. This is great because there is no knowing whether the studio used dithering or not, and the extra bit depth helps ensure there is no banding in the SOURCE.

madVR takes this information and blows it up to 16 bits. This is all math designed to avoid rounding errors and other mistakes that can lead to inaccurate color values. Then the result is dithered at the highest quality possible. So the end result is a 10-bit source upconverted and then downconverted for display.

madVR is designed, in almost every way, to avoid inaccurate color conversions, no matter what it is doing, so it should never introduce banding at an output bit depth of 8 bits or higher. This all depends on the quality of the source and whether it had banding to begin with.

Like I said a couple of times now, the color space has fixed top and bottom values. You can manipulate the bit depth all you want without screwing up the colors you started with. You just get more shades of each color when the bit depth is increased; everything in between becomes smoother, not more colorful. The loss of shades at lower bit depths is mitigated in madVR by the use of dithering.

Check out these two images, which show the impact of dithering at a bit depth as low as 2 bits.

Dithering to 2-bits:
2 bit Ordered Dithering
2 bit No Dithering

Pretty impressive?
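They are easy to reproduce. A minimal sketch of classic 4x4 Bayer ordered dithering at 2 bits (the same family of technique as madVR's ordered dithering option, though madVR's version is fancier):

Code:
import numpy as np

# Normalized 4x4 Bayer threshold matrix
BAYER4 = np.array([[ 0,  8,  2, 10],
                   [12,  4, 14,  6],
                   [ 3, 11,  1,  9],
                   [15,  7, 13,  5]]) / 16.0

def ordered_dither(img, bits):
    # Quantize a [0,1] grayscale image to 2**bits levels using Bayer thresholds
    levels = 2 ** bits - 1
    h, w = img.shape
    t = np.tile(BAYER4, (h // 4 + 1, w // 4 + 1))[:h, :w]
    return np.floor(img * levels + t) / levels

gradient = np.tile(np.linspace(0, 1, 512), (64, 1))  # smooth test ramp
banded = np.round(gradient * 3) / 3        # plain 2-bit quantization: four flat bands
smooth = ordered_dither(gradient, 2)       # still only four values per pixel

Viewed from a distance, the dithered ramp reads as continuous even though every pixel is one of just four values; that is the whole trick.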

Last edited by Warner306; 21st March 2018 at 23:39.
Old 21st March 2018, 20:16   #49679  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,127
And, 10-bit RGB > 8-bit RGB > 10-bit YCbCr 4:2:2 > 10-bit YCbCr 4:2:0.

You only want to send 8-bit RGB when HDMI bandwidth is a problem (at 60 Hz), or when the display does not support 10 bits or has trouble displaying 10 bits without banding.
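For anyone wondering why 8-bit RGB outranks 10-bit YCbCr 4:2:2 in that list: subsampling discards chroma resolution, which extra bit depth cannot buy back. A toy sketch of what 4:2:0 does to the color channels (BT.709 coefficients; my own illustration):

Code:
import numpy as np

def rgb_to_ycbcr(rgb):
    # BT.709: RGB in [0,1] -> luma Y' in [0,1], chroma Cb/Cr centered on 0
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    return y, (b - y) / 1.8556, (r - y) / 1.5748

rgb = np.random.rand(8, 8, 3)  # a tiny "image"
y, cb, cr = rgb_to_ycbcr(rgb)

# 4:2:0 keeps full-resolution luma but one chroma sample per 2x2 block
cb420 = cb.reshape(4, 2, 4, 2).mean(axis=(1, 3))
cr420 = cr.reshape(4, 2, 4, 2).mean(axis=(1, 3))
print(y.shape, cb420.shape)  # (8, 8) vs (4, 4): chroma has a quarter of the samples

Deeper samples are more precise, but 4:2:2 and 4:2:0 simply have fewer of them, while 8-bit RGB with good dithering keeps every pixel's color.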

Last edited by Warner306; 21st March 2018 at 20:18.
Old 21st March 2018, 20:18   #49680  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,890
Quote:
Actually YCbCr -> RGB conversion gives us floating point data! And not even HDMI 1.4 can transport that. So we have to convert the data down to some integer bitdepth, e.g. 16bit or 10bit or 8bit.
Is this part not clear enough?

And literally nearly every TV accepts 12-bit input; that doesn't mean the panel is even 8-bit.

It's very simple: can you easily see a difference between 8-bit and 10-bit madVR output?

If yes, bother with it. If no, don't bother with it.
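The quoted point is easy to see with one pixel: even an all-integer YCbCr input turns into fractional RGB the moment the conversion matrix is applied, so re-quantization (ideally with dithering) is unavoidable. A toy 8-bit BT.709 example:

Code:
# One 8-bit limited-range YCbCr pixel (Y 16-235, Cb/Cr 16-240)
y, cb, cr = 100, 140, 120

# Expand to full range, then apply the BT.709 matrix
yy = (y - 16) * 255 / 219
pb = (cb - 128) * 255 / 224
pr = (cr - 128) * 255 / 224
r = yy + 1.5748 * pr
g = yy - 0.1873 * pb - 0.4681 * pr
b = yy + 1.8556 * pb
print(r, g, b)  # fractional values; they must be rounded or dithered to some
                # integer bit depth before they can cross an HDMI link

No display link carries floats, so the only question is how many integer bits you round to and how well you dither, which is the quoted point.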