Old 3rd March 2019, 09:44   #55121  |  Link
sat4all
Registered User
 
Join Date: Apr 2015
Posts: 62
Quote:
Originally Posted by Manni View Post
If the video card is set to 8bits full, you can't set the bit depth in madVR to 10bits or auto, or you'll get lots of banding. The bit depth is not related to the bit depth of the content; it relates to the bit depth of the output. So in your case it should be 8bits.

If you really want/need to use 10bits output, then you need to set the driver to 12bits, which causes its own range of issues.
Hope madshi could split 10bit or higher setting into 10 and 12bits, so we can be sure that the gpu isn't doing anything behind our backs when using 12bits output.
__________________
ZOTAC MAGNUS EN1060K: Win 10 x64 + Kodi DSPlayer x64
LG OLED65C8 / Denon AVR-X3200W / KEF E305+ONKYO SKH-410 / Synology DS2415+ / Logitech Harmony 950

Last edited by sat4all; 3rd March 2019 at 10:26.
Old 3rd March 2019, 10:26   #55122  |  Link
mkohman
Registered User
 
Join Date: Jun 2018
Posts: 51
Quote:
Originally Posted by huhn View Post
Are you 100% certain of this?

I tested this countless times in the past.
The end device has nothing to do with this; if the GPU driver respects the 8 bit setting, the banding has to be created on the GPU side.

@mkohman please upload the images somewhere else; a link to them is enough.
I will try to explain this as best I can in order to avoid any confusion.


I have a JVC projector and an HTPC; my screen is 2.39:1 format (135"). I only ever really watch 2.39 or 2.40 content on this setup and don't really care about 16:9 movies on the projector, as I watch those on my 4K TV.


I use madVR on my HTPC with my GPU, a Sapphire Nitro+ RX 580 4GB. I also use madVR for HDR tone mapping.


I zoom the projector so that the top and bottom of the image overshoot the screen, which leaves a perfect fit for a 2.39:1 movie. In doing so, my Windows 10 HTPC desktop, which was previously set to 3840 x 2160p 60Hz (8bit), was also overshooting the screen, and I wanted to be able to view my entire desktop within my 2.39:1 screen. I searched the internet and asked on the forums how to do this, and came across a YouTube video that guided me to edit the registry ("regedit") and add a custom resolution of 3840 x 1620p at 60Hz, which I have done. The result can be seen below:


https://i.imgur.com/COnp0gX.jpg


After doing this registry edit, I restarted my HTPC, went into the AMD software and selected 3840 x 1620p at 60Hz for my desktop resolution. This was absolutely fine and I was happy with it. Here is what it looks like with this resolution on my 2.39 screen:


https://i.imgur.com/sGexqfi.jpg


After I did this I opened Kodi DSPlayer. My Kodi DSPlayer GUI has always been in 16:9 format within my 2.39 screen, my screen resolution has always been 3840 x 2160p, and I have had no issues. Since changing my desktop resolution to a scope format, I wanted to change the way Kodi DSPlayer looked on my screen: I wanted the GUI to fill the whole screen instead of just the 16:9 area.


To do this, I learned by experimenting that I had to set Kodi DSPlayer's resolution to the same as my desktop (3840 x 1620p). When I do this, the GUI stretches out to fill 2.39 but still doesn't look quite right; however, when I also enable "use fullscreen window", the GUI sits perfectly within my 2.39 screen. Please see the images below:


https://i.imgur.com/o8xuscc.jpg




https://i.imgur.com/lGYPrtP.jpg




https://i.imgur.com/qfcu8PX.jpg




https://i.imgur.com/VTsA7ZB.jpg




https://i.imgur.com/tihDvP1.jpg



So the issue I then had is that movies played with madVR seemed to be downscaled to 3840 x 1620p and didn't play right; there was stuttering. To avoid this, I added screen resolutions (2160p23, 2160p24, 2160p30, 2160p50 and 2160p59) into madVR in the hope that it would force videos to play in those modes and not downscale them to 3840 x 1620p, which was my desktop resolution and the resolution I had to set within madVR to get the GUI to display correctly.
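For reference, this is roughly what the list in madVR's display mode switching settings (devices -> [my display] -> display modes) looks like here; treat it as a sketch of my own entries rather than a recommended list, as I'm not sure every mode is needed:

2160p23, 2160p24, 2160p30, 2160p50, 2160p59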


Having done the above, the video now plays correctly (I think; please check the images, as the movie is 3840 x 2160 but it shows 3840 x 1607, 2.39). Can you please confirm this is OK? I think this is happening because I am using madVR's black bar detection (see the quick check after the screenshots below). Please see the pics below:


https://i.imgur.com/TWpllJY.jpg




https://i.imgur.com/KS66lWJ.jpg
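A quick sanity check on the numbers (assuming the film is roughly 2.39:1):

3840 / 2.39 ≈ 1607 active lines, whereas 3840 / 1620 ≈ 2.37:1

so 3840 x 2160 -> 3840 x 1607 looks like the letterbox bars being cropped by black bar detection, rather than the image being squeezed into my 1620-line desktop mode.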




Please note that in the above picture it says "D3D11 fullscreen windowed (8bit)". As my movies are 4K HDR, shouldn't it say 10bit? Please can you advise me on this? I know this is because I have enabled "use fullscreen window", but I must do this if I want the Kodi GUI to be in the correct format within my screen aspect ratio.


Please see the pictures below; this is what it looks like if I disable "use fullscreen window" in Kodi. It shows as D3D11 fullscreen (10bit), which is the correct format for the movie:


https://i.imgur.com/W6LWZ81.jpg


As I need the Kodi GUI to fit my screen ratio perfectly, I set its resolution to the same as my desktop, which is now 3840 x 1620p, and enable "use fullscreen window" in Kodi so that it displays correctly without any stretching.


My real question is: is there a way to get madVR to play this as a 10bit file, or is it not a problem if it displays as (8bit) on the madVR OSD?

It's worth mentioning that I use the JVC's masking to crop the top and bottom of the screen. The reason I do this is that when watching movies with IMAX scenes (e.g. Interstellar, First Man, MI: Fallout etc.) I don't want the image to overshoot my screen.


https://i.imgur.com/Ypy6DCQ.jpg



https://i.imgur.com/wYebVVP.jpg



https://i.imgur.com/SZDXQR0.jpg



https://i.imgur.com/Mqoibkh.jpg



Here are some of my Kodi Settings for reference to get the GUI to fill the screen:


https://i.imgur.com/RXLfjTI.jpg


https://i.imgur.com/7v5JUD0.jpg


https://i.imgur.com/NhaDCMA.jpg



Here are some pics of my desktop and GPU settings:



https://i.imgur.com/RbXjQG4.jpg



https://i.imgur.com/PkImZoD.jpg



Just to recap, here is what I need to know:


1 - Is it OK that it is showing (8bit) in the madVR OSD? If not, how can I resolve this?


2 - Are my screen resolutions for the desktop and Kodi correct, and should I be enabling "use fullscreen window"? If I don't, my Kodi GUI doesn't sit in my 2.39 screen correctly; it looks stretched.


3 - When I view a movie with the settings above, it seems to be working OK resolution-wise but shows as 3840 x 2160 -> 3840 x 1607. Is this OK?


4 - Once I exit Kodi and return to my Windows desktop, for some reason the desktop sets itself to 3840 x 1620p, but instead of 60Hz 8bit it changes to 24Hz 12bit. What's the reason for this, please?



Sorry for so many questions and the very long post, but I really would appreciate it if I can get this sorted and be told if I am doing anything wrong. Thank you so much; I look forward to your help and advice.


Forgot to post these for reference:

https://i.imgur.com/nm4Dr5C.jpg

https://i.imgur.com/cjTuCfs.jpg
Old 3rd March 2019, 11:11   #55123  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 942
Quote:
Originally Posted by huhn View Post
Are you 100% certain of this?

I tested this countless times in the past.
The end device has nothing to do with this; if the GPU driver respects the 8 bit setting, the banding has to be created on the GPU side.

@mkohman please upload the images somewhere else; a link to them is enough.
I'm 100% sure that this is what happens here. You are correct that I shouldn't have made a blanket statement as it might not apply to all GPUs. However what good can come from madVR dithering to 10bits and the GPU dithering behind its back to 8bits?

I never said that the banding was generated by madVR, it clearly has to be generated by the GPU.

Here it's a better option to set the bit depth to 8bits in madVR if the GPU is set to 8bits; that way madVR dithers directly to 8bits and the GPU can't mess things up. It definitely does otherwise.

Now of course if that causes issues with HDR switching, or if using 12bits in the GPU causes banding on the display, that's another question.

I use pixel shader tonemapping so I have no issue with HDR switching in 8bits. And with bit depth set to 8bits in madVR, I get no banding on the JVC, the way I do if bit depth is set to 10bits or auto.

Not sure what I would do if I couldn't use 12bits due to banding and if I needed HDR switching. But I have enough issues and bugs to work around; this is currently the best compromise in my setup, at least until JVC fixes the forced YCC422 in RGB 12bits on the new models.

Quote:
Originally Posted by sat4all View Post
Hope madshi could split 10bit or higher setting into 10 and 12bits, so we can be sure that the gpu isn't doing anything behind our backs when using 12bits output.
That would be good indeed, though as long as the display supports 12bits natively (as the recent JVCs do) I haven't seen any issues caused by this.

More important would be for madVR to detect the bit depth set in the GPU so that auto would adjust automatically to that. At the moment auto is based on the display EDID, so if the display reports 10bits or more, that's what madVR sends, even if the driver is set to 8bits, which is neither correct nor optimal and can/does result in banding, at least here.
__________________
Win11 Pro x64 b23H2
Ryzen 5950X@4.5Ghz 32Gb@3600 Zotac 3090 24Gb 551.33
madVR/LAV/jRiver/MyMovies/CMC
Denon X8500HA>HD Fury VRRoom>TCL 55C805K

Last edited by Manni; 3rd March 2019 at 11:19.
Old 3rd March 2019, 11:56   #55124  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,344
Quote:
Originally Posted by sat4all View Post
Hope madshi could split 10bit or higher setting into 10 and 12bits, so we can be sure that the gpu isn't doing anything behind our backs when using 12bits output.
You cannot do that, since Direct3D has no 12-bit surface format, meaning that you cannot pass 12-bit data to the GPU. It's either 10-bit or some weird, unusual 16-bit format, so madVR always uses 10-bit if possible, or 8-bit if needed.
__________________
LAV Filters - open source ffmpeg based media splitter and decoders
Old 3rd March 2019, 12:44   #55125  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 942
Quote:
Originally Posted by nevcairiel View Post
You cannot do that, since Direct3D has no 12-bit surface format, meaning that you cannot pass 12-bit data to the GPU. It's either 10-bit or some weird, unusual 16-bit format, so madVR always uses 10-bit if possible, or 8-bit if needed.
Thanks, makes sense.
__________________
Win11 Pro x64 b23H2
Ryzen 5950X@4.5Ghz 32Gb@3600 Zotac 3090 24Gb 551.33
madVR/LAV/jRiver/MyMovies/CMC
Denon X8500HA>HD Fury VRRoom>TCL 55C805K
Old 3rd March 2019, 13:27   #55126  |  Link
sat4all
Registered User
 
Join Date: Apr 2015
Posts: 62
Quote:
Originally Posted by nevcairiel View Post
You cannot do that, since Direct3D has no 12-bit surface format, meaning that you cannot pass 12-bit data to the GPU. It's either 10-bit or some weird, unusual 16-bit format, so madVR always uses 10-bit if possible, or 8-bit if needed.
Thanks for clearing that up.
I just don't understand why Nvidia is not offering 10bit output; at the very least they should honour the EDID data.
__________________
ZOTAC MAGNUS EN1060K: Win 10 x64 + Kodi DSPlayer x64
LG OLED65C8 / Denon AVR-X3200W / KEF E305+ONKYO SKH-410 / Synology DS2415+ / Logitech Harmony 950
Old 3rd March 2019, 14:15   #55127  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
Quote:
Originally Posted by Manni View Post
I'm 100% sure that this is what happens here. You are correct that I shouldn't have made a blanket statement as it might not apply to all GPUs. However what good can come from madVR dithering to 10bits and the GPU dithering behind its back to 8bits?
Just to make one thing clear first: I don't question your banding, I question the reason for the banding.

Why do I even come back to this? If you get banding on a new device, it should never be the GPU if it didn't do it before, and everyone should get the same banding if this were broken.

First of all, I know that Nvidia was ignoring the 8 bit setting before and just used whatever it saw fit. If this is again (or still) the case, we may have found the issue already.

There is a simple way to test this and make it clearer (sadly still not 100% conclusive): you can absolutely force Nvidia to do 8 bit by using 60 Hz UHD.
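(For context: the reason 60 Hz UHD forces 8 bit, assuming an HDMI 2.0 link and full RGB/4:4:4 output, is simple bandwidth arithmetic. 4K60 needs a 594 MHz pixel clock (4400 x 2250 total timing x 60 Hz), so:

594 MHz x 24 bit (8 bit per channel) ≈ 14.3 Gbps -> just fits HDMI 2.0's ~14.4 Gbps payload (18 Gbps TMDS)
594 MHz x 30 bit (10 bit per channel) ≈ 17.8 Gbps -> doesn't fit
594 MHz x 36 bit (12 bit per channel) ≈ 21.4 Gbps -> doesn't fit

so with RGB/4:4:4 at 4K60 the driver has no choice but to output 8 bit.)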

If you want something to be fixed, you should be as sure as possible about the reason for it.
According to your report, you get banding with 12 bit GPU output on the new JVC and banding when the GPU is supposed to dither down to 8 bit, but no banding if you output 8 bit directly. The GPU-dithering-to-8-bit case is the odd one here; that banding shouldn't be there.
Quote:
Here it's a better option to set the bit depth to 8bits in madVR if the GPU is set to 8bits; that way madVR dithers directly to 8bits and the GPU can't mess things up. It definitely does otherwise.
Yes, it's a bad setup, no question, but that still doesn't change the fact that I've tested this countless times and Nvidia clearly dithers.
Feel free to give me the exact settings you are using and I will check it yet again.
Old 3rd March 2019, 14:31   #55128  |  Link
Siso
Soul Seeker
 
Siso's Avatar
 
Join Date: Sep 2013
Posts: 711
Quote:
Originally Posted by ryrynz View Post
Disagree, I love my SSIM 2D downscaled NGU Sharp supersampling upscale. No need for any other image sharpening. Lots more pop to the picture and it looks natural. Kinda intensive though... but what else am I gonna do on a 1080 image besides set NGU for chroma.
See how it looks to you.
In my case I'm upscaling 1920 movies to 2560 horizontal resolution.
Monitor is 2560x1080p. Any recommendations for image doubling or image upscaling will be appreciated.

Last edited by Siso; 3rd March 2019 at 16:21.
Old 3rd March 2019, 17:14   #55129  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 942
Quote:
Originally Posted by huhn View Post
Just to make one thing clear first: I don't question your banding, I question the reason for the banding.

Why do I even come back to this? If you get banding on a new device, it should never be the GPU if it didn't do it before, and everyone should get the same banding if this were broken.

First of all, I know that Nvidia was ignoring the 8 bit setting before and just used whatever it saw fit. If this is again (or still) the case, we may have found the issue already.

There is a simple way to test this and make it clearer (sadly still not 100% conclusive): you can absolutely force Nvidia to do 8 bit by using 60 Hz UHD.

If you want something to be fixed, you should be as sure as possible about the reason for it.
According to your report, you get banding with 12 bit GPU output on the new JVC and banding when the GPU is supposed to dither down to 8 bit, but no banding if you output 8 bit directly. The GPU-dithering-to-8-bit case is the odd one here; that banding shouldn't be there.

Yes, it's a bad setup, no question, but that still doesn't change the fact that I've tested this countless times and Nvidia clearly dithers.
Feel free to give me the exact settings you are using and I will check it yet again.
I have already given the settings and explained this many times.

I couldn't use 8bits on my older JVC because of the magenta bug. It's a JVC/nVidia bug that initially only occurred at 4K60p and, with later drivers, became an issue at all 4K refresh rates. So it's not as if there was no banding before in 8bits and there is now. You could get rid of the bug by not sending HDR metadata. I had never tested 8bits before, simply because I couldn't, and I always used 12bits because of this magenta bug.

Second I know that nVidia isn't ignoring the 8bits because there is a bug in the rs2000 that forces YCC422 in RGB 12bits. That's why I use 8bits. I don't get any banding in 12bits, I have no idea where you're getting that from.

What I'm saying is that if you set the bit depth in the GPU to 8bits and set the bit depth in madVR to auto you get banding, because auto is based on the EDID of the display, not on the bit depth selected in the GPU. This banding is because the GPU is dithering to 8bits behind madVR's back.

If I set the bit depth in madVR to 8bits, the banding goes away, because madVR dithers directly to the correct bit depth.
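In other words, the chain as I understand it (my reading of what's happening here, not something I can prove from the driver side):

madVR "auto" (follows the display EDID) -> 10bit output -> GPU set to 8bits converts 10bit to 8bit itself -> banding
madVR forced to 8bits -> madVR dithers to 8bit -> GPU set to 8bits passes it through -> no banding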

I have no idea if you're seeing something different, and honestly I don't care. I have more than enough of these driver issues, and I'm seriously considering throwing my HTPC away now that we're about to get madVR's goodness on any source.

I've just found out that what I thought was a bug in JVC's DI implementation is an issue that doesn't happen with a standalone player, only with an HTPC. I'm still exploring this new issue instead of watching films.

Frankly this is taking way too much of my time. I can't wait to leave all this behind.

I really have had enough.
__________________
Win11 Pro x64 b23H2
Ryzen 5950X@4.5Ghz 32Gb@3600 Zotac 3090 24Gb 551.33
madVR/LAV/jRiver/MyMovies/CMC
Denon X8500HA>HD Fury VRRoom>TCL 55C805K

Last edited by Manni; 3rd March 2019 at 17:38.
Old 3rd March 2019, 18:09   #55130  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
Quote:
Originally Posted by Manni View Post
Second I know that nVidia isn't ignoring the 8bits because there is a bug in the rs2000 that forces YCC422 in RGB 12bits. That's why I use 8bits. I don't get any banding in 12bits, I have no idea where you're getting that from.
Quote:
As I said, I recommend using RGB Full 8bits with the new JVCs at the moment. This has no visible detrimental effect as long as you force 8bits in madVR as well. If you leave the bit depth set to "auto" or "10bits or more" in madVR, you will get a lot of banding. The nVidia CP should be set to RGB full 8bits, and you can either set the JVC and madVR to 0-255 if you care about having the correct levels on the desktop, or set the levels to 16-235 in the JVC and in madVR if you mostly care about video content. My HTPC is for movie playback only, so I prefer to set both to 16-235 to avoid unnecessary conversions, as the content is in video levels and madVR's 3D LUT works in video levels as well. That way it remains in video levels all along. Using auto for levels in the JVC is fine most of the time, but then you're never sure if something isn't as it should be. I use auto for colorspace, but I set levels to 16-235 manually so I can see when something is not done as expected.
Quote:
Second I know that nVidia isn't ignoring the 8bits because there is a bug in the rs2000 that forces YCC422 in RGB 12bits. That's why I use 8bits. I don't get any banding in 12bits, I have no idea where you're getting that from.
So you're telling me that Nvidia's 12 bit RGB setting isn't applied, and at the same time that the 8 bit setting isn't ignored?
Quote:
What I'm saying is that if you set the bit depth in the GPU to 8bits and set the bit depth in madVR to auto you get banding, because auto is based on the EDID of the display, not on the bit depth selected in the GPU. This banding is because the GPU is dithering to 8bits behind madVR's back.
If the Nvidia driver is dithering, you are not supposed to get banding; it doesn't matter whether it is doing it behind madVR's back or not. That's the point: you would only get banding when dithering is not applied.

The major difference between madVR's dithering and other dithering algorithms isn't banding, it's the noise level.

Quote:
I've just found out that what I thought was a bug in JVC's DI implementation is an issue that doesn't happen with a standalone player, but only with a HTPC. I'm still exploring this new issue instead of watching films.
Is the standalone player sending RGB?
You have a magenta problem on the old ones, and the new ones have problems with 12 bit forcing "stuff".
Old 3rd March 2019, 18:23   #55131  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 942
Quote:
Originally Posted by huhn View Post
So you're telling me that Nvidia's 12 bit RGB setting isn't applied, and at the same time that the 8 bit setting isn't ignored?
The Nvidia 12bits setting is applied. But there is a bug in the JVC that forces YCC422 instead of RGB. Please read what I write; it will help us not to go round in circles.

FYI I have an HD Fury Maestro, so there is zero doubt about what's coming out of the GPU's HDMI output and into the JVC. I know exactly the colourspace, bit depth and refresh rate of the content being sent, even when the JVC reports otherwise.

For example, when the GPU sends YCC422 12bits instead of RGB 12bits, the JVC reports RGB 12bits. But if you select the colorspace manually in the JVC, you can see that it's YCC422, not RGB being sent.

Quote:
Originally Posted by huhn View Post
If the Nvidia driver is dithering, you are not supposed to get banding; it doesn't matter whether it is doing it behind madVR's back or not. That's the point: you would only get banding when dithering is not applied.
OK, whatever. The only thing I can tell you is that if you don't get madVR to do the dithering by setting the bit depth in madVR to 8bits, you get banding. I sincerely don't care what the explanation is. The GPU is sending 8bits, so if it's not dithering the 10bit content sent by madVR, well, it's even worse than I thought. Well done, nVidia.

Quote:
Originally Posted by huhn View Post
The major difference between madVR's dithering and other dithering algorithms isn't banding, it's the noise level.
Yes, sure.

Quote:
Originally Posted by huhn View Post
Is the standalone player sending RGB?
You have a magenta problem on the old ones, and the new ones have problems with 12 bit forcing "stuff".
I tried RGB and YCC in 12bits; I couldn't set it to 8bits, as it only allows setting the output to 10bits, 12bits or auto in 4K.

Yes, I have a magenta problem in 8bits on the old one and YCC422 being forced in RGB 12bits on the new one.

Please tell me something I don't know, if you really have to say something.

It's taking me long enough to troubleshoot this without having to answer all your questions on top.
__________________
Win11 Pro x64 b23H2
Ryzen 5950X@4.5Ghz 32Gb@3600 Zotac 3090 24Gb 551.33
madVR/LAV/jRiver/MyMovies/CMC
Denon X8500HA>HD Fury VRRoom>TCL 55C805K
Old 3rd March 2019, 18:38   #55132  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
419.17

FSE 10 bit, GPU 8 bit = no banding
FSE 10 bit, GPU 12 bit = no banding
WFS 10 bit, GPU 8 bit = heavy banding
WFS 10 bit, GPU 12 bit = banding

And that's why I ask for the exact settings.

This is a sample size of 1; I have one more PC that I will test later.
It looks like Windows or Nvidia has broken WFS 10 bit yet again.
Old 3rd March 2019, 20:09   #55133  |  Link
VBB
Registered User
 
VBB's Avatar
 
Join Date: May 2016
Location: Long Beach, CA, USA
Posts: 620
It's weird how everyone's setup is different. That makes it even harder to troubleshoot. For me, there is no difference between WFS and FSE. I don't get banding either way with the GPU set to 8-bit output and madVR set to either 8-bit or 10-bit/auto. The only time I get major banding is when I set the GPU to output 12-bit. By the way, "auto" in madVR seems to pick 8-bit or 10-bit randomly (for SDR content), according to the OSD. There is no consistency in my case, but the one thing that hasn't worked in quite a while is what I talked about yesterday: the switching into HDR mode when madVR is set to 8-bit. That's why I wanted to know if anyone had ever gotten the HDR flag to work in profiles. It would solve my one remaining issue.
__________________
Henry | LG OLED65C7P | Denon AVR-X3500H | ELAC Uni-Fi x7 | ELAC Debut 2.0 SUB3030 x2 | NVIDIA SHIELD TV Pro 2019 | Plex
Old 3rd March 2019, 20:25   #55134  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 942
Quote:
Originally Posted by huhn View Post
419.17

FSE 10 bit, GPU 8 bit = no banding
FSE 10 bit, GPU 12 bit = no banding
WFS 10 bit, GPU 8 bit = heavy banding
WFS 10 bit, GPU 12 bit = banding

and that's why i ask for the exact settings.

this is a sample size of 1 i have one more PC i will test later.
looks like windows or nvidia broke yet again WFS 10 bit.
I never use FSE; the resync time is too long and I don't need it.

You're not giving enough information.

I don't have banding in 8bits or in 12bits in WFS, provided the correct bit depth is selected in madVR.

This is what I get here (419.17, but I'm reverting to 385.28):

WFS 8 bits, GPU 8 bits, madVR 8 bits = no banding
WFS 10 bits, GPU 12 bits, madVR 10 bits or more (or auto) = no banding
WFS 10 bits, GPU 8 bits, madVR 10 bits or more (or auto) = banding

Anyway, I think everyone gets different results depending on the combination of GPU, driver, display and settings.

That's why I concentrate on my issues, and told you you were correct that I shouldn't make blanket statements.

The only thing I can do is report the issues I experience, and when I find a fix or workaround, share it.
__________________
Win11 Pro x64 b23H2
Ryzen 5950X@4.5Ghz 32Gb@3600 Zotac 3090 24Gb 551.33
madVR/LAV/jRiver/MyMovies/CMC
Denon X8500HA>HD Fury VRRoom>TCL 55C805K
Old 3rd March 2019, 20:45   #55135  |  Link
brazen1
Registered User
 
Join Date: Oct 2017
Posts: 331
VBB,
In devices/your display/, create a profile group for PROPERTIES using this rule:

If (deintFps <= 30) "10-bit"
else "8-bit"

Add a profile for PROPERTIES named 10-bit.
Add another named 8-bit.

For the 10-bit profile, under 'the native display bitdepth is:', select '10 bit (or higher)'.
For the 8-bit profile, select 8bit.

I use older nVidia drivers. Never a problem going into HDR mode and both are set to 8 bit here.

Follow this link and you'll see the screenshots I've listed - https://imgur.com/a/4h7U0#MruR8Lw
__________________
HOW TO-Kodi 2D-3D-UHD (4k) HDR Guide Internal & External Players
W11 Pro 24H2 GTX960-4GB RGB 4:4:4 @Matched Refresh Rates 8,10,12bit
KODI 22 MPC-HC/BE 82" Q90R Denon S720W

Last edited by brazen1; 3rd March 2019 at 20:50.
Old 3rd March 2019, 21:04   #55136  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
This wouldn't be the first time that everyone with an Nvidia card got banding with WFS 10 bit. The banding when sending 10 bit is hard to see, so I don't expect many other users to notice it; the only reason I even think it has anything to do with the GPU/WDM is the fact that it's gone in FSE. I still have a problem understanding how a GPU could produce it with 10 bit input and 12 bit output.

BTW, there is no need to act elitist here; no one forces you to answer anything, and no one can read your mind.
Not using FSE is not a small, unimportant detail given Nvidia's history. Knowing all four states of the signal (GPU driver, madVR OSD, HD Fury and end device) helps a lot. Yes, you know this stuff because you are sitting next to it; I don't.

I found a potential fix for your banding issue; if you don't like it or can't use it, too bad, but it should help other users at least.

Even the 8 bit banding is pretty hard to see on the 6 bit TN panel I'm testing right now.
So if someone wants to test: http://www.bealecorner.org/red/test-...ient-16bit.png
Old 3rd March 2019, 21:09   #55137  |  Link
VBB
Registered User
 
VBB's Avatar
 
Join Date: May 2016
Location: Long Beach, CA, USA
Posts: 620
Quote:
Originally Posted by brazen1 View Post
VBB,
In devices/your display/, create a profile group for PROPERTIES using this rule:

If (deintFps <= 30) "10-bit"
else "8-bit"

Add a profile for PROPERTIES named 10-bit.
Add another named 8-bit.

For the 10-bit profile, under 'the native display bitdepth is:', select '10 bit (or higher)'.
For the 8-bit profile, select 8bit.

I use older nVidia drivers. Never a problem going into HDR mode and both are set to 8 bit here.

Follow this link and you'll see the screenshots I've listed - https://imgur.com/a/4h7U0#MruR8Lw
Thanks, but wouldn't that just switch anything with less than or equal to 30fps to 10-bit and the rest to 8-bit, regardless of what bit depth the source actually is?

Maybe I didn't explain this right. Ideally, I'd like to have the entire chain set to 8-bit. That's the best-case scenario with the OLEDs, but it's not doable at the moment, because madVR won't switch into HDR mode when it's set to 8-bit.

As a workaround, I want to split DISPLAY -> PROPERTIES into two separate profiles, one for SDR using 8-bit, and one for HDR using 10-bit/auto. The only boolean I can see useful for this is "HDR", as in

if (HDR) "HDR"
else "SDR"

but that does not work. madVR will always use the SDR profile.
__________________
Henry | LG OLED65C7P | Denon AVR-X3500H | ELAC Uni-Fi x7 | ELAC Debut 2.0 SUB3030 x2 | NVIDIA SHIELD TV Pro 2019 | Plex
Old 3rd March 2019, 21:47   #55138  |  Link
brazen1
Registered User
 
Join Date: Oct 2017
Posts: 331
Sorry. I was thinking that you only want HDR to engage using 10 bit, since HDR sources are all 10 bit, and SDR only for 8 bit. But it appears not all SDR sources are 8 bit - some are 10 bit as well - otherwise we could have differentiated that way.
Disregard.
__________________
HOW TO-Kodi 2D-3D-UHD (4k) HDR Guide Internal & External Players
W11 Pro 24H2 GTX960-4GB RGB 4:4:4 @Matched Refresh Rates 8,10,12bit
KODI 22 MPC-HC/BE 82" Q90R Denon S720W
Old 3rd March 2019, 21:53   #55139  |  Link
VBB
Registered User
 
VBB's Avatar
 
Join Date: May 2016
Location: Long Beach, CA, USA
Posts: 620
Quote:
Originally Posted by brazen1 View Post
Sorry. I was thinking that you only want HDR to engage using 10 bit, since HDR sources are all 10 bit, and SDR only for 8 bit. But it appears not all SDR sources are 8 bit - some are 10 bit as well - otherwise we could have differentiated that way.
Disregard.
Well, yes, that is essentially what I want, but I don't think your code would do that. It doesn't contain a trigger for the detected bit depth, only for the frame rate.
__________________
Henry | LG OLED65C7P | Denon AVR-X3500H | ELAC Uni-Fi x7 | ELAC Debut 2.0 SUB3030 x2 | NVIDIA SHIELD TV Pro 2019 | Plex
Old 3rd March 2019, 22:00   #55140  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,127
Try "hdr" or "bitDepth."