Old 28th August 2017, 15:16   #341  |  Link
aufkrawall
Registered User
 
Join Date: Dec 2011
Posts: 1,812
Quote:
Originally Posted by nevcairiel View Post
That isn't an inherent property of DXVA scaling or anything, it's just an artifact if you try to access the 4:2:0 "untouched", because there is no 4:2:0 texture format in D3D9. But when you use DXVA to upscale chroma (or the entire image) anyway, this doesn't really apply. It's not like the chroma is blurred first and then upscaled; that's not how it works. The method madVR uses to try to extract 4:2:0 subsampled chroma just causes blurring.
That explains why DXVA2 chroma scaling "fixes" the blur with native DXVA2 decoding in my cartoon example.
It's clearly visible that it totally disappears.
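
To illustrate nevcairiel's point, here's a minimal numpy sketch (not madVR's or the driver's actual code) of why fetching the half-resolution 4:2:0 chroma plane through a linearly filtered texture read smears edges, while point sampling returns the stored samples untouched:

Code:
import numpy as np

# Half-resolution chroma row with a hard edge, as stored in a 4:2:0 plane.
chroma = np.array([0., 0., 0., 1., 1., 1.])
src = np.arange(len(chroma))                             # stored sample positions
dst = np.linspace(0, len(chroma) - 1, 2 * len(chroma))   # full-resolution grid

nearest = chroma[np.rint(dst).astype(int)]  # point sampling: edge stays hard
linear = np.interp(dst, src, chroma)        # bilinear-style fetch: edge smears

print(nearest)          # [0. 0. 0. 0. 0. 0. 1. 1. 1. 1. 1. 1.]
print(linear.round(2))  # intermediate values appear around the edge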
Old 28th August 2017, 17:54   #342  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,406
Quote:
Originally Posted by Manni View Post
Correction: I was still under 385.28. I updated to 385.41 and I now have the same behaviour (can't select 12 bits from custom mode).
Reverting to 385.28
Ack! That doesn't seem like something Nvidia would change on purpose, but it would also be a strange bug to appear at random.

Thanks for the update.
__________________
madVR options explained
Old 28th August 2017, 19:22   #343  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
Quote:
Originally Posted by nevcairiel View Post
That isn't an inherent property of DXVA scaling or anything, it's just an artifact if you try to access the 4:2:0 "untouched", because there is no 4:2:0 texture format in D3D9. But when you use DXVA to upscale chroma (or the entire image) anyway, this doesn't really apply. It's not like the chroma is blurred first and then upscaled; that's not how it works. The method madVR uses to try to extract 4:2:0 subsampled chroma just causes blurring.
Well, the image you see there is pure DXVA, nothing else, and the OSD isn't indicating that madVR is accessing the 4:2:0 image.
Using DXVA native with no DXVA chroma scaling is a whole different story.

Quote:
Provide a test sample please.
Take your screenshot, deband it, and the banding is gone.
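
A rough numpy sketch of what a deband pass does to such steps (madVR's actual algorithm is fancier than this plain box blur): smoothing the staircase turns the hard quantization jumps back into a near-linear ramp.

Code:
import numpy as np

banded = np.repeat(np.arange(0, 256, 8), 8).astype(float)  # staircase: 8-code bands
kernel = np.ones(17) / 17.0
debanded = np.convolve(banded, kernel, mode="valid")       # simple low-pass

# The hard 8-code jumps become ~1.4-code slopes, i.e. the bands vanish.
print(np.abs(np.diff(banded)).max())    # 8.0
print(np.abs(np.diff(debanded)).max())  # ~1.4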

If you want to see banding from DXVA image scaling, take the BW.avi, output it as 4:2:0, and see for yourself.

But the most important part: I'm not even sure DXVA scaling is any faster than Lanczos 3 the way madVR handles it.
Old 28th August 2017, 19:24   #344  |  Link
aufkrawall
Registered User
 
Join Date: Dec 2011
Posts: 1,812
Quote:
Originally Posted by huhn View Post
take your screen deband it and the banding is gone.
There is no banding in the 10-bit stripes without deband, and there isn't any in them when you convert the video to 8 bit either.
Old 28th August 2017, 19:36   #345  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
"you should see 1024 vertical bar if your display supports 10 bit"

so there are not 1024 steps of banding in there (i can see them on my TV and they say they are there but...) but that's not banding because what again?

and if you really think 1024 steps dither to 8 bit is enough to hide them well go ahead and set madVR to a 10 bit internal processing.
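
For what it's worth, here is a small numpy sketch of the difference under discussion: plain rounding to 8 bit versus a crude random dither (madVR's dithering is far better than this):

Code:
import numpy as np

ramp10 = np.arange(1024)                        # 10-bit ramp, one step per code
to8 = ramp10 * 255.0 / 1023.0                   # ideal (fractional) 8-bit values

rounded = np.round(to8)                         # plain rounding to 8 bit
rng = np.random.default_rng(0)
dithered = np.clip(np.round(to8 + rng.uniform(-0.5, 0.5, 1024)), 0, 255)

# Rounding collapses ~4 input codes into each output code -> hard steps.
print(np.unique(rounded).size)                  # 256 distinct levels
# Dither keeps the average equal to the ideal value, so the eye integrates
# the grain back into all 1024 levels.
print(float(np.mean(dithered - to8)))           # ~0: no systematic error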
Old 28th August 2017, 19:53   #346  |  Link
aufkrawall
Registered User
 
Join Date: Dec 2011
Posts: 1,812
Quote:
Originally Posted by huhn View Post
"you should see 1024 vertical bar if your display supports 10 bit"

so there are not 1024 steps of banding in there (i can see them on my TV and they say they are there but...) but that's not banding because what again?
Well, you initially mentioned a "BW.avi". Where can I find this sample if it's any better?
You said that DXVA scaling would show banding, so where do you see it?

The 1024-step video is quite enough to make EVR or the Win 10 video app show banding, which is not there with madVR + DXVA.
Old 28th August 2017, 20:18   #347  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
There are clearly 1024 steps of banding in this file. I can see the banding in your screenshots. 16 bit dithered down to 8 bit is clearly better than 16 bit rounded to 10 bit.

The bw.avi is a file that lets you see which scaler is used.
EVR (normal) does a better job than madVR with DXVA scaling, so that's important, at least on my system: https://abload.de/img/bw04une.png
And it is bilinear right now (385.41), which is odd.

Here you go: http://filehorst.de/d/cmdBtJyd
But I'm not sure why you want it, because it is not a real video.
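
In case the link dies: a pattern in the same spirit as BW.avi (not the actual file) is just one-pixel black/white stripes; the values a scaler produces between the original pixels betray its kernel.

Code:
import numpy as np

pattern = np.tile([0.0, 1.0], 8)        # 16-pixel row of 1-px B/W stripes
src = np.arange(16)
dst = np.linspace(0, 15, 24)            # upscale 16 -> 24 pixels

upscaled = np.interp(dst, src, pattern) # what a bilinear scaler would do
# Bilinear collapses the stripes toward flat 0.5 gray; sharper kernels
# (Lanczos etc.) keep more contrast and ring differently, so comparing the
# output against known kernels tells you which scaler actually ran.
print(upscaled.round(2))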
Old 28th August 2017, 21:16   #348  |  Link
aufkrawall
Registered User
 
Join Date: Dec 2011
Posts: 1,812
Can you tell me where you spot banding here (deband off)?

Software decoding + bilinear:
[screenshot]

DXVA2 native + DXVA2 scaling:
[screenshot]

The only difference I see is that bilinear is softer here, so the grain in the gradients is harder with DXVA2. But I don't notice any additional banding.

-> Still not noticing any "real world" problem.
Old 29th August 2017, 11:50   #349  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
There is no real banding in this example.
But you are downscaling a 10-bit source; it's harder to add banding there.

nvidia 384.41:

bicubic: https://abload.de/img/bicubic6034rgk.png
DXVA: https://abload.de/img/dxva1yu99.png

The last thing I have to say about the Nvidia DXVA scaler: avoid it if possible...
Old 29th August 2017, 15:00   #350  |  Link
aufkrawall
Registered User
 
Join Date: Dec 2011
Posts: 1,812
Well, it looks like softcubic. At least I don't see quality issues that would prevent me from using it over bilinear on a low-end/integrated GPU.
Old 29th August 2017, 18:14   #351  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
It is doing the chroma scaling wrong, moving the chroma by half a subpixel to the right.

Just use EVR CP in this case.
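
A small numpy sketch of that kind of bug (illustrative only, assuming the common left-sited 4:2:0 layout): upsampling chroma with a phase error leaves luma in place but slides all the color sideways, which shows up as fringes on hard black lines.

Code:
import numpy as np

# Half-res chroma with a hard color edge; with left-sited 4:2:0, full-res
# pixels sit at x = 0, 0.5, 1, 1.5, ... in chroma-sample units.
chroma = np.array([0., 0., 1., 1.])
x = np.arange(8) / 2.0

correct = np.interp(x, np.arange(4), chroma)
# A -0.25 chroma-unit phase error = half a luma pixel: the edge lands late,
# i.e. the color is shifted to the right relative to the luma.
shifted = np.interp(x - 0.25, np.arange(4), chroma)

print(correct.round(2))  # [0.   0.   0.   0.5  1.   1.   1.   1.  ]
print(shifted.round(2))  # [0.   0.   0.   0.25 0.75 1.   1.   1.  ]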
Old 29th August 2017, 18:29   #352  |  Link
aufkrawall
Registered User
 
Join Date: Dec 2011
Posts: 1,812
Now that you say it, it does indeed look shifted, and it can look bad with black lines.
Old 1st September 2017, 09:07   #353  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,406
Quote:
Originally Posted by Asmodian View Post
It definitely dithers to display shadows, like a plasma, you can see it quite easily.

Banding was a huge problem when setting it up but the banding does not mean it is not dithering, only that it is bad at it.
I don't think what I was seeing was actually dithering to display shadows.

It looks like it is probably just very uneven brightness between pixels at low brightness.
flat 5% gray

However, I still don't think the banding is due to a lack of dithering. I think it is due to a much worse problem: this OLED TV does not have a smooth response. There are visible steps between some 10-bit values but not between others; zoomed all the way in on a luminance graph, it would look like a slanted staircase (>90° "stairs"). It is also not uniform spatially: on a solid 5% gray screen there are some obvious lines, squares, etc.

Sorry, not madVR related.
__________________
madVR options explained
Old 1st September 2017, 19:58   #354  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
The Sony, Philips, and Panasonic OLEDs have no real problem with banding, so the issue comes down to low bit depth processing with dithering versus high bit depth processing without dithering.

Don't forget an OLED needs heavy processing because it is WRGB.
Old 1st September 2017, 20:55   #355  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,406
Yes, I am simply not sure it is a problem with dithering (or lack thereof); it may simply be bad processing. My expectation is that it would look better than it does if the banding were only due to missing dither. The visible banding is not evenly spaced in a gradient: there may be obvious banding at only a few transitions, which are randomly spaced and might involve more than one 10-bit value. The number and placement of the bands also change when changing settings like contrast or black level.

I think each pixel/panel probably responds a little differently and needs to be tuned to the right gamma curve, and this tuning isn't complete enough; maybe they only do it at 100%, 70% and 40% and assume the panel is smooth in between when it isn't. You can see it when measuring the gamma curve in very finely spaced steps: slight under/overshoots as you move up the curve (~1.2 dE2000 at most, but still visible in what should be a smooth gradient). If 73% undershoots luminance by 1 dE and 77% overshoots by 1 dE, you get visible banding even if 75% is nearly perfect. This seems to be worse with some colors too; an orange gradient's bands are more obvious but seem to have the same cause.
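
In numbers (illustrative values, not measurements): alternating under- and overshoot inflates the local step size even when every level is individually close to its target, and the local step is what the eye reads as a band.

Code:
# Illustrative luminance values (arbitrary units), not measurements.
target = [73.0, 75.0, 77.0]    # ideal curve at three consecutive gray levels
measured = [72.0, 75.0, 78.0]  # 73% undershoots, 77% overshoots

ideal_steps = [b - a for a, b in zip(target, target[1:])]
real_steps = [b - a for a, b in zip(measured, measured[1:])]
print(ideal_steps)  # [2.0, 2.0] -> smooth gradient
print(real_steps)   # [3.0, 3.0] -> 50% bigger local steps read as bands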

I remember reading about the process of evening out pixel variations in the processing/display controller; LG was proud they had figured it out, and it had a specific term which I cannot remember.
__________________
madVR options explained
Old 3rd September 2017, 13:59   #356  |  Link
XMonarchY
Guest
 
Posts: n/a
What is "Compatibility" in the Display Mode Optimization section? My TV works even with a very low compatibility rating after optimization. Does low compatibility mean worse image quality?

Also, you stated that custom modes prevent 10-bit depth unless 10-12 bit is already enabled in the Nvidia CP, but doesn't madVR use some additional colors when 10 bit is selected in madVR's configurator?
Old 3rd September 2017, 16:55   #357  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,344
Low compatibility means it may not work with many displays.
__________________
LAV Filters - open source ffmpeg based media splitter and decoders
Old 3rd September 2017, 19:34   #358  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,406
Quote:
Originally Posted by XMonarchY View Post
Also you stated that custom modes prevent 10bit depth unless 10-12bit is already enabled in NVidia CP, but doesn't madVR use some additional colors when 10bit is selected in madVR's Configurator?
I am not sure what you are asking. 10-bit output does allow madVR to send additional colors to the GPU, but if the GPU is in an 8-bit mode the GPU drivers will convert madVR's 10-bit output back down. This is lower quality than simply having madVR output 8-bit in the first place.

Also, these additional colors are inside the gamut: 100% RGB is the same color in 8 or 10 bit; 10 bit simply allows finer steps inside the same color gamut.
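
A tiny sketch of that last point, in normalized code values:

Code:
# 8-bit and 10-bit span the same gamut endpoints; 10-bit only adds
# finer steps in between.
levels8 = [i / 255 for i in range(256)]
levels10 = [i / 1023 for i in range(1024)]

print(levels8[-1] == levels10[-1])  # True: 100% RGB is the same color
print(levels8[1], levels10[1])      # step size: ~0.0039 vs ~0.00098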
__________________
madVR options explained
Old 4th September 2017, 15:18   #359  |  Link
XMonarchY
Guest
 
Posts: n/a
Quote:
Originally Posted by Asmodian View Post
I am not sure what you are asking. 10-bit does allow madVR to send additional colors to the GPU but if the GPU is in an 8-bit mode the GPU drivers will convert madVR's 10-bit output. This is lower quality than simply having madVR output 8-bit in the first place.

Also these additional colors are inside the gamut, 100% RGB is the same color in 8 or 10-bit, 10-bit simply allows finer steps inside the same color gamut.
What is likely to provide better image quality on my LCD TV:
A.
- Display set to 12bit (10bit functionality) in NVidia CP
- Default/Non-Custom Display mode is used
- madVR is set to 10bit

OR

B.
- Display set to 12bit (10bit functionality) in NVidia CP
- Custom Display mode is used
- madVR is set to 10bit

?

But I guess in the end it doesn't matter that much since madVR ED is really good.
Old 4th September 2017, 20:18   #360  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,406
It is hard to know without knowing which TV you are using. In your two tests all you are changing is whether or not you use a custom display mode; B is better in this case.

The important point is what the TV does with 10-bit input. If it does internal processing at less than 10 bit, sending 10 bit from madVR is usually worse. If your display does not handle 10-bit input well, it might be better to have madVR dither to 8 bit, because madVR is so much better at dithering. This is also an issue if your GPU is sending 8 bit to your display while madVR is sending 10 bit to the GPU: the GPU has to do the final dithering to 8 bit, and the GPU isn't as good at it, so it would be better to send it 8 bit. So make sure everything is really using 10 bit properly before setting it anywhere.

Edit: I use these gradients to test with. Does madVR 10-bit with dithering look better than without? If not, I would set madVR to 8 bit. Never use madVR with dithering off, except when testing.
gradient-perceptual-colored-v2.1 24fps.mkv
gradient-perceptual-v2.1 24fps.mkv
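
If the links ever die, a stand-in gradient is easy to generate. This sketch (hypothetical filename; 10-bit values packed into the high bits of 16-bit words, as e.g. P010 does for its luma plane) writes one raw frame you could feed to an encoder:

Code:
import numpy as np

width, height = 1024, 256
ramp = np.linspace(0, 1023, width).round().astype(np.uint16)  # one 10-bit step per column
frame = np.tile(ramp, (height, 1))                            # constant vertically

# Pack the 10-bit values into the high bits of 16-bit words (a common
# convention for >8-bit raw video).
(frame << 6).astype(np.uint16).tofile("gradient_10bit.raw")   # hypothetical filename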
__________________
madVR options explained

Last edited by Asmodian; 5th September 2017 at 01:55.