Old 8th April 2018, 14:50   #50141  |  Link
kalston
Registered User
 
Join Date: May 2011
Posts: 164
Quote:
Originally Posted by stefanelli73 View Post
Actually I've only tried 12-bit; today I will try 10-bit and 8-bit ....... Madshi, do you have a solution to the problem I have with subtitles? When forced subtitles appear in a scene, or when I use subtitles normally, the image slows down and the two rendering times increase from 30ms up to 110ms; when the subtitles disappear everything returns to normal ..... my player is JRiver.
JRiver's subtitle engine is quite demanding for some reason... On my Surface 3 (Atom CPU/iGPU) it makes DVDs unwatchable unless I give up on quality upscaling completely (Blu-rays are OK because no upscaling is needed other than chroma). MPC's subtitle engine, on the other hand, is fine and doesn't seem to hurt rendering times.

On my desktop, though, there is no difference in rendering times between the two (i7 8700K / 1080 Ti).
kalston is offline   Reply With Quote
Old 8th April 2018, 17:34   #50142  |  Link
brazen1
Registered User
 
Join Date: Oct 2017
Posts: 331
Hi madshi. Re: Banding and your 390.65 drivers.
Nvidia introduced a pretty significant Windows audio bug starting with 390.65, and it persists through the present drivers. I don't think any of these drivers retain a 12-bit setting after a reboot either. All drivers prior to 390.65 have various dynamic HDR-switching problems except 385.69 and 385.28. Those two, and only those two, handle HDR switching and audio switching correctly. They also retain a 12-bit setting after a reboot, not that it matters.

Nvidia does not offer an RGB 10-bit setting, only 8 or 12. So I always selected 12-bit, believing it would dither down to 10-bit as needed. Then I discovered banding on the UHD HDR title Allied, as one example. I presented the quirk here, and after testing by a few members it was determined that some displays handle 12-bit better than others because they are natively 12-bit; those show no banding at 12-bit, which leads me to believe the banding is not baked into the title. My Samsung, natively 10-bit, handles it poorly and introduces banding. If Nvidia provided a 10-bit setting I could simply use that, but they don't, so I'm required to use 8-bit, and then the banding is no longer present. Of course we'd all like to use 10-bit for 10-bit sources like UHD HDR on our 10-bit displays with a perfect chain, but we are limited to 8-bit until Nvidia introduces RGB 10-bit, which I don't count on.

Warner quizzed an AMD user about this banding quirk, suggesting he set his driver to 10-bit since AMD offers RGB at 10-bit. He reported no banding at either 10-bit or 12-bit. Perhaps his display is natively 12-bit, or perhaps he isn't looking closely or won't admit to banding, proud of his new setup? Today a user here reports AMD banding at 12-bit, although again we don't know whether that display is natively 10-bit and causing the banding, nor what the results are at a 10-bit setting.

The point is: can madVR work around Nvidia's failure to supply RGB 10-bit, which affects natively 10-bit (not 12-bit) displays? I wouldn't ask if I thought Nvidia would eventually step up, but the fact is they've never offered it, AFAIK. If we could get detailed positive confirmations from AMD users who see no banding using 10-bit settings and displays, then the fix could be relatively simple: dump Nvidia and concentrate on AMD, or use a 12-bit display. Personally, given the price gouging going on, I will continue to use what I have at 8-bit, and replacing my display is out of the question at this point. I've read that the eye can't discern 8-bit vs 10-bit anyway, but I have no way of confirming that for myself; I will trust that coming from you if it is indeed your opinion. I assume many more colors within the gamut would be visible, especially blends which are now bands of ugliness.

We all realize you are concentrating on other things, and we don't expect anything anytime soon, if ever. Just some ammo to consider when and if you find any of this worthy of your attention. Thank you as always for providing what you have. If you never update madVR again, it's still brilliant software just as it is.
__________________
HOW TO-Kodi 2D-3D-UHD (4k) HDR Guide Internal & External Players
W11 Pro 24H2 GTX960-4GB RGB 4:4:4 @Matched Refresh Rates 8,10,12bit
KODI 22 MPC-HC/BE 82" Q90R Denon S720W

Last edited by brazen1; 8th April 2018 at 17:55.
brazen1 is offline   Reply With Quote
Old 8th April 2018, 18:00   #50143  |  Link
mclingo
Registered User
 
Join Date: Aug 2016
Posts: 1,348
I'll test it on my AMD rig; what do I need to look at exactly? My full chain is 10-bit, so I should spot it if it's there.
mclingo is offline   Reply With Quote
Old 8th April 2018, 18:55   #50144  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,127
The only way to understand the difference in bit depths is to find a grayscale test pattern. Set madVR to 8 bits and then to 10 bits. In theory the gradient should get smoother at 10 bits, but in practice it doesn't. In fact, the dithering applied at 8 bits can make the gradient look smoother than 10 bits, just with a little more noise, and that noise is difficult to notice. That is all bit depth does: it makes things smoother, not more colorful.
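A quick way to see this for yourself is with a synthetic ramp. Here is a minimal numpy sketch (my own illustration, not madVR's algorithm; the random dither is much cruder than madVR's): quantizing an ideal gradient only changes the step size between adjacent levels, and dithering trades those visible steps for fine noise.

Code:
# Minimal illustration (not madVR's algorithm): quantize an ideal grayscale
# ramp to 8 and 10 bits, with and without a crude random dither.
import numpy as np

ramp = np.linspace(0.0, 1.0, 3840)                # ideal horizontal gradient

def quantize(signal, bits, dither=False):
    levels = (1 << bits) - 1
    noise = (np.random.rand(signal.size) - 0.5) if dither else 0.0
    return np.round(signal * levels + noise) / levels

for bits in (8, 10):
    step = 1.0 / ((1 << bits) - 1)
    banded = quantize(ramp, bits)                   # visible steps of size 'step'
    dithered = quantize(ramp, bits, dither=True)    # same average, steps replaced by noise
    print(f"{bits}-bit: step size {step:.6f}, "
          f"max error undithered {np.abs(banded - ramp).max():.6f}, "
          f"mean error dithered {abs(dithered.mean() - ramp.mean()):.6f}")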

There are no 12-bit flat panels (maybe some projectors), but there are displays which handle 12-bit input more gracefully than others. That depends on whether dithering is used and on the quality of that dithering.
Warner306 is offline   Reply With Quote
Old 8th April 2018, 19:03   #50145  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,140
Quote:
Originally Posted by stefanelli73 View Post
Madshi, do you have a solution to the problem I have with subtitles? When forced subtitles appear in a scene, or when I use subtitles normally, the image slows down and the two rendering times increase from 30ms up to 110ms; when the subtitles disappear everything returns to normal ..... my player is JRiver.
Quote:
Originally Posted by kalston View Post
JRiver's subtitle engine is quite demanding for some reason... On my Surface 3 (Atom CPU/iGPU) it makes DVDs unwatchable unless I give up on quality upscaling completely (Blu-rays are OK because no upscaling is needed other than chroma). MPC's subtitle engine, on the other hand, is fine and doesn't seem to hurt rendering times.

On my desktop, though, there is no difference in rendering times between the two (i7 8700K / 1080 Ti).
So it's really the rendering times in madVR which slow down with the JRiver internal subtitle renderer? I'm not sure why that would happen, to be honest. How high is the CPU usage when that occurs?

Quote:
Originally Posted by brazen1 View Post
Nvidia does not offer an RGB 10-bit setting, only 8 or 12. So I always selected 12-bit, believing it would dither down to 10-bit as needed. Then I discovered banding on the UHD HDR title Allied, as one example.
There can be multiple causes of banding. It could be hard coded in the source. It could be a bug in the GPU drivers. It could be a dozen other things.

-------

Just found out that the Nvidia GPU drivers have a bug in Windows 10 which stops 10-bit from working properly in fullscreen windowed mode, but only when using HDR passthrough. It works fine without HDR passthrough, and it works fine in FSE mode.

The next madVR build will force 8-bit in HDR fullscreen windowed mode for Nvidia GPUs. For now, to work around this issue, either use FSE mode or manually switch the madVR monitor settings to 8-bit.
madshi is offline   Reply With Quote
Old 8th April 2018, 19:37   #50146  |  Link
kalston
Registered User
 
Join Date: May 2011
Posts: 164
Quote:
Originally Posted by madshi View Post
So it's really the rendering times in madVR which slow down with the JRiver internal subtitle renderer? I'm not sure why that would happen, to be honest. How high is the CPU usage when that occurs?
Yeah rendering times, I just checked.

On the files I tried (DVD and Blu-ray rips I made) I got an extra ~10ms of rendering time just from enabling subtitles (and I have to restart playback to reset it; hiding the subtitles isn't enough). It's hard to measure CPU usage on this laptop, though; the CPU clock keeps going up and down, etc. But without subs I was at 15-20% CPU usage with very low clocks, and enabling subs doesn't seem to increase it by that much, maybe 5-10%.

Probably just a JRiver issue with some specific hardware though (Atom x7-Z8700 in my case). On my desktop enabling subs adds maybe 0.3ms to rendering times, if that...

Last edited by kalston; 9th April 2018 at 00:34.
kalston is offline   Reply With Quote
Old 8th April 2018, 20:43   #50147  |  Link
pankov
Registered User
 
Join Date: Mar 2002
Location: Sofia, Bulgaria
Posts: 661
Quote:
Originally Posted by madshi View Post
P.S: I've only tested this with my SDR test pattern. Maybe things are different in HDR passthrough mode? I don't really know how to test it there, though, because banding problems are easy to see with my test pattern, but harder to see with true HDR content.
A few days ago I found a very good HDR video for testing banding in Mehanik's HDR10 calibration and test patterns set, which can be found here:
http://www.avsforum.com/forum/139-di...terns-set.html
In the "04. Colors" folder there is a file called "01. Banding_Rotating-gradients_23.976.mp4", and with it I think I found an issue ... or is it a technical limitation? ... in madVR's HDR-to-SDR conversion using pixel shader math.
It looks terrible in the blue, a little better in the magenta, and almost OK in the green and yellow.
I know this conversion is not lossless, but this looks horrible and I simply can't believe it's by design.

madshi,
a while ago there was a quite detailed/heated discussion about how we should configure madVR with regard to the display calibration options. In the past, when we only had SDR content, it was widely accepted that the best choice was to select BT.709 and a "pure power curve" of 2.20 ... or 2.40, depending on the display/viewing conditions ... then correctly configure/calibrate the display to the same parameters and let madVR do the needed conversions in the rare cases of SMPTE C/PAL/etc. content.
But with the introduction of HDR and wide-color-gamut displays (usually no wider than DCI-P3) I'm starting to doubt this rule, especially keeping in mind that TVs/displays usually switch their color mode for HDR content ... and I think that is correct, since we don't want to use the much wider color space yet still send data that only occupies a subset of it ... and in 8-bit ... which might introduce banding.
So what is madVR doing when it renders HDR content in a BT.2020 container with data that usually reaches at most DCI-P3 (99.99% of HDR content) and I have my display set to "already calibrated" to BT.709/2.40? Is it trying to squeeze the content into the narrower BT.709 space, or does it presume that if I want HDR content (in passthrough mode) then my display should accept, and be correctly calibrated for, BT.2020?
Don't we need a second calibration config for HDR (wide color gamut) content?
... or have I totally lost it and am talking nonsense now?
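(To illustrate roughly what I mean by "squeeze", here is a little sketch of my own in Python, not anything from madVR; the 3x3 matrix is the standard linear-light BT.2020-to-BT.709 conversion, and real gamut mapping is of course smarter than plain clipping.)

Code:
# My own rough illustration of "squeezing" BT.2020 data into BT.709:
# a linear-light 3x3 conversion, after which out-of-gamut values have to be
# clipped (or gamut-mapped in some smarter way).
import numpy as np

# Standard linear-light BT.2020 -> BT.709 matrix (approximate values)
M_2020_TO_709 = np.array([
    [ 1.6605, -0.5876, -0.0728],
    [-0.1246,  1.1329, -0.0083],
    [-0.0182, -0.1006,  1.1187],
])

def bt2020_to_bt709(rgb):
    return rgb @ M_2020_TO_709.T

saturated_red = np.array([1.0, 0.0, 0.0])          # fully saturated BT.2020 red
converted = bt2020_to_bt709(saturated_red)
print("raw BT.709 values:", converted)             # components fall outside 0..1
print("after clipping:   ", np.clip(converted, 0.0, 1.0))

Fully saturated BT.2020 primaries land outside 0..1 after the conversion, which is exactly the "squeeze" I'm asking about.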

P.S.
Can someone ... or you, madshi ... remind me why, when accessing madVR's filter properties from the player, a small window has to open with madVR's version and two buttons, where the user has to click one more time (and once more to close it)? Isn't it possible to open the settings directly from madHcCtrl.exe, as happens when I press the "Edit Settings" button? If having such a window is some kind of DirectX requirement, isn't it possible to just briefly create the window, internally call madHcCtrl, and then automatically close the small window?
__________________
Z370M Pro4 | i3-8100 | 16GB RAM | 256GB SSD + 40TB NAS
NVIDIA GTX 1060 6GB (385.28) | LG OLED65B7V
Win 10 64bit 1803 + Zoom Player v14
pankov is offline   Reply With Quote
Old 8th April 2018, 21:58   #50148  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 942
Quote:
Originally Posted by madshi View Post

Just found out that the Nvidia GPU drivers have a bug in Windows 10 which stops 10-bit from working properly in fullscreen windowed mode, but only when using HDR passthrough. It works fine without HDR passthrough, and it works fine in FSE mode.

The next madVR build will force 8-bit in HDR fullscreen windowed mode for Nvidia GPUs. For now, to work around this issue, either use FSE mode or manually switch the madVR monitor settings to 8-bit.
Can you elaborate on this bug? Is it there with all driver versions, or only recent ones? I use HDR passthrough in 12bits with Windows 10 and there is no banding that I can see in fullscreen windowed mode (I don't use FSE except in 3D). My projector is native 12bits though, with 12bits support from the inputs to the panels.

I have tested the "Allied" clip mentioned in HDR passthrough, there is zero banding here.

As you know recent drivers seem to have borked levels (not sure when that happened), so I've reverted to 385.28 which is the last driver where everything works (12bits selectable in custom refresh mode, ASIO4All compatibility, and proper levels). The only bug (HTPC-wise) that I'm aware of with this version is in Dolby Atmos speaker config, the SBs and the SRs are inverted. I use 7.1 so I don't mind.
__________________
Win11 Pro x64 b23H2
Ryzen 5950X@4.5Ghz 32Gb@3600 Zotac 3090 24Gb 551.33
madVR/LAV/jRiver/MyMovies/CMC
Denon X8500HA>HD Fury VRRoom>TCL 55C805K

Last edited by Manni; 8th April 2018 at 22:08.
Manni is offline   Reply With Quote
Old 8th April 2018, 22:03   #50149  |  Link
el Filou
Registered User
 
el Filou's Avatar
 
Join Date: Oct 2016
Posts: 896
Quote:
Originally Posted by pankov View Post
A few days ago I found a very good HDR video for testing banding in Mehanik's HDR10 calibration and test patterns set, which can be found here:
http://www.avsforum.com/forum/139-di...terns-set.html
In the "04. Colors" folder there is a file called "01. Banding_Rotating-gradients_23.976.mp4", and with it I think I found an issue ... or is it a technical limitation? ... in madVR's HDR-to-SDR conversion using pixel shader math.
It looks terrible in the blue, a little better in the magenta, and almost OK in the green and yellow.
I know this conversion is not lossless, but this looks horrible and I simply can't believe it's by design.
Just had a look at that video, and it renders much better when unchecking 'preserve hue' in madVR's HDR processing settings.
__________________
HTPC: Windows 10 22H2, MediaPortal 1, LAV Filters/ReClock/madVR. DVB-C TV, Panasonic GT60, Denon 2310, Core 2 Duo E7400 oc'd, GeForce 1050 Ti 536.40
el Filou is offline   Reply With Quote
Old 8th April 2018, 22:33   #50150  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,140
Quote:
Originally Posted by kalston View Post
Yeah rendering times, I just checked.

On the DVD I used (Dersu Uzala) I got an extra ~10ms of rendering time just from enabling subtitles (and I have to restart playback to reset it; hiding the subtitles isn't enough). It's hard to measure CPU usage on this laptop, though; the CPU clock keeps going up and down, etc. But without subs I was at 15-20% CPU usage with very low clocks, and enabling subs doesn't seem to increase it by that much, maybe 5-10%.

Probably just a JRiver issue with some specific hardware though (Atom x7-Z8700 in my case). On my desktop enabling subs adds maybe 0.3ms to rendering times, if that...
Strange. I'm not sure exactly how JRiver draws subtitles. I think it uses one of my OSD interfaces, but I'm not sure which one.

Quote:
Originally Posted by Manni View Post
Can you elaborate on this bug? Is it there with all driver versions, or only recent ones?
I don't know; I've only tested the one driver I currently have installed (390.65). It's 100% reproducible for me with this driver. I get banding-free 10-bit output in SDR, but as soon as I activate Nvidia's HDR passthrough, I get banding when I try to render to 10-bit.

Quote:
Originally Posted by pankov View Post
A few days ago I found a very good HDR video for testing banding in Mehanik's HDR10 calibration and test patterns set, which can be found here:
http://www.avsforum.com/forum/139-di...terns-set.html
In the "04. Colors" folder there is a file called "01. Banding_Rotating-gradients_23.976.mp4", and with it I think I found an issue ... or is it a technical limitation? ... in madVR's HDR-to-SDR conversion using pixel shader math.
It looks terrible in the blue, a little better in the magenta, and almost OK in the green and yellow.
The official build has a bug which can sometimes introduce a bit of banding when doing HDR -> SDR conversion. That's already fixed in this test build from AVSForum:

http://madshi.net/madVRhdrTest5.rar

This build still shows some weirdness for the blue, but the other colors should be fine. I'm not completely sure why the problem with blue happens, will have to investigate.

Quote:
Originally Posted by pankov View Post
a while ago there was a quite detailed/heated discussion about how we should configure madVR with regard to the display calibration options. In the past, when we only had SDR content, it was widely accepted that the best choice was to select BT.709 and a "pure power curve" of 2.20 ... or 2.40, depending on the display/viewing conditions ... then correctly configure/calibrate the display to the same parameters and let madVR do the needed conversions in the rare cases of SMPTE C/PAL/etc. content.
But with the introduction of HDR and wide-color-gamut displays (usually no wider than DCI-P3) I'm starting to doubt this rule, especially keeping in mind that TVs/displays usually switch their color mode for HDR content ... and I think that is correct, since we don't want to use the much wider color space yet still send data that only occupies a subset of it ... and in 8-bit ... which might introduce banding.
I've said it a thousand times. 8bit doesn't introduce banding. Even 4bit doesn't introduce banding. That's what we have dithering for.
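For anyone who wants to convince themselves, a tiny sketch (plain random dither, far cruder than what madVR actually uses): even at 4 bits the quantized image reproduces the original value on average, so smooth areas turn into fine noise instead of discrete bands.

Code:
# Tiny sketch of why dithering prevents banding (crude random dither,
# far simpler than madVR's actual dithering).
import numpy as np

def quantize(x, bits, dither):
    levels = (1 << bits) - 1
    noise = (np.random.rand(*x.shape) - 0.5) if dither else 0.0
    return np.round(x * levels + noise) / levels

gray = np.full((512, 512), 0.5)   # exactly midway between two 4-bit levels (7/15 and 8/15)
print("4-bit, no dither:", quantize(gray, 4, dither=False).mean())  # snaps to 8/15 = 0.5333...
print("4-bit, dithered :", quantize(gray, 4, dither=True).mean())   # averages back to ~0.5000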

HDR calibration is a mess. IMHO the best workaround is probably to let madVR convert HDR to SDR, and then you can just run the converted SDR video through a conventional SDR 3dlut.
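To give a rough idea of the shape of that HDR -> SDR step (this is not madVR's tone mapping, which is far more sophisticated; the highlight curve below is a crude Reinhard-style stand-in and gamut mapping is ignored): decode the PQ signal to nits, compress the highlights toward an SDR peak, then re-encode with a normal SDR gamma.

Code:
# Rough shape of an HDR (PQ) -> SDR conversion.  NOT madVR's tone mapping;
# the highlight curve is a crude extended-Reinhard stand-in and gamut
# mapping is ignored.
import numpy as np

M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_to_nits(e):                                   # SMPTE ST 2084 EOTF
    p = np.power(np.clip(e, 0.0, 1.0), 1.0 / M2)
    return 10000.0 * np.power(np.maximum(p - C1, 0.0) / (C2 - C3 * p), 1.0 / M1)

def tonemap(nits, peak=1000.0, sdr_white=100.0):     # crude highlight roll-off
    l = nits / sdr_white                             # 1.0 == SDR reference white
    lw = peak / sdr_white
    return l * (1.0 + l / (lw * lw)) / (1.0 + l)     # maps 'peak' nits to 1.0

def encode_sdr(linear, gamma=2.2):                   # pure power curve re-encode
    return np.power(np.clip(linear, 0.0, 1.0), 1.0 / gamma)

pq_code_values = np.array([0.0, 0.25, 0.50, 0.75, 1.0])
print(encode_sdr(tonemap(pq_to_nits(pq_code_values))))

The output of such a conversion is ordinary SDR video, which is why a conventional SDR 3dlut can then be applied to it.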

Quote:
Originally Posted by pankov View Post
So what is madVR doing when it renders HDR content in a BT.2020 container with data that usually reaches at most DCI-P3 (99.99% of HDR content) and I have my display set to "already calibrated" to BT.709/2.40? Is it trying to squeeze the content into the narrower BT.709 space, or does it presume that if I want HDR content (in passthrough mode) then my display should accept, and be correctly calibrated for, BT.2020?
That depends. Are we talking about passing HDR through to the display? Or letting madVR convert HDR to SDR?

Quote:
Originally Posted by pankov View Post
Can someone ... or you, madshi ... remind me why, when accessing madVR's filter properties from the player, a small window has to open with madVR's version and two buttons, where the user has to click one more time (and once more to close it)? Isn't it possible to open the settings directly from madHcCtrl.exe, as happens when I press the "Edit Settings" button?
Just double click on the madHcCtrl tray icon.
madshi is offline   Reply With Quote
Old 8th April 2018, 22:51   #50151  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 942
Quote:
Originally Posted by madshi View Post
I don't know; I've only tested the one driver I currently have installed (390.65). It's 100% reproducible for me with this driver. I get banding-free 10-bit output in SDR, but as soon as I activate Nvidia's HDR passthrough, I get banding when I try to render to 10-bit.
Thanks, I'll double check banding in 10bits with 385.28 when I'm back in a few days.

In the meantime please could you not force 8bits in HDR passthrough, unless/until the bug is confirmed with all driver versions and not just the latest 39x.xx?

Maybe the banding in 10bits was introduced in the driver at the same time they borked the levels?

Also does your display support HDR 12bits natively?

Quite a few people with 10bits displays have been reporting issues using 12bits with nVidia, but this might simply be because the display doesn't support 12bits natively and the banding is introduced by the display itself when using HDR passthrough. When converted to SDR, the path might be different and the banding might not occur.

Only trying to find reasons why it might not be a universal bug, even if you can reproduce 100% with your set-up/display.
__________________
Win11 Pro x64 b23H2
Ryzen 5950X@4.5Ghz 32Gb@3600 Zotac 3090 24Gb 551.33
madVR/LAV/jRiver/MyMovies/CMC
Denon X8500HA>HD Fury VRRoom>TCL 55C805K
Manni is offline   Reply With Quote
Old 8th April 2018, 22:57   #50152  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,140
Quote:
Originally Posted by Manni View Post
Thanks, I'll double check banding in 10bits with 385.28 when I'm back in a few days.

In the meantime please could you not force 8bits in HDR passthrough, unless/until the bug is confirmed with all driver versions and not just the latest 39x.xx?

Maybe the banding in 10bits was introduced in the driver at the same time they borked the levels?

Also does your display support HDR 12bits natively?

Quite a few people with 10bits displays have been reporting issues using 12bits with nVidia, but this might simply be because the display doesn't support 12bits natively and the banding is introduced by the display itself when using HDR passthrough. When converted to SDR, the path might be different and the banding might not occur.

Only trying to find reasons why it might not be a universal bug, even if you can reproduce 100% with your set-up/display.
Next build is not coming for at least a week, anyway.

I see banding in 10bit fullscreen windowed mode, but *not* in 10bit fullscreen exclusive mode, which means the display can't be at fault.
madshi is offline   Reply With Quote
Old 8th April 2018, 23:03   #50153  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 942
Quote:
Originally Posted by madshi View Post
Next build is not coming for at least a week, anyway.

I see banding in 10bit fullscreen windowed mode, but *not* in 10bit fullscreen exclusive mode, which means the display can't be at fault.
Okay, do you have a preferred pattern/clip to test banding? I use many, but I'd like to be sure the same test you use passes/fails here.
__________________
Win11 Pro x64 b23H2
Ryzen 5950X@4.5Ghz 32Gb@3600 Zotac 3090 24Gb 551.33
madVR/LAV/jRiver/MyMovies/CMC
Denon X8500HA>HD Fury VRRoom>TCL 55C805K
Manni is offline   Reply With Quote
Old 8th April 2018, 23:25   #50154  |  Link
pankov
Registered User
 
Join Date: Mar 2002
Location: Sofia, Bulgaria
Posts: 661
Quote:
Originally Posted by madshi View Post
The official build has a bug which can sometimes introduce a bit of banding when doing HDR -> SDR conversion. That's already fixed in this test build from AVSForum:

http://madshi.net/madVRhdrTest5.rar

This build still shows some weirdness for the blue, but the other colors should be fine. I'm not completely sure why the problem with blue happens, will have to investigate.
Confirmed - almost perfect except the blue.

Quote:
Originally Posted by madshi View Post
I've said it a thousand times. 8bit doesn't introduce banding. Even 4bit doesn't introduce banding. That's what we have dithering for.
In principle I do agree with you that dithering helps a lot, and that's why I mostly don't mind going back to 8-bit for both SDR and HDR given the current fiasco with Nvidia/Windows support, but I'm kind of reluctant to lose that many gradation steps by enveloping every SDR BT.709 source in BT.2020 ... or DCI-P3 ... or am I talking nonsense ... again?

Quote:
Originally Posted by madshi View Post
HDR calibration is a mess. IMHO the best workaround is probably to let madVR convert HDR to SDR, and then you can just run the converted SDR video through a conventional SDR 3dlut.
But I want HDR on my TV ... and I want it to automatically switch to HDR mode with its increased OLED light level and wider color gamut, etc. ... I think ... and to be honest I don't have any experience with 3DLUTs. I'm kind of lost now and I might need some help to start from scratch with all this SDR/HDR/BT.2020/DCI-P3 mix. There are so many "variables" on the PC side and almost as many on the TV side ... especially with my latest finding that in "PC" mode (on LG's 2017 OLEDs this is set by changing the icon for the HDMI input) I get dull colors *and* banding, which vanish if I simply change the icon for this input in the TV settings ... but then I lose 4:4:4 support ... and I'm mad ... lost.


Quote:
Originally Posted by madshi View Post
That depends. Are we talking about passing HDR through to the display? Or letting madVR convert HDR to SDR?
I'm talking about HDR passthrough mode (to my LG B7 OLED) ... I think I should have mentioned that before to avoid the confusion. I guess my totally unrelated complaint about the HDR-to-SDR conversion banding misled you. Sorry.


Quote:
Originally Posted by madshi View Post
Just double click on the madHcCtrl tray icon.
Yes, I know I can do this, but in order to do it I have to exit fullscreen mode and then go back in, which temporarily alters the scaling algorithms and resets the queues (drops frames, etc.), and ... I think ... it's not convenient at all. That's why I asked whether there is a real reason to have the small window in between the player's filter menu and the actual target. I've lived with it for so many years that I've gotten used to it, but in the last few days I've been improving my HTPC cooling and reconfiguring madVR a lot, and it was quite a nag.
__________________
Z370M Pro4 | i3-8100 | 16GB RAM | 256GB SSD + 40TB NAS
NVIDIA GTX 1060 6GB (385.28) | LG OLED65B7V
Win 10 64bit 1803 + Zoom Player v14
pankov is offline   Reply With Quote
Old 9th April 2018, 00:49   #50155  |  Link
el Filou
Registered User
 
el Filou's Avatar
 
Join Date: Oct 2016
Posts: 896
Quote:
Originally Posted by pankov View Post
Yes, I know I can do this but in order to do it I have to exit fullscreen mode
Just press Ctrl+S
__________________
HTPC: Windows 10 22H2, MediaPortal 1, LAV Filters/ReClock/madVR. DVB-C TV, Panasonic GT60, Denon 2310, Core 2 Duo E7400 oc'd, GeForce 1050 Ti 536.40
el Filou is offline   Reply With Quote
Old 9th April 2018, 01:30   #50156  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,407
Quote:
Originally Posted by madshi View Post
I don't know; I've only tested the one driver I currently have installed (390.65). It's 100% reproducible for me with this driver. I get banding-free 10-bit output in SDR, but as soon as I activate Nvidia's HDR passthrough, I get banding when I try to render to 10-bit.
I was able to reproduce this with 391.35 but only when I had the mouse cursor over the display. In 10-bit FSE the mouse did not cause banding and fullscreen windowed 10-bit looked the same as FSE when the mouse cursor was moved to my other monitor. This did not happen when in SDR mode.

Quote:
Originally Posted by pankov View Post
especially with my latest finding that in "PC" mode (on LG's 2017 OLEDs this is set by changing the icon for the HDMI input) I get dull colors *and* banding, which vanish if I simply change the icon for this input in the TV settings ... but then I lose 4:4:4 support ... and I'm mad ... lost.
Yes. This annoyed me greatly. I can use a 3DLUT but the color volume is so much smaller that it clips sometimes and it doesn't look as good as passthrough in non-PC mode.

This means that, for me at least, LG's 2017 OLEDs can only do HDR in 4:2:2 and I have to go into the inputs menu and change the icon when using HDR. Input lag goes up a lot too, because only game mode, with its terrible white point, has low input lag when the input isn't set to the "PC" icon.
__________________
madVR options explained

Last edited by Asmodian; 9th April 2018 at 02:02.
Asmodian is offline   Reply With Quote
Old 9th April 2018, 01:40   #50157  |  Link
pankov
Registered User
 
Join Date: Mar 2002
Location: Sofia, Bulgaria
Posts: 661
Quote:
Originally Posted by Manni View Post
Okay, do you have a preferred pattern/clip to test banding? I use many, but I'd like to be sure the same test you use passes/fails here.
Since the discussion is more about 10-bit support than banding, you might try Mehanik's HDR10 test patterns that I wrote about a few posts ago. There is a dedicated section (folder) for exactly this: "03. Grayscale" -> "04. 10bit test".
Using these, I think I have confirmed for myself that LG OLED's "PC mode", when used with HDR content, is total garbage: yes, we get 4:4:4 chroma support, but at the cost of washed-out colors and a lack of 10-bit gradation.
Using these video files I can also say that Nvidia's conversion from madVR's 10-bit output to 8-bit display output at 2160p50/60 is not bad at all. As we all know, the HDMI 2.0 spec doesn't support RGB 2160p at 50/60Hz in 10/12-bit, and the drivers silently switch to 8-bit behind madVR's back; nonetheless I do see gradation rather than clipping, which might mean they are doing some dithering ... or some other kind of "magic".
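For reference, the arithmetic behind that limitation, assuming the standard CTA-861 2160p60 timing (4400 x 2250 total pixels) and HDMI 2.0's 600 MHz TMDS ceiling; for RGB/4:4:4 the TMDS rate scales with bits per component, which is why 10- and 12-bit don't fit:

Code:
# Why RGB/4:4:4 2160p60 tops out at 8-bit over HDMI 2.0 (rough check,
# assuming the standard CTA-861 timing and the 600 MHz TMDS limit).
pixel_clock_mhz = 4400 * 2250 * 60 / 1e6    # 594.0 MHz for 2160p60
hdmi20_tmds_limit_mhz = 600.0

for bpc in (8, 10, 12):
    tmds_mhz = pixel_clock_mhz * bpc / 8    # deep color scales the TMDS clock
    verdict = "fits" if tmds_mhz <= hdmi20_tmds_limit_mhz else "exceeds HDMI 2.0"
    print(f"RGB 4:4:4 2160p60 @ {bpc}-bit: {tmds_mhz:.1f} MHz TMDS -> {verdict}")

(YCbCr 4:2:2 packs into the same TMDS rate as 8-bit RGB, which is why deep color at 2160p50/60 is only possible with chroma subsampling.)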


Quote:
Originally Posted by el Filou View Post
Just press Ctrl+S
If only I had a keyboard around

I'm kidding ... kind of ... Thank you for reminding me about this shortcut (a bit generic but still working), but to be honest I very often control/reconfigure my player and/or DirectShow filters using only the mouse (one from Logitech with horizontal scrolling and a couple of additional buttons), since that's more convenient and the keyboard for the HTPC is usually hidden in some drawer.
__________________
Z370M Pro4 | i3-8100 | 16GB RAM | 256GB SSD + 40TB NAS
NVIDIA GTX 1060 6GB (385.28) | LG OLED65B7V
Win 10 64bit 1803 + Zoom Player v14
pankov is offline   Reply With Quote
Old 9th April 2018, 02:15   #50158  |  Link
pankov
Registered User
 
Join Date: Mar 2002
Location: Sofia, Bulgaria
Posts: 661
Quote:
Originally Posted by Asmodian View Post
This means that, for me at least, LG's 2017 OLEDs can only do HDR in 4:2:2 and I have to go into the inputs menu and change the icon when using HDR.
To be honest ... after some additional tests ... I think that even 10-bit SDR is not OK from a "banding" (>8-bit support) point of view ... or it might just be the 12-bit that Nvidia offers. Maybe in PC mode LG has disabled all or most of the additional processing, and since the display is natively 10-bit, when it's fed a 12-bit signal it doesn't down-convert it at high quality ... this is wild speculation on my part, but it sounds logical to me.
Does anybody know if Intel (HD4000) can output 10-bit 2160p?
... or is there a user with AMD and one of the 2017 LG OLEDs here who can test this for us?
__________________
Z370M Pro4 | i3-8100 | 16GB RAM | 256GB SSD + 40TB NAS
NVIDIA GTX 1060 6GB (385.28) | LG OLED65B7V
Win 10 64bit 1803 + Zoom Player v14
pankov is offline   Reply With Quote
Old 9th April 2018, 04:20   #50159  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,407
I believe these TVs simply suck with >8-bit input, but I do get noticeably more banding in the situation I described above. As I move the mouse back and forth with the video paused, some banding appears and disappears. There is still some banding, more than in 8-bit mode, but that is due to the TV itself rather than this issue.
__________________
madVR options explained
Asmodian is offline   Reply With Quote
Old 9th April 2018, 08:08   #50160  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,346
Quote:
Originally Posted by madshi View Post
Strange. I'm not sure exactly how JRiver draws subtitles. I think it uses one of my OSD interfaces, but I'm not sure which one.
We use the ISubRenderProvider interface
__________________
LAV Filters - open source ffmpeg based media splitter and decoders
nevcairiel is offline   Reply With Quote