Old 18th December 2017, 00:44   #47781  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
Quote:
Originally Posted by Manni View Post
The 2015+ JVC projectors have been confirmed to handle 12bits from the input to the panels. I don't care about flat panels, so no idea whether the same is true for TVs.
Quote:
Originally Posted by mclingo View Post
OK, so if there's no point using 10 bit, do people recommend we all drop back to 8 bit and use 4:4:4 full RGB on our TVs? How will this affect the playback of 10 bit HDR material?
I can't remember exactly, but if I'm not mistaken AMD needs 10 bit for whatever reason or it won't send HDR.

So it's your choice whether you go 4:2:2 10 bit or RGB 8 bit for 60 Hz, which will be double dithered. (I tested AMD's dithering to 6 bit and it's pretty much random dithering; not bad, but not great either. 10->8 bit should be the same, but it's hard to test because the results look very similar.)
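To get a feel for what that random dithering does, here's a toy numpy sketch (my own illustration of the general idea, not AMD's or madVR's actual algorithm):

Code:
import numpy as np

rng = np.random.default_rng()

def dither_down(img, out_bits):
    # img: float array scaled 0..1, standing in for a high bit depth source.
    # Adding +/- half an LSB of uniform noise before rounding turns the
    # quantisation error into noise instead of visible steps.
    levels = (1 << out_bits) - 1
    noise = rng.random(img.shape) - 0.5
    return np.clip(np.round(img * levels + noise), 0, levels) / levels

out8 = dither_down(np.linspace(0.0, 1.0, 4096), 8)  # e.g. 10 -> 8 bit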

Windows 10 HDR should work at 8 bit even with AMD cards.
Quote:
Originally Posted by Manni View Post
The 2015+ JVC projectors have been confirmed to handle 12bits from the input to the panels. I don't care about flat panels so no idea if it exists for TVs or not.
It's hard to find a screen that doesn't accept 12 bit input; even TVs from 2012 and older can do that easily.
And I wonder how people confirm stuff like this...
huhn is offline   Reply With Quote
Old 18th December 2017, 00:46   #47782  |  Link
mitchmalibu
Registered User
 
Join Date: Mar 2009
Posts: 37
It's frankly not that difficult to test for yourself... I watched a few HDR movies in 4:4:4 12bit and RGB 8bit, and without using comparison shots I wouldn't be able to see any difference on a 2016 LG 4K OLED panel. 8bit RGB with a custom resolution to avoid frame drops seems like the better alternative from what I personally tried.
What madshi and others said makes sense: better to reduce the number of conversions and drop the bit depth than to push for maximum bit depth and trigger who knows how many post-processing steps over which you have no control.
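Here's a rough numpy sketch of why that holds (my own toy example, assuming random dither at each quantisation step; real pipelines differ):

Code:
import numpy as np

rng = np.random.default_rng(0)

def q(v, bits):
    # quantise to the given bit depth with random dither
    n = (1 << bits) - 1
    return np.round(v * n + rng.random(v.size) - 0.5).clip(0, n) / n

x = rng.random(1_000_000)       # stand-in for the renderer's high precision output
once = q(x, 8)                  # one conversion straight to 8 bit
twice = q(q(x, 10), 8)          # 10 bit first, then 8 bit further down the chain

print(np.abs(once - x).mean())  # single step: less accumulated noise
print(np.abs(twice - x).mean()) # every extra conversion adds a little more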
Now, if only Nvidia gave us the ability to create proper custom resolutions at something other than 8bit RGB...
__________________
OS: Win10 1703
GPU: GTX 1070 (latest stable drivers)
Monitor: LG OLED55B6V TV / Yamaha RX-A860 AVR
Media setup: MPC-BE x64, madvr, lav filters (nightly)
mitchmalibu is offline   Reply With Quote
Old 18th December 2017, 02:09   #47783  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 942
Quote:
Originally Posted by Asmodian View Post
You still do not need a >8-bit output for displaying HDR content without banding if you have high quality dithering. 8-bit output is a little noisier, but that noise is very subtle. Keep 10-bit output from madVR and 12-bit from the GPU, but don't expect it to make a significant difference.

Can you tell the difference between 8-bit and 10-bit from madVR with ordered dithering enabled (with the GPU and projector settings the same)?
I don't want to dither if I don't have to. So yes, I prefer to keep 10bits output over 12bits GPU for 10bits HDR playback (except for 60p, where the driver drops automatically to 8bits). I don't see why I should add noise when I don't have to.

I remember testing a while ago and the difference was subtle, but it was there for 10bits HDR content. When I tried to see a difference with Blu-ray content (8bits SDR), I couldn't see any difference between 8bits and 10bits. I don't have the time to do more pixel peeping just because some prefer to watch 10bits HDR content dithered to 8bits.

I want as close to the "pure direct" equivalent as I can get. I do agree though that it's not the end of the world if you drop to 8bits, but I want the same PQ as my standalone player (or better) with madVR, not worse.
__________________
Win11 Pro x64 b23H2
Ryzen 5950X@4.5Ghz 32Gb@3600 Zotac 3090 24Gb 551.33
madVR/LAV/jRiver/MyMovies/CMC
Denon X8500HA>HD Fury VRRoom>TCL 55C805K

Last edited by Manni; 18th December 2017 at 02:13.
Manni is online now   Reply With Quote
Old 18th December 2017, 02:16   #47784  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,406
We are not saying we prefer dithering to 8 bit over 10 bit, only that no one can tell the difference, all else being equal.

The point is only that you should not sacrifice anything for 10 bit output, not that 8 bit output is better.

Edit: And you always have to dither; you simply dither to 10 bit or 12 bit instead of dithering to 8 bit.
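A quick way to see why dithering is always needed (a toy numpy example of mine, not madVR's actual implementation): quantising a smooth high precision gradient without dither produces visible bands at any output depth, while dither trades them for subtle noise.

Code:
import numpy as np

rng = np.random.default_rng()

g = np.linspace(0.2, 0.3, 3840)          # shallow gradient, high precision source
hard = np.round(g * 255) / 255           # straight quantisation to 8 bit
dith = np.round(g * 255 + rng.random(g.size) - 0.5).clip(0, 255) / 255

print(len(np.unique(hard)), "distinct levels")  # ~26 levels -> visible banding
# the dithered version uses the same ~26 levels, but the noise hides the steps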
__________________
madVR options explained
Asmodian is offline   Reply With Quote
Old 18th December 2017, 02:50   #47785  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 942
Quote:
Originally Posted by huhn View Post
I can't remember exactly, but if I'm not mistaken AMD needs 10 bit for whatever reason or it won't send HDR.

So it's your choice whether you go 4:2:2 10 bit or RGB 8 bit for 60 Hz, which will be double dithered. (I tested AMD's dithering to 6 bit and it's pretty much random dithering; not bad, but not great either. 10->8 bit should be the same, but it's hard to test because the results look very similar.)

Windows 10 HDR should work at 8 bit even with AMD cards.


It's hard to find a screen that doesn't accept 12 bit input; even TVs from 2012 and older can do that easily.
And I wonder how people confirm stuff like this...
I don't know about AMD; I have an nVidia 1080 Ti (see my sig). It's perfectly possible to send 8bits or 12bits to get HDR with nVidia. What's not possible is 4:2:2 10bits: with nVidia it's 8bits or 12bits.

Therefore there is no need to use 4:2:2 with nVidia. I use 12bits 4:4:4 for 99.99% of films, and 8bits 4:4:4 for the odd 60p film such as Billy Lynn.

I did tell you that it was not only the input but the whole chain up to the panels that was 12bits. I don't have the reference here, but it has been confirmed. It wasn't the case for the pre-2015 JVCs, which had 12bit inputs but 10bit panels.

Quote:
Originally Posted by Asmodian View Post
We are not saying we prefer dithering to 8 bit over 10 bit, only that no one can tell the difference, all else being equal.

The point is only that you should not sacrifice anything for 10 bit output, not that 8 bit output is better.

Edit: And you always have to dither; you simply dither to 10 bit or 12 bit instead of dithering to 8 bit.
I am not sacrificing anything, so I don't see why I should use 8bits. And yes, I meant that I don't want to dither to 8bits if I don't have to. I thought the context would make that clear, given that we have already discussed many times that madVR's output is 16bits before dithering.

I suggest we discuss this again when you have been able to compare 8bits vs 10bits output on native 10/12bits panels. Until then, I'm out.
__________________
Win11 Pro x64 b23H2
Ryzen 5950X@4.5Ghz 32Gb@3600 Zotac 3090 24Gb 551.33
madVR/LAV/jRiver/MyMovies/CMC
Denon X8500HA>HD Fury VRRoom>TCL 55C805K
Manni is online now   Reply With Quote
Old 18th December 2017, 05:44   #47786  |  Link
Oguignant
Registered User
 
Oguignant's Avatar
 
Join Date: Nov 2016
Posts: 181
Hi guys, question... Is there any way to have the bit depth change to 12 bits when madVR automatically switches to 23Hz? With the old Nvidia drivers I could do it. Is there any other software I can use on Windows 10, either combined with madVR or running independently of it?

Thanks
__________________
"To infinity, and beyond!"
Oguignant is offline   Reply With Quote
Old 18th December 2017, 07:28   #47787  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,406
Quote:
Originally Posted by Manni View Post
I am not sacrificing anything, so I don't see why I should use 8bits.
You should not. If you don't have to sacrifice anything, use 10 bit, of course.

Quote:
Originally Posted by Manni View Post
I suggest we discuss this again when you have been able to compare 8bits vs 10bits output on native 10/12bits panels. Until then, I'm out.
I am using a 2017 LG OLED, which is native 10 bit.
__________________
madVR options explained
Asmodian is offline   Reply With Quote
Old 18th December 2017, 07:33   #47788  |  Link
Oguignant
Registered User
 
Oguignant's Avatar
 
Join Date: Nov 2016
Posts: 181
Quote:
Originally Posted by Asmodian View Post

I am using a 2017 LG OLED, which is native 10 bit.
I have a 2017 LG OLED too. Are these panels 10 or 12 bits?
__________________
"To infinity, and beyond!"
Oguignant is offline   Reply With Quote
Old 18th December 2017, 07:35   #47789  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,406
10 bit
__________________
madVR options explained
Asmodian is offline   Reply With Quote
Old 18th December 2017, 07:40   #47790  |  Link
Oguignant
Registered User
 
Oguignant's Avatar
 
Join Date: Nov 2016
Posts: 181
Quote:
Originally Posted by Asmodian View Post
10 bit
I thought it was 12 bits because it supports Dolby Vision.
__________________
"To infinity, and beyond!"
Oguignant is offline   Reply With Quote
Old 18th December 2017, 08:08   #47791  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
There are 8 bit panels that can do Dolby Vision.

LG never confirmed more than 10 bit internal processing; they even featured it as a new "thing".

WRGB OLED can't really be compared anyway; it needs heavy processing to even get an image on the screen, and if the subpixels really ran at 10 bpc, that would mean the screen does 40 bpp (four subpixels per pixel).
huhn is offline   Reply With Quote
Old 18th December 2017, 08:21   #47792  |  Link
Oguignant
Registered User
 
Oguignant's Avatar
 
Join Date: Nov 2016
Posts: 181
Quote:
Originally Posted by Oguignant View Post
Hi guys, question... Is there any way to have the bit depth change to 12 bits when madVR automatically switches to 23Hz? With the old Nvidia drivers I could do it. Is there any other software I can use on Windows 10, either combined with madVR or running independently of it?

Thanks
No thoughts on this?
__________________
"To infinity, and beyond!"
Oguignant is offline   Reply With Quote
Old 18th December 2017, 08:38   #47793  |  Link
edcrfv94
Registered User
 
Join Date: Apr 2015
Posts: 84
Quote:
Originally Posted by madshi View Post
In case you guys wonder how NGU Sharp compares to mpv's latest FSRCNN(X), here's a little comparison:

Blu-Ray screenshot
downscaled (PNG) | (JPG, 100 quality)
latest FSRCNN32
latest FSRCNNX32
NGU Sharp - High
NGU Sharp - Very High

To make things as fair as possible I've downscaled the image with Bicubic/Catrom, which is exactly what FSRCNN and FSRCNNX were trained for.

Here are benchmark numbers, for 720p doubling:

Code:
Nvidia 1070:
FSRCNN16: 15.270 ms
FSRCNNX16: 26.397 ms
FSRCNN32: 46.290 ms
FSRCNNX32: ? (estimated: 80.021 ms)
NGU-Sharp High: 3.940 ms
NGU-Sharp Very High: 11.800 ms
Code:
AMD 560:
FSRCNN16: 14.289 ms
FSRCNNX16: 24.412 ms
FSRCNN32: 45.235 ms
FSRCNNX32: ? (estimated: 77.282 ms)
NGU-Sharp High: 12.970 ms
NGU-Sharp Very High: 37.100 ms
These are very weird benchmark results, to say the least. We know that NGU doesn't run as well as it should on AMD Polaris GPUs. But FSRCNN(X) running (ever so slightly) faster on my AMD 560 than on my Nvidia 1070 is just plain weird.
Looks great. I really hope NGU can also be used in AviSynth. And could NGU be used for deinterlacing, to replace nnedi3 (QTGMC)?

Last edited by edcrfv94; 18th December 2017 at 11:06.
edcrfv94 is offline   Reply With Quote
Old 18th December 2017, 09:55   #47794  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,406
Not yet, NGU is currently only in madVR.
__________________
madVR options explained
Asmodian is offline   Reply With Quote
Old 18th December 2017, 10:26   #47795  |  Link
petran79
Registered User
 
Join Date: Aug 2007
Posts: 87
On W10 there is suddenly tearing in full screen mode, even with exclusive mode enabled. This doesn't happen with EVR (PotPlayer) or VLC.
If I disable the DX11 renderer in madVR and switch to fullscreen, the screen goes black with the sound still playing in the background. The player freezes and I have to kill the app via Task Manager.
Using an Nvidia GTX 1060 and a G-Sync monitor.

I even did a clean reinstall of the drivers.
petran79 is offline   Reply With Quote
Old 18th December 2017, 13:09   #47796  |  Link
mclingo
Registered User
 
Join Date: Aug 2016
Posts: 1,348
Those of you managing to get 10/12 bit with full RGB 4:4:4: are you using monitors rather than TVs/AVRs over HDMI? When I choose 4:4:4 full RGB I lose both the 12 bit and 10 bit options in my AMD control panel, which suggests it's just not possible to get 10/12 bit full RGB within current HDMI bandwidth limits.
mclingo is offline   Reply With Quote
Old 18th December 2017, 14:51   #47797  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 942
Quote:
Originally Posted by Oguignant View Post
Hi guys, question... Is there any way that when Madvr automatically changes to 23hz also change the bit depth to 12 bits? with the old Nvidia drivers I could do it. Is there any other software that I can use in Windows 10 and combine it with Madvr or run independently from Madvr?

Thks
With Windows 10 you need to use driver 385.28. Select 4K23p, 12bits, RGB full, 4:4:4, and apply. Then, if the content goes above 30p, the driver will automatically switch to 4K60p 8bits 4:4:4, which is what you want. It doesn't work the other way around.

Quote:
Originally Posted by mclingo View Post
People managing to get 10bit / 12 bit on FULL RGB 4:4:4, are you using monitors rather than TV/HDMI/ Receivers?, when I choose 4:4:4 full RGB I lose both the 12 bit and 10 bit options in my AMD control Panel suggesting its just not possible to get 10/12 BIT FULL RGB on current HDMI specs.
No idea about AMD; for nVidia see above. I'm using a JVC projector and have no issue getting what's in my sig, at least at 23-30p.
__________________
Win11 Pro x64 b23H2
Ryzen 5950X@4.5Ghz 32Gb@3600 Zotac 3090 24Gb 551.33
madVR/LAV/jRiver/MyMovies/CMC
Denon X8500HA>HD Fury VRRoom>TCL 55C805K
Manni is online now   Reply With Quote
Old 18th December 2017, 15:38   #47798  |  Link
mclingo
Registered User
 
Join Date: Aug 2016
Posts: 1,348
Quote:
Originally Posted by Manni View Post
With Windows 10 you need to use driver 385.28. Select 4K23p, 12bits, RGB full, 4:4:4, and apply. Then, if the content goes above 30p, the driver will automatically switch to 4K60p 8bits 4:4:4, which is what you want. It doesn't work the other way around.



No idea about AMD, for nVidia see above. I'm using a JVC projector and have no issue to get what's in my sig, at least at 23-30p.
My TV is a 10 bit panel, but I can only select 10bit in 4:2:0 mode; when I put my TV in 4:4:4 full RGB, 10bit is missing.

I've not tried it for a while, though, and there have been a few firmware updates since, so I might try it again.
mclingo is offline   Reply With Quote
Old 18th December 2017, 16:09   #47799  |  Link
Clammerz
Registered User
 
Join Date: Aug 2005
Posts: 54
Quote:
Originally Posted by Manni View Post
I suggest we discuss this again when you have been able to compare 8bits vs 10bits output on native 10/12bits panels. Until then, I'm out.
I don't think this topic was particularly relevant for you since you know exactly what your video pipeline is doing.

By "I don't want to dither if I don't have to", I assume you mean "dither further than I have to", since you'll need to dither down to 10bit from MadVR's higher internal representation.
But it's those kinds of wordings that made me bring this topic up in the first place. If someone were to take "10bits, great, just like my input" as the take-away message, they could be dithering more than needed because something in their pipeline downconverted along the way.

It was more a cautionary mention than an "everyone needs to stop doing this", but I think it got a little muddled along the way. Oops.
Clammerz is offline   Reply With Quote
Old 18th December 2017, 16:48   #47800  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 942
Quote:
Originally Posted by mclingo View Post
My TV is a 10 bit panel, but I can only select 10bit in 4:2:0 mode; when I put my TV in 4:4:4 full RGB, 10bit is missing.

I've not tried it for a while, though, and there have been a few firmware updates since, so I might try it again.
Your TV might have a 10.2 Gbps max HDMI bandwidth, or your cables might not be good enough to pass through more than that. Make sure you select 23p when trying any chroma/bit depth combination, to make sure you don't cause a bandwidth issue (or at least to limit it).
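If you want to sanity-check the bandwidth side, here's a back-of-the-envelope Python sketch (my own numbers, assuming standard CTA-861 4K timings and HDMI's 10-bits-on-the-wire TMDS encoding):

Code:
# 3 TMDS data channels, 10 wire bits per 8-bit symbol, and for RGB/4:4:4
# the pixel clock scales with bit depth. CTA-861 4K timings: ~594 MHz at
# 60p, ~297 MHz at 23/24p.
def tmds_gbps(pixel_clock_mhz, bits_per_component):
    return pixel_clock_mhz * (bits_per_component / 8) * 10 * 3 / 1000

for bpc in (8, 10, 12):
    print(f"4K60 RGB {bpc:2d} bit: {tmds_gbps(594, bpc):5.2f} Gbps")
    print(f"4K24 RGB {bpc:2d} bit: {tmds_gbps(297, bpc):5.2f} Gbps")

# 4K60 RGB fits HDMI 2.0's 18 Gbps only at 8 bit (17.82 Gbps), and on a
# 10.2 Gbps (HDMI 1.4 class) link even 4K24 RGB above 8 bit won't fit.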
__________________
Win11 Pro x64 b23H2
Ryzen 5950X@4.5Ghz 32Gb@3600 Zotac 3090 24Gb 551.33
madVR/LAV/jRiver/MyMovies/CMC
Denon X8500HA>HD Fury VRRoom>TCL 55C805K
Manni is online now   Reply With Quote