Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion. Before you start posting please read the forum rules. By posting to this forum you agree to abide by the rules.
1st October 2020, 21:24 | #60261
Registered User
Join Date: Dec 2018
Posts: 207
Just for information (AMD).
Recently I noticed in the OS HDR mode the metadata became correct.
Pic
__________________
R3 3200G / Vega8 / Samsung UE40NU7100 Win11Pro 21H2 / 4K RGB 59Hz / AMD last driver MPC-HC 1.9.17 / madVR 0.92.17 / FSW / SM / 8bit
2nd October 2020, 01:45 | #60262
Registered User
Join Date: May 2013
Posts: 714
Quote:
On all the Roku-platform TVs I've tested with test patterns, 10-bit @ 23.976 Hz always produced a smoother gradient than the 8-bit modes. <setting is 12-bit mode in the NV panel, 10-bit DX11 surface in madVR> My Samsung TV won't accept 10-bit 4:4:4 from a PC, so no go there. I've also got an older 10-bit 24" IPS U2410, which likewise shows a smoother gradient when set to 10-bit mode.
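For anyone who wants to reproduce this kind of comparison, here is a minimal sketch (Python with NumPy; an illustration of the idea, not the actual pattern any test disc uses) that builds a full-range horizontal gray ramp at a chosen bit depth. Across 3840 pixels a 10-bit ramp gets 1024 distinct steps, while an 8-bit ramp gets only 256, which is where the visible stripes come from:

```python
import numpy as np

def gradient_ramp(width=3840, height=100, bits=10):
    """Horizontal gray ramp using the full code range of `bits`."""
    levels = 2 ** bits
    # Map each x position to a code value 0..levels-1.
    ramp = np.floor(np.linspace(0, levels - 1e-6, width)).astype(np.uint16)
    return np.tile(ramp, (height, 1))

ten = gradient_ramp(bits=10)    # 1024 distinct steps across 3840 px
eight = gradient_ramp(bits=8)   # 256 distinct steps -> visible banding
```

Save the 10-bit version as a 16-bit TIF/PNG (shifted up by 6 bits) if your viewer needs a standard container.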
__________________
Ghetto | 2500k 5Ghz
2nd October 2020, 12:08 | #60263
Registered User
Join Date: Oct 2016
Posts: 896
Quote:
Did you try fiddling with the GPU settings? Exclusive mode behaves like a game, so some GPU settings may influence it: double/triple buffering, enhanced/fast sync, that kind of thing. I don't game much, so I don't know all of them. Maybe create a system restore checkpoint to go back to, then try resetting the graphics driver settings?
__________________
HTPC: Windows 10 22H2, MediaPortal 1, LAV Filters/ReClock/madVR. DVB-C TV, Panasonic GT60, Denon 2310, Core 2 Duo E7400 oc'd, GeForce 1050 Ti 536.40
2nd October 2020, 15:31 | #60264
Registered User
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,407
Quote:
Make sure linear light dithering is enabled.
__________________
madVR options explained
2nd October 2020, 16:20 | #60265
Registered User
Join Date: Jan 2013
Posts: 8
Quote:
3rd October 2020, 00:45 | #60266
Registered User
Join Date: May 2013
Posts: 714
|
Quote:
Mode 1: madVR dithering off, all-8-bit pipe (Nvidia + madVR). The 10-bit gradient portion of the test pattern, across 3840 pixels, shows choppy 8-bit stripes.

Mode 2: madVR dithering off, 12-bit NV, 10-bit madVR, 4K @ 23.976 Hz. The test pattern looks smooth, with slight segmentation visible.

Mode 3: dithering on (error diffusion), 8-bit pipe (Nvidia + madVR). The gradient IS SMOOTH; however, near the dark tones I can still see slight segmentation, though the edges are softer. It's smoother than Mode 2, so madVR's dithering is indeed quite good.

Mode 4: dithering on (error diffusion), 12-bit NV, 10-bit madVR, 4K @ 23.976 Hz. I can just barely make out the segmentation. In Mode 4, shadow detail and near-black transitions are much better. I noticed the film grain also looked more pronounced; I can't explain this, as I would've thought the dithering would kill that off.
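madVR's own error-diffusion dithering is closed source, so purely as an illustration of why the dithered modes look smoother than Mode 1: below is a generic Floyd-Steinberg error-diffusion sketch (my own code, not madVR's algorithm) that quantizes 10-bit values to 8-bit. Plain truncation maps 4 adjacent 10-bit codes onto one 8-bit code (banding); error diffusion pushes the rounding error into neighbouring pixels, so the average level is preserved and the hard steps break up into fine noise.

```python
import numpy as np

def fs_dither_10_to_8(img10):
    """Quantize 10-bit codes (0-1023) to 8-bit with Floyd-Steinberg
    error diffusion instead of plain truncation."""
    h, w = img10.shape
    buf = img10.astype(np.float64)          # working copy carries the error
    out = np.zeros((h, w), dtype=np.uint8)
    for y in range(h):
        for x in range(w):
            old = buf[y, x]
            new = min(255, max(0, int(old / 4.0 + 0.5)))  # nearest 8-bit code
            out[y, x] = new
            err = old - new * 4.0           # residual, in 10-bit units
            # Push the residual onto unprocessed neighbours (7/16, 3/16, 5/16, 1/16).
            if x + 1 < w:
                buf[y, x + 1] += err * 7 / 16
            if y + 1 < h:
                if x > 0:
                    buf[y + 1, x - 1] += err * 3 / 16
                buf[y + 1, x] += err * 5 / 16
                if x + 1 < w:
                    buf[y + 1, x + 1] += err * 1 / 16
    return out
```

Feeding a flat 10-bit field of 514 (which sits exactly between the 8-bit codes 128 and 129) through this produces a mix of 128s and 129s whose average is still ~514, whereas truncation (`img // 4`) gives one flat band.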
__________________
Ghetto | 2500k 5Ghz
Last edited by tp4tissue; 3rd October 2020 at 00:47.
3rd October 2020, 18:19 | #60267
Registered User
Join Date: Dec 2016
Posts: 212
I can see it's all about bits around here. So I'll jump right in.
I got myself a Lenovo D32q-20 monitor with a 10-bit panel (8-bit + FRC). However, I'm not offered 10-bit color depth by either graphics card: neither the RX 580's Radeon Software nor the GTX 960's Nvidia panel. And yet, when I play a 10-bit gradient test (both MP4 and TIF) through madVR, I get very smooth gradation in D3D11 fullscreen on the RX 580 (both windowed and exclusive) with madVR's dithering off, while I do see striping when playback is not fullscreen. Does that mean the 10-bit pipeline is somehow passed through to the monitor, bypassing the Windows bit-depth settings, or is there some hardware dithering down to 8 bit by the GPU? I vaguely remember there was some talk about AMD doing dithering by itself. I'm yet to test the GTX 960, but some thoughts on the matter would be nice in the meantime.

By the way, did madshi really abandon madVR? What's he doing now, some hardware implementation of it?
3rd October 2020, 21:02 | #60268
Registered User
Join Date: May 2013
Posts: 714
In non-fullscreen mode, it goes to the 8-bit pipe, for example when you pop the menu up or when you right-click.
Hit Ctrl+J and you can see it switch in and out on the top-left indicator. This has nothing to do with AMD. AMD's advantage is that it dithers when 10-bit is enabled, which means it will be a bit smoother than Nvidia. This is especially true if you have a color-correction gamma table loaded. AMD will also dither in 8-bit mode if you have a table loaded. But this is not useful for madVR, because if you have a colorimeter, madVR can take care of the entire color pipe more reliably through its 3DLUT color correction support.
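Since the thread keeps contrasting gamma tables with madVR's 3DLUTs, here's a toy sketch of the difference (my own illustration, not madVR or AMD code): a 1D LUT remaps each channel independently, so it can only fix gray/gamma tracking, while a 3D LUT looks up the full RGB triplet, so it can also correct hue and saturation cross-talk. Real renderers use trilinear or tetrahedral interpolation on the LUT lattice; nearest-neighbour is used here for brevity.

```python
import numpy as np

def gamma_1dlut(bits=8, gamma=2.2):
    """Per-channel 1D LUT: each input code maps to one output code,
    independently of the other two channels."""
    levels = 2 ** bits
    x = np.arange(levels) / (levels - 1)
    return np.round((x ** (1 / gamma)) * (levels - 1)).astype(np.uint8)

def apply_1dlut(rgb, lut):
    """Fancy indexing applies the same curve to R, G and B alike."""
    return lut[rgb]

def apply_3dlut_nearest(rgb, lut3d):
    """3D LUT: output depends on the whole (R,G,B) triplet.
    Nearest-neighbour lookup for brevity; real renderers interpolate."""
    n = lut3d.shape[0]                               # lattice size, e.g. 65
    idx = np.round(rgb / 255.0 * (n - 1)).astype(int)
    return lut3d[idx[..., 0], idx[..., 1], idx[..., 2]]
```

An identity 3D LUT (each lattice point mapping to its own coordinates) passes pixels through unchanged, which is a handy sanity check before loading a DisplayCAL-generated cube.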
__________________
Ghetto | 2500k 5Ghz
Last edited by tp4tissue; 3rd October 2020 at 21:10.
3rd October 2020, 22:37 | #60269
Registered User
Join Date: Dec 2016
Posts: 212
@tp4tissue: I must admit I'm not sure I follow you about the gamma-correction color table, but I do have a 3D LUT loaded that I created with DisplayCAL. Gamma table = 1D LUT?
I have just added the "DP_DisableDither=1" value to the AMD driver section of the Windows registry, and I now have banding/striping in fullscreen mode as well... so it was not a full 10-bit file-to-monitor pipeline after all, but AMD's default dithering. Why the hell does this monitor have a 10-bit panel when it can't take a 10-bit signal?
Last edited by mytbyte; 4th October 2020 at 00:35.
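For reference, the usual way people toggle this is a .reg file. A sketch with loud caveats: the class GUID below is the standard Windows display-adapter class, but the 0000 subkey index varies per system (it may be 0001, 0002, ...), and the DWORD type is my assumption. Back up the key first, and delete the value again to restore AMD's default dithering.

```reg
Windows Registry Editor Version 5.00

; Path is partly an assumption: pick the 00xx subkey that matches your AMD GPU
; (check its DriverDesc value first). Reboot or reload the driver afterwards.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"DP_DisableDither"=dword:00000001
```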
4th October 2020, 00:34 | #60271
Registered User
Join Date: Dec 2016
Posts: 212
Quote:
What specific madVR settings would you like to know? The monitor property is set to "10 bit or higher", with "enable automatic fullscreen exclusive mode" on (though it's equally smooth with windowed fullscreen), D3D11 for presentation, and no dithering. It seems AMD's dithering activates in fullscreen only.
5th October 2020, 20:24 | #60272
Registered User
Join Date: May 2013
Posts: 714
Quote:
We need another app that supports a 10-bit pipe to test this. I don't think AMD works with 10-bit OpenGL in Photoshop. Hrrrm...
__________________
Ghetto | 2500k 5Ghz
5th October 2020, 21:46 | #60273
Registered User
Join Date: Dec 2016
Posts: 212
Quote:
You are right - even if the monitor were truly 10-bit input-to-output, I wouldn't be able to achieve 10 bit with a consumer AMD GPU. And the GTX 960 (Maxwell, prior to Pascal) is excluded from the new Nvidia Studio driver's support for 10-bit OpenGL, which most productivity apps use.
Last edited by mytbyte; 5th October 2020 at 21:55.
5th October 2020, 22:30 | #60274
Registered User
Join Date: May 2016
Location: Long Beach, CA, USA
Posts: 620
Just to add to this: Nvidia drivers do not dither when using RGB full. They dither with RGB limited and YCbCr.
__________________
Henry | LG OLED65C7P | Denon AVR-X3500H | ELAC Uni-Fi x7 | ELAC Debut 2.0 SUB3030 x2 | NVIDIA SHIELD TV Pro 2019 | Plex
6th October 2020, 09:50 | #60275
Registered User
Join Date: Dec 2016
Posts: 212
Quote:
And now another thing: I just updated the GTX 960 machine to the Windows 2004 build, and 10-bit and 12-bit options magically appeared, but only for HDMI YCbCr 4:2:2 (DP is still 8-bit only). I can't absolutely swear it wasn't there before, but I believe I had tried everything, and I don't believe I missed it. The AMD machine didn't get the 2004 update and is still oblivious to 10 bit in any form.
6th October 2020, 12:30 | #60276
QB the Slayer
Join Date: Feb 2011
Location: Toronto
Posts: 697
Just a heads up and an FYI... but CRU will also limit output to 8-bit only.
Edit: Actually, I went back and read my post about custom resolutions (https://forum.doom9.org/showthread.p...31#post1895131), and I guess it would be the lack of HDMI standard timings that could be limiting output to 8-bit... So I will change my above statement slightly: the EDID can play a big role in the bit depth of the output.

Edit 2: @mytbyte, maybe you can go backwards from what I was attempting back then and add those standard timings back in, if they are missing from your EDID...

QB
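To check what the EDID actually advertises, the deep-color flags live in the HDMI (1.4) Vendor-Specific Data Block of the CEA-861 extension block. Here's a rough Python sketch of that lookup, per my reading of the spec; note it ignores the newer HDMI Forum VSDB that HDMI 2.x sinks use for 4K deep color, so treat it as a starting point. You can feed it the second 128-byte block of an EDID dump exported from CRU or similar.

```python
def hdmi_deep_color(cea_block):
    """Scan a 128-byte CEA-861 extension block for the HDMI 1.4
    Vendor-Specific Data Block and report its Deep Color flags."""
    if cea_block[0] != 0x02:
        return None                      # not a CEA-861 extension block
    dtd_start = cea_block[2]             # data blocks end where the DTDs begin
    i = 4
    while i < dtd_start:
        tag, length = cea_block[i] >> 5, cea_block[i] & 0x1F
        payload = bytes(cea_block[i + 1 : i + 1 + length])
        # Vendor-specific block (tag 3) with the HDMI 1.4 OUI 00-0C-03
        # (stored little-endian in the EDID).
        if tag == 3 and length >= 6 and payload[:3] == b"\x03\x0c\x00":
            flags = payload[5]           # VSDB byte 6: deep-color support bits
            return {
                "DC_30bit": bool(flags & 0x10),
                "DC_36bit": bool(flags & 0x20),
                "DC_48bit": bool(flags & 0x40),
                "DC_Y444": bool(flags & 0x08),
            }
        i += 1 + length
    return None                          # no HDMI VSDB found
```

If this returns None, or all the DC_* flags are False, the sink is telling the GPU it is 8-bit-only over HDMI, which would explain the missing 10/12-bit options in the driver panel.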
__________________
Last edited by QBhd; 6th October 2020 at 12:42.
6th October 2020, 14:29 | #60277
Registered User
Join Date: Nov 2017
Posts: 69
I am looking to get a Lenovo Yoga S740 15, with a GeForce GTX 1650 with Max-Q graphics card.
I was wondering what this card is capable of, using madVR and MPC-HC, when scaling 480p and 720p to its own 1080p screen. Will I get decent quality from this material, or shouldn't I buy a machine with this card? Thank you in advance!
6th October 2020, 16:22 | #60279
Registered User
Join Date: Dec 2016
Posts: 212
Quote:
What would be the freaking reason for the monitor not advertising 10- and 12-bit support in its EDID?
Last edited by mytbyte; 6th October 2020 at 16:34.
6th October 2020, 19:01 | #60280
Registered User
Join Date: Mar 2002
Posts: 2,323
Maybe it's only 8-bit? Can you link to where you saw that it's 10-bit? And post some panel IDs as well, e.g. from madVR.
__________________
Ryzen 5 2600,Asus Prime b450-Plus,16GB,MSI GTX 1060 Gaming X 6GB(v398.18),Win10 LTSC 1809,MPC-BEx64+LAV+MadVR,Yamaha RX-A870,LG OLED77G2(2160p@23/24/25/29/30/50/59/60Hz) | madvr config
Tags: direct compute, dithering, error diffusion, madvr, ngu, nnedi3, quality, renderer, scaling, uhd upscaling, upsampling