Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion. Before you start posting please read the forum rules. By posting to this forum you agree to abide by the rules. |
22nd November 2020, 19:52 | #60741 | Link | ||
Registered User
Join Date: Oct 2016
Posts: 896
|
If Alec246 is using a 1440p display and playing back 2160p videos, then madVR should only be doing chroma upscaling followed by image downscaling, and in that case it's better to lower the chroma upscaling quality a bit than to choose a lower-quality downscaling algorithm.
Quote:
Quote:
So if I understand correctly, you're saying that by outputting HDR as if it were SDR and then manually forcing the mode on your display, you don't have crushed blacks anymore? As huhn said, if your projector is running in HDR mode it has to tonemap; it's just that its tonemapping is most probably of lower quality than madVR's.
__________________
HTPC: Windows 10 22H2, MediaPortal 1, LAV Filters/ReClock/madVR. DVB-C TV, Panasonic GT60, Denon 2310, Core 2 Duo E7400 oc'd, GeForce 1050 Ti 536.40 Last edited by el Filou; 22nd November 2020 at 20:07. |
||
22nd November 2020, 20:20 | #60742 | Link | |
Registered User
Join Date: Dec 2012
Posts: 3
|
Quote:
Sorry, I forgot to show you my settings. |
|
22nd November 2020, 20:43 | #60743 | Link | |
Registered User
Join Date: Oct 2008
Posts: 168
|
Quote:
When the video was in fullscreen windowed 10-bit mode, the blacks were crushed, and I had to adjust my madVR output range to 19 for black instead of the normal 16 for limited. However, if I Alt+Enter to make the video windowed, it switches to 8-bit windowed and then shows all the black bars. So the black crush is happening in fullscreen windowed 10-bit mode. Does that help explain anything? Also a question on madVR tonemapping and HDR output: I remember Asmodian saying that he used madVR to tonemap, but then output the video as HDR for his screen. Why would you do this? To take advantage of the full BT.2020 (or DCI-P3) color gamut? |
|
22nd November 2020, 21:32 | #60744 | Link |
Registered User
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,407
|
Don't use 10 bit.
I usually use passthrough, but the reason to tone map HDR is to use madVR's tone mapping instead of my TV's. Both methods take advantage of the full color gamut, but madVR's is customizable and potentially better. If you use madVR's tone mapping but do not output HDR, the display will stay in SDR mode; on a projector this is fine, but with an OLED TV, switching to HDR mode offers a much better HDR image.
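The highlight-compression idea behind tone mapping can be sketched in a few lines of Python. This is a toy Reinhard-style soft clip, not madVR's actual algorithm, and the 4000/700-nit peaks are just example numbers:

```python
def tone_map(nits, peak_src=4000.0, peak_dst=700.0, knee=0.75):
    """Roll highlights above a knee point off into the display's range.

    Below knee*peak_dst the curve is the identity; above it, source
    values are compressed asymptotically toward peak_dst.  This is a
    simple soft clip for illustration only, not madVR's curve.
    """
    start = knee * peak_dst
    if nits <= start:
        return nits
    # Normalized distance above the knee: 0 at the knee, 1 at source peak.
    x = (nits - start) / (peak_src - start)
    # x/(1+x) reaches 0.5 at x=1, so scale by 2 to hit peak_dst exactly.
    return start + (peak_dst - start) * 2.0 * x / (1.0 + x)

print(tone_map(100.0))   # dark/mid tones pass through unchanged: 100.0
print(tone_map(4000.0))  # the source peak maps to the display peak: 700.0
```

The key property is the one described above: everything below the knee is left alone so most of the scene keeps its contrast, and only the bright highlights get squeezed.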
__________________
madVR options explained Last edited by Asmodian; 22nd November 2020 at 21:45. |
22nd November 2020, 21:52 | #60745 | Link | |
Registered User
Join Date: Oct 2012
Posts: 7,925
|
Quote:
My point is more about how HDR works: even a 1500-nit display is tone mapping, and even a 4000-nit display is tone mapping. So the question is not if something is tone mapped, it is who is doing the tone mapping. Just send it to the TV; there is nothing wrong with a 1000+ nit display. It has nothing to do with color space and nothing to do with levels. |
|
22nd November 2020, 22:00 | #60746 | Link | |
Registered User
Join Date: Oct 2008
Posts: 168
|
Quote:
If I am watching HDR10 (BT.2020) content on an SDR (BT.709) display, how am I taking advantage of the full color gamut? Man, I hate being a noob! hahaha |
|
22nd November 2020, 22:13 | #60747 | Link | |
Registered User
Join Date: Dec 2012
Posts: 3
|
Quote:
For chroma upscaling, should I try to keep NGU? And I'm trying SSIM 2D for downscaling, but I may have to go back to 1D. |
|
22nd November 2020, 22:36 | #60749 | Link |
Registered User
Join Date: Oct 2018
Posts: 324
|
The upscaling settings are not used in your config. Try Bicubic60 to downscale; I doubt that you can tell the difference.
__________________
AviSynth AiUpscale |
22nd November 2020, 22:51 | #60750 | Link | |
Registered User
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,407
|
Set your display to 8 bit, under devices->[display]->properties
madVR's dithering is very good; even ordered dithering, which is very fast, outputs great quality 8-bit video from HDR10 content. The high bit depth is important for the compressed source video (the compression does not preserve dithering) but not for the full-bandwidth output of madVR. I prefer both "use colored noise" and "change dither for every frame" disabled; I especially do not like "use colored noise". Quote:
Color gamut is exactly which colors the primary colors (red, green, blue) are. They are a property of your display, and you need to convert the source to be correct on your display. BT.709's max red is a less saturated red than DCI-P3's max red, but with a good DCI-P3 to BT.709 conversion all but the most saturated colors will look the same.

HDR is a different and more dynamic way to do what was done by SDR's gamma value: map the digital values to actual pixel brightness. Because humans see differences in dark shades much more significantly (our visual response is logarithmic), we change the brightness of the display a lot less between (16,16,16) and (17,17,17) than between (128,128,128) and (129,129,129). HDR allows this map to be dynamic because the display has more information about the content, so it can convert the brightness range of the master into the range the display can actually present. The point is basically to be able to compress the very bright highlights, like point light sources, a lot, while keeping the normal brightness range uncompressed so most of the scene doesn't look low contrast. This assumes the brightness of the mastering display is higher than that of the consumer display, but it is possible to use the HDR metadata to tone map to a brighter display as well.

Everyone was a noob at some point. Try enabling the trade quality for performance option "scale chroma separately, if it saves performance". In the right situation it can save a lot of performance, arguably without a quality hit (the output is slightly different but not necessarily lower quality).
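The gamut-conversion step described above is, at its core, a 3x3 matrix multiply in linear light. Here is a minimal Python sketch; the matrix is the standard DCI-P3 (D65) to BT.709 conversion with rounded coefficients (treat the exact values as approximate), and this is an illustration of the math, not madVR's implementation:

```python
# Approximate linear-light DCI-P3 (D65) -> BT.709 matrix (rounded values,
# derived via the CIE XYZ interchange space).
P3_TO_709 = [
    [ 1.2249, -0.2247, -0.0001],
    [-0.0420,  1.0419,  0.0001],
    [-0.0197, -0.0786,  1.0979],
]

def p3_to_709(rgb):
    """Convert a linear P3 RGB triple to linear BT.709 (no gamut mapping)."""
    return [sum(m * c for m, c in zip(row, rgb)) for row in P3_TO_709]

# A fully saturated P3 red falls outside BT.709: the negative green and
# blue components cannot be displayed and must be clipped or compressed.
print(p3_to_709([1.0, 0.0, 0.0]))  # ≈ [1.2249, -0.0420, -0.0197]
# White is (nearly) unchanged, and desaturated colors convert cleanly.
print(p3_to_709([1.0, 1.0, 1.0]))  # ≈ [1.0, 1.0, 1.0]
```

This is why "all but the most saturated colors will look the same": only colors that land outside the [0, 1] cube after the matrix need any lossy gamut mapping at all.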
__________________
madVR options explained Last edited by Asmodian; 22nd November 2020 at 22:58. |
|
22nd November 2020, 23:16 | #60751 | Link |
Registered User
Join Date: Jul 2016
Posts: 171
|
Hi all,
I just upgraded my GTX 1070 to the new RTX 3070. I've been using madVR for many years without any issue (SDR or HDR). My HTPC is an i7 7700K, 16 GB RAM, NVMe drive, Windows 10 up to date, latest Nvidia drivers, MPC-BE 64-bit. My TV is an LG OLED C8. I'm having a very strange issue with SDR content only: the motion is not good at all, it feels like I'm dropping frames, but the madVR OSD does not report any dropped (or repeated) frames. What is strange is that with HDR10 passthrough, playback is perfectly smooth. It only happens with SDR content; how is that possible? Playback is also perfectly fluid using EVR (MPC) or VLC. Does anyone have an idea why? Everything worked fine on my GTX 1070. I tried:
- reinstalling the driver many times
- reinstalling madVR
- different versions of madVR
- an older driver (there are only 2 drivers yet)
- MPC-HC = same issue
- GPU usage does not go over 50%
- it happens with 2160p SDR content as well (no upscaling)
- I think it's worse with NGU, but I still notice it with Jinc and Spline
- again, with HDR passthrough, no issue at all...
OSD of files playing: * HDR passthrough gives a black screenshot, but the OSD is accurate. Settings: Thanks for any help. Last edited by imhh11; 22nd November 2020 at 23:26. |
22nd November 2020, 23:35 | #60752 | Link | |
Registered User
Join Date: Oct 2008
Posts: 168
|
Quote:
So if I am using the 12-bit setting on the Nvidia output, and full windowed D3D11 shows 10 bit, that would mean that I am NOT using dithering, correct? Just straight 10-bit output? I have seen through random web searches that people say 10 bit can show more banding than 8 bit with dithering. Is that correct? Or is it source dependent? I am using my reference movie LUCY to see the differences between the settings I select. So, with 12-bit RGB, the movie shows 10-bit D3D11 in full screen (which has proper blacks now; don't ask, I have no idea what I clicked!)... but when I go down to the movie timer scroll bar and hover the mouse there, Ctrl+J shows that the movie changes to 8-bit windowed. I notice no difference between the switch. That being said, should I stick with the 12-bit setting, thereby getting true 10-bit fullscreen windowed? Is there a good 4K HDR banding test video I could use to see the difference between the modes? |
|
23rd November 2020, 00:56 | #60753 | Link |
Registered User
Join Date: May 2013
Posts: 712
|
In order to take advantage of the 12-bit setting, your display must actually support 10-bit input and have an appropriate FRC engine to deal with it.
A lot of displays just fake it even though they list it on the box. They'll write 10-bit on the box, but the display itself just discards the data and doesn't dither.

To test it: set 10 or 12 bit in the Nvidia panel, set 10 bit in madVR, and set a hotkey in madVR to toggle madVR's dithering. Play this test pattern. It must be full screen, with the mouse cursor not on top of the menu, and the surface must say D3D11 fullscreen windowed (10 bit). Then toggle the dithering on and off.

With dithering disabled, the top gradient should be much smoother than the bottom gradient. With dithering enabled, the top gradient should change yet again and become even smoother. If the top gradient looks very similar to the bottom one in both cases, with big blotches, either the screen is not 10-bit or it is just crappy.

With the cursor on the bottom seekbar/buttons popped up, the application will switch to 8-bit mode. In this mode, with dithering disabled, the top gradient should look almost like the bottom gradient: blocky. With it enabled, the top should be smooth or smoother.

It's also possible that the display has 10-bit processing but that it is disabled in 4:4:4 chroma / PC modes. You can run the test in different modes, but you must also disable the smooth gradient filter on the TV, otherwise that could fool you. https://drive.google.com/file/d/0B68...UwTFdTNFE/view

Gradient performance is a combination of many factors, and it may be difficult to tease out what's not right, but this should get you started.
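The banding-versus-dithering effect this test exposes can also be illustrated numerically. The toy Python sketch below quantizes a 10-bit ramp to 8 bits with and without random dithering; it is only a model of the idea (madVR uses ordered dithering or error diffusion, which look far better than plain random noise):

```python
import random

def ramp10(width=1024):
    """A horizontal grayscale ramp in 10-bit code values (0..1023)."""
    return [i * 1023 / (width - 1) for i in range(width)]

def to_8bit(values10, dither=False, seed=0):
    """Quantize 10-bit values to 8 bits, optionally with random dithering.

    Without dithering, four adjacent 10-bit codes collapse into one 8-bit
    step, which shows up as banding in a gradient.  Adding +/-0.5 of noise
    before rounding turns that error into fine noise while preserving the
    average level.
    """
    rng = random.Random(seed)
    out = []
    for v in values10:
        v8 = v / 4.0  # rescale 0..1023 -> 0..255.75
        if dither:
            v8 += rng.uniform(-0.5, 0.5)
        out.append(min(255, max(0, round(v8))))
    return out

banded = to_8bit(ramp10())                  # stair-steps: visible banding
dithered = to_8bit(ramp10(), dither=True)   # noisy, but smooth on average
```

Viewed from a distance (or through the eye's own averaging), the dithered ramp reads as a smooth gradient even though each pixel is still only 8-bit, which is why well-dithered 8-bit output can look better than undithered 10-bit.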
__________________
Ghetto | 2500k 5Ghz Last edited by tp4tissue; 23rd November 2020 at 01:48. |
23rd November 2020, 01:09 | #60754 | Link |
Registered User
Join Date: May 2016
Location: Long Beach, CA, USA
Posts: 620
|
The video driver also dithers. I don't know about AMD, but Nvidia dithers everything except RGB full.
__________________
Henry | LG OLED65C7P | Denon AVR-X3500H | ELAC Uni-Fi x7 | ELAC Debut 2.0 SUB3030 x2 | NVIDIA SHIELD TV Pro 2019 | Plex |
23rd November 2020, 01:53 | #60758 | Link | |
Registered User
Join Date: May 2013
Posts: 712
|
Quote:
@Asmodian, do you know how madVR works with the primaries during its tone mapping? Does it apply a transfer function to retarget? For example, the container is Rec. 2020 and the primaries reported by the video are Rec. 2020: does madVR do gamut compression by bending the axis, or does it just send out the vanilla values along an assumed straight line towards the primaries? The thing that confuses me is that IF it doesn't bend the axis, then it MUST require a Rec. 2020 LUT, no? Without one, it would send colors towards the Rec. 709 or P3 primaries of the display device, and that would just be wrong. Red and blue might look pretty close, but green and magenta would be pretty off.
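Setting madVR's internals aside, the arithmetic behind the question can be sketched. Using the standard linear-light BT.2020 to BT.709 matrix (coefficients rounded, so treat them as approximate), a saturated BT.2020 green asks for values a BT.709 display cannot produce, which is exactly why some step (a LUT, a clip, or gamut compression) has to decide what to do:

```python
# Approximate linear-light BT.2020 -> BT.709 matrix (rounded standard values).
BT2020_TO_709 = [
    [ 1.6605, -0.5876, -0.0728],
    [-0.1246,  1.1329, -0.0083],
    [-0.0182, -0.1006,  1.1187],
]

def bt2020_to_709(rgb):
    """Proper linear-light conversion (before any gamut mapping)."""
    return [sum(m * c for m, c in zip(row, rgb)) for row in BT2020_TO_709]

# "Vanilla values along a straight line" = no conversion at all: a pure
# BT.2020 green (0,1,0) displayed as-is on a BT.709 screen renders the
# BT.709 green primary, a visibly different, less saturated color.
# The correct conversion instead asks for an impossible BT.709 color:
print(bt2020_to_709([0.0, 1.0, 0.0]))  # ≈ [-0.5876, 1.1329, -0.1006]
```

The negative and greater-than-one components are the out-of-gamut overshoot: clipping them is the "straight line" option, while bending them back smoothly is gamut compression.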
__________________
Ghetto | 2500k 5Ghz Last edited by tp4tissue; 23rd November 2020 at 02:07. |
|
23rd November 2020, 02:04 | #60759 | Link | |
Registered User
Join Date: Oct 2008
Posts: 168
|
Quote:
So I would like to do any testing with the settings I will be using on a daily basis. Also, my system normally boots at 23.976 Hz, and madVR changes the refresh rate as needed. |
|