Old 23rd November 2019, 09:02   #57841  |  Link
tyguy
Registered User
 
 
Join Date: Oct 2019
Posts: 63
Quote:
Originally Posted by Asmodian View Post
Using the default HDMI input the TV will convert to YCbCr 4:2:2 internally. This blurs the color information horizontally. Both color planes are resampled to 1920x2160 by the TV. This does eliminate banding with 8 or 10 bit RGB input but the image is blurred. I hope you are using very low power chroma scaling options because anything better is pointless.
No issues except that it is totally pointless: why set the Nvidia drivers to anything but 8 bit? There is absolutely zero improvement in quality on an LG 7/8/9-series when sending it 10 bit instead of 8 bit with any HDMI input mode, and 10 bit is worse quality in PC or Game mode.



Ideally you want to send an LG C9 8 bit YCbCr limited range using the PC or Game HDMI mode. Anything else either bands or blurs the chroma. If you set the drivers to YCbCr limited, set madVR to full range output.

gradient-perceptual-v2.1 24fps.mkv

gradient-perceptual-colored-v2.1 24fps.mkv



I know 10 bit sounds cool but it really is pointless at best on an LG OLED. I will be using 8 bit YCbCr limited range 3840x2160 @ 120 Hz on my LG C9 when I get an HDMI 2.1 GPU, unless I get a new TV that handles RGB full input better.

Where did you get the information about the TV internally using YCbCr 4:2:2? I thought on Windows you always want to use RGB, because Windows will just convert YCbCr to RGB anyway?

Also, why is my Apple TV able to send 10 bit without banding, while my Windows PC can only send 8 bit dithered or else I get banding?

Finally, why would I set madVR to full range if I’m using limited and have my TV set to low HDMI black level?

I’ve seen a lot of conflicting information out there. It’s hard to know what to believe.
Old 23rd November 2019, 11:09   #57842  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,406
Quote:
Originally Posted by tyguy View Post
Where did you get the information about the tv internally doing Ycbcr 422? I thought in Windows you always want to use rgb because windows will just convert ycbcr to rgb?
It is easy to test for yourself. With ChromaRes.png you can test which modes use 4:2:2 or 4:4:4: view the image at 100% scaling, i.e. in a window on a 4K screen, and switch between the HDMI and PC modes. With the gradient test patterns I linked above you can verify everything I have said.

Windows always renders in 8 bit RGB, but if you set the GPU to YCbCr it converts everything to YCbCr. madVR always outputs RGB too.
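To make that concrete, here is a minimal sketch (Python/NumPy) of both conversions; BT.709 limited-range coefficients and a simple box filter are my assumptions, since the filters real GPUs and TVs use are not public:

Code:
import numpy as np

def rgb_to_ycbcr709_limited(rgb):
    # rgb: floats in [0,1], shape (h, w, 3) -> 8-bit-scale Y', Cb, Cr planes
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b       # BT.709 luma weights
    cb = (b - y) / 1.8556                          # chroma, range [-0.5, 0.5]
    cr = (r - y) / 1.5748
    return 16 + 219 * y, 128 + 224 * cb, 128 + 224 * cr

def subsample_422(plane):
    # average horizontal pairs: (h, w) -> (h, w/2), the "blur" of 4:2:2
    return plane.reshape(plane.shape[0], -1, 2).mean(axis=2)

frame = np.random.rand(2160, 3840, 3)              # stand-in for a 4K frame
y, cb, cr = rgb_to_ycbcr709_limited(frame)
print(subsample_422(cb).shape)                     # (2160, 1920): half-res color

The luma plane keeps its full 3840x2160 resolution; only the color detail is halved, which is why the damage shows up on a pattern like ChromaRes.png rather than everywhere.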

Quote:
Originally Posted by tyguy View Post
Also, why is my Apple TV able to do 10 bit without banding, but my windows pc can only send it 8 bit dithered or else I get banding?
Because you have your HDMI port in the default mode; also, the Apple TV is sending subsampled limited range YCbCr, so this isn't really a problem. However, madVR with 8 bit output and the GPU converting to 8 bit YCbCr 4:4:4 is higher quality.

Quote:
Originally Posted by tyguy View Post
Finally, why would I set madvr to full if I’m using limited and have my TV set to low hdmi black level?
Because limited in the GPU drivers means "convert the full range RGB into limited range YCbCr." If you set madVR to limited range too, you get double limited, because the GPU always converts; it has no way to know madVR is already sending limited range.

The reason madVR has a setting for limited range is that if you only care about madVR, and don't mind everything else in Windows being wrong, you can set madVR to limited and the GPU to full range. This results in the same image but without the GPU converting ranges. I care about everything else too, and the drivers still need to convert the RGB to YCbCr anyway, so I use limited range in the GPU and full in madVR.
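To put numbers on the "double limited" mistake (a sketch using the standard 8-bit scaling): one full-to-limited conversion maps 0-255 onto 16-235; applying it twice crushes the signal into roughly 30-218, visibly raising blacks and dimming whites.

Code:
def full_to_limited(v):              # one full-range -> limited-range step
    return 16 + v * 219 / 255

print(full_to_limited(0), full_to_limited(255))    # 16.0 235.0  (correct)
print(full_to_limited(full_to_limited(0)),         # ~29.7   "double limited":
      full_to_limited(full_to_limited(255)))       # ~217.8  washed-out image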

Also, what is ideal in principle and what is ideal in real life with an LG 2019 OLED is not the same thing.
__________________
madVR options explained
Old 23rd November 2019, 11:43   #57843  |  Link
tyguy
Registered User
 
 
Join Date: Oct 2019
Posts: 63
madVR - high quality video renderer (GPU assisted)

Quote:
Originally Posted by Asmodian View Post
It is easy to test for yourself.
I actually don’t have my Apple TV set to the default mode. It’s hooked into my receiver, whose input I have labeled “home theatre”. Maybe that’s the same as the default HDMI label, but I can always relabel it game console or PC and see if I notice banding.

I don’t think you can get wide color gamut from a console or Apple TV without sending the TV 10 bit, because they don’t do dithering.

If Windows is converting YCbCr to RGB and madVR only outputs RGB... why not just use RGB 8 bit limited instead of YCbCr 4:2:2?

Last edited by tyguy; 23rd November 2019 at 11:47.
Old 23rd November 2019, 12:06   #57844  |  Link
el Filou
Registered User
 
 
Join Date: Oct 2016
Posts: 896
Because the LG's processing of RGB input is of lower quality.
__________________
HTPC: Windows 10 22H2, MediaPortal 1, LAV Filters/ReClock/madVR. DVB-C TV, Panasonic GT60, Denon 2310, Core 2 Duo E7400 oc'd, GeForce 1050 Ti 536.40
Old 23rd November 2019, 13:32   #57845  |  Link
DMU
Registered User
 
Join Date: Dec 2018
Posts: 207
Quote:
Originally Posted by el Filou View Post
Because the LG's processing of RGB input is of lower quality.
@Asmodian
And the output at 60Hz@RGB@PCMode does not solve the issues on LG OLED?
__________________
R3 3200G / Vega8 / Samsung UE40NU7100
Win11Pro 21H2 / 4K RGB 59Hz / AMD last driver
MPC-HC 1.9.17 / madVR 0.92.17 / FSW / SM / 8bit

Last edited by DMU; 23rd November 2019 at 13:38.
Old 23rd November 2019, 16:42   #57846  |  Link
mclingo
Registered User
 
Join Date: Aug 2016
Posts: 1,348
Quote:
Originally Posted by Asmodian View Post
It is easy to test for yourself.

hi, this issue with LG OLEDs is quite confusing; there is a lot to take in here, and some of it conflicts with information I've been given in the past. First, does it extend to older models? I have a gen 1.0 EF950V and I've never noticed any banding, to be honest. My current setup is as follows:

HDMI 1 selected, left named HDMI 1
Expert 1 profile set to HIGH colour space
AMD RX 580 running at 4:4:4 full RGB, set to 8 bit in the graphics settings
madVR set to 0-255

everything looks fine to me. However, I do have an issue if I use the actual "PC" HDMI input setting: SDR is fine, but no matter what I do HDR is massively washed out and the gamma is all messed up; nothing I do corrects this. I don't use PC mode anyway though, as I use a tiny bit of smooth motion, which greatly improves bright panning shots without any artefacts or soap opera effect.

Are you suggesting I should be running my panel in PC mode and in YCbCr limited mode? And what should I actually see on that chroma res pattern? I see the 4:2:2 part clearly, but I can also see the 4:4:4 part, albeit faintly.

I don't see any banding in either of those movies; the gradients are pretty smooth.
__________________
LG OLED EF950-YAM RX-V685-RYZEN 3600 - 16GBRAM - WIN10 RX 5700 - https://www.videohelp.com/software/madVR/old-versions
Old 23rd November 2019, 17:38   #57847  |  Link
Stef2
Registered User
 
Join Date: Jan 2018
Posts: 3
Anamorphic stretch in madvr

I would like to hear from madvr users with a projector and an anamorphic lens.

I just bought such a lens, and when I enable the anamorphic option in madVR (4/3 vertical stretch in my case) the rendering time goes up quite a lot, increasing from a 36 ms average to a 55+ ms average with the stretch...

Of course, that makes my 4K HDR 24p movie unwatchable

The only way I can get the rendering time back below 40 ms is by completely disabling all HDR processing in madVR, but that of course deteriorates the projected image a lot. Decreasing luma and chroma upsampling quality by a lot is not enough. Decreasing dithering quality is not enough. Enabling every "trade quality for performance" option on top of all that is not enough...
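For what it's worth, rough arithmetic on why the stretch is so expensive (my assumption: a 2.39:1 source on a 3840x2160 projector): without the stretch madVR scales only into the letterboxed area, while the 4/3 vertical stretch makes it fill (nearly) the whole panel, so the expensive upscaling and HDR passes run on about a third more output pixels.

Code:
plain   = 3840 * round(3840 / 2.39)   # letterboxed 2.39:1 area: 3840 x 1607
stretch = 3840 * 2160                 # 4/3 stretch -> (nearly) full panel
print(stretch / plain)                # ~1.34x the output pixels to render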

My GPU is a GTX 1070. I do not mind upgrading it to a RTX 2080 if this is what I will need.

Are there any anamorphic lens users around here? What do you observe when enabling this option? Any RTX 2080 user could help me by measuring the difference in rendering time between no stretch and anamorphic stretch enabled in madVR.

Any input is welcome!
Thank you.
Old 23rd November 2019, 22:19   #57848  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,406
Quote:
Originally Posted by tyguy View Post
I actually don’t have my Apple set in default mode. It’s hooked into my receiver which I have labeled “home theatre” in the input section.
Yes, Home Theatre is the same as the default, it subsamples the chroma.

Quote:
Originally Posted by tyguy View Post
If Windows is converting ycbcr to rgb and madvr only outputs rgb....Why not just use rgb 8 bit limited instead of ycbcr 422?
Windows is not converting YCbCr to RGB.

The GPU driver is converting the RGB from both Windows and madVR to YCbCr. Do NOT use YCbCr 4:2:2! That would subsample the chroma in the GPU driver instead of the TV.

If you are asking why the TV uses YCbCr 4:2:2 internally, it is because TV manufacturers are too cheap to use decent video processing chips. YCbCr 4:2:2 only takes 2/3 of the bandwidth of YCbCr 4:4:4, so they do not need as capable hardware.
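The 2/3 figure is easy to verify by counting samples per pair of horizontal pixels:

Code:
samples_444 = 2 + 2 + 2            # per 2x1 pixels: 2 Y' + 2 Cb + 2 Cr
samples_422 = 2 + 1 + 1            # per 2x1 pixels: 2 Y' + 1 Cb + 1 Cr (shared)
print(samples_422 / samples_444)   # 0.666..., i.e. 2/3 of the data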

In the Nvidia drivers I cannot set limited range RGB anymore, so I did not test it (it isn't really a standard either). Better to treat YCbCr as always limited and RGB as always full range. The TV expects limited range YCbCr input, because that is what an Apple TV, Blu-ray players, etc. will send it. LG got that working reasonably well, but they seem not to care about the quality of full range RGB input.

Quote:
Originally Posted by DMU View Post
@Asmodian
And the output at 60Hz@RGB@PCMode does not solve the issue's on LG OLED?
The refresh rate does not change the banding with RGB input in PC/Game mode in any way, but it does solve the judder issue with any non-60 Hz input while in PC mode. I always use 60 Hz with smooth motion now. Once in PC mode (so we get full resolution color) these TVs are pretty finicky and only seem to handle 8 bit YCbCr 4:4:4 limited range 60 Hz input well.
__________________
madVR options explained
Old 23rd November 2019, 22:31   #57849  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,406
Quote:
Originally Posted by huhn View Post
the problem is someone has to write the code for this, and how do you even tell the GPU "present this frame 0.5 ms later"?
Yes, of course it is work for madshi that does not improve quality in any ideal scenario. That is the best reason for it not to exist.

But it isn't that hard to decide what to do (coding it is another issue): simply wait 0.5 ms and then present the frame. I probably cannot tell if a frame is 0.5 ms early or late anyway, and the even smaller inaccuracies due to Windows scheduling are even less important. As long as the player presents frames close to when it should, audio sync is a non-issue. Whether this would be better than smooth motion at 60 Hz I don't know; I am pretty happy with smooth motion already, so I am not sure it is worth the effort. However, as it is now, I need to manually turn off VRR when I switch from gaming to watching video.

Quote:
Originally Posted by mclingo View Post
HDMI 1 selected, left named HDMI 1
Nothing I said applies if you let the TV subsample chroma (i.e. if you do not use the PC HDMI setting).

Also, use the banding test patterns to judge banding. I do not see obvious banding with normal content except very rarely. I also don't know about models that old, I have only tested the C7 and C9 myself.

HDR tone mapping is obviously worse quality on my display in PC/Game mode; I really wish LG did not subsample chroma in their video processor. According to my calibration software the gamut is much smaller, which results in less saturated, washed-out video, but the gamma is still reasonable; not as good, but reasonable. I do switch to Home Theater when watching HDR (rare for me).
__________________
madVR options explained

Last edited by Asmodian; 23rd November 2019 at 22:52.
Old 23rd November 2019, 23:11   #57850  |  Link
DMU
Registered User
 
Join Date: Dec 2018
Posts: 207
Quote:
Originally Posted by Asmodian View Post
The refresh rate does not change the banding with RGB input in PC/Game mode in anyway
On my Samsung I notice only one issue in PC mode: if the HDMI input frequency is not 29/30/59/60 Hz, the TV switches to 4:2:2 mode.
__________________
R3 3200G / Vega8 / Samsung UE40NU7100
Win11Pro 21H2 / 4K RGB 59Hz / AMD last driver
MPC-HC 1.9.17 / madVR 0.92.17 / FSW / SM / 8bit
Old 23rd November 2019, 23:16   #57851  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
Quote:
Originally Posted by Asmodian View Post
Yes, of course it is work for madshi that does not improve the quality in any ideal scenarios. This is the best reason for it not to exist.
the point is: is it even possible?
if you want to do that using chrono you have to write a new rendering path using it...
Quote:
However, as it is now I need to manually turn off VRR when I switch from gaming to watching video.
that's odd, this happens fully automatically on my system with both AMD and Nvidia.

alternatively, use Manage 3D Settings to automate it.
Old 23rd November 2019, 23:44   #57852  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,406
Quote:
Originally Posted by huhn View Post
the point is it even possible?
Of course it is possible; all you need to do is not present a frame until you want it displayed, like any game with a frame rate limit does. True, the timing wouldn't be that accurate, but we don't know that it would be bad enough to be annoying. A lot of people seem OK with frames being displayed +/- 8 ms off (3:2 judder), and it would likely be possible to get much better than that.
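A toy sketch of that scheme (hypothetical: present() stands in for the real swap-chain call, and the ~1 ms coarseness of Windows sleeps is exactly the inaccuracy in question); it sleeps until each frame's target timestamp and then presents, the way a game's frame rate limiter does:

Code:
import time

def present(frame):                    # placeholder for the real Present() call
    pass

def play(frames, fps=23.976):
    start = time.perf_counter()
    for i, frame in enumerate(frames):
        target = start + i / fps       # when frame i should appear on screen
        while time.perf_counter() < target - 0.002:
            time.sleep(0.001)          # coarse sleep until ~2 ms before target
        while time.perf_counter() < target:
            pass                       # spin away the last bit for precision
        present(frame)                 # a VRR display scans it out immediately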

Quote:
Originally Posted by huhn View Post
that's odd this is happening fully automatic on my system with AMD and nvidia.
Fully automatic disabling? How does it know to turn off?

Quote:
Originally Posted by huhn View Post
alternatively use the manage 3D settings to automate it.
I tested this a bunch in the past without success, but it does work now!

At least with 441.20, setting Zoom Player to a fixed refresh rate works perfectly; my player stays at a solid 60 Hz instead of drifting about. Thanks!

Quote:
Originally Posted by DMU View Post
On my Samsung I notice only 1 issue in PC mode: if the frequency of the HDMI input is not 29/30/59/60Hz, then the TV switches to 422 mode.
What?! Why?!? The way TVs handle various inputs is totally bizarre. I wonder how the engineers make decisions.
__________________
madVR options explained

Last edited by Asmodian; 24th November 2019 at 02:35.
Old 24th November 2019, 02:35   #57853  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
Quote:
Originally Posted by Asmodian View Post
Of course it is possible; all you need to do is not present a frame until you want it displayed, like any game with a frame rate limit does.
you say that so easily. games are designed differently, with bad design choices like physics being coupled to frame rate, hence the need for frame rate cappers, which in turn kind of do the same thing.

Quote:
Fully automatic disable? How does it know to turn off?
it's supposed to detect video playback and block it; it looks like your player didn't make it onto that list.
i never even got FreeSync to trigger with MPC-HC.
disabling FreeSync for fixed frame rate games is planned, or is even already part of the Nvidia driver.
Quote:
What?! Why?!? The way TVs handle various inputs is totally bizarre. I wonder how the engineers make decisions.
that's an upgrade; in the past it was only 60 Hz, followed by the addition of 30 Hz. that was well known for Samsung TVs.
only Sony and Philips are well known to support PC mode at all refresh rates without dumb things like forcing 3:2 judder, which is a Panasonic classic.
Old 24th November 2019, 02:57   #57854  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,406
Quote:
Originally Posted by huhn View Post
you say that so easily. games are designed differently, with bad design choices like physics being coupled to frame rate.
I only say that I think it would still be useful/interesting, not that it is necessarily worth it for madshi to implement. He has definitely said he is not interested in trying to do the presentation timing in madVR, but might use a D3D API if available.

Quote:
Originally Posted by huhn View Post
disabling FreeSync for fixed frame rate games is planned, or is even already part of the Nvidia driver.
That makes sense; Zoom Player probably has a pretty small user base today. People don't like paying for a player when the open source alternatives are so good. I am just happy that manually assigning it works. Is that because of a newer driver, or because of software VRR instead of the G-Sync module? I didn't test it with the TV before, only on monitors with a module.

Quote:
Originally Posted by huhn View Post
only Sony and Philips are well known to support PC mode at all refresh rates without dumb things like forcing 3:2 judder, which is a Panasonic classic.
It is interesting that Sony seems to generally do a good job at video processing, historically as well. I believe their engineers have a different viewpoint or something.
__________________
madVR options explained
Old 24th November 2019, 04:02   #57855  |  Link
tyguy
Registered User
 
 
Join Date: Oct 2019
Posts: 63
Quote:
Originally Posted by Asmodian View Post
Yes, Home Theatre is the same as the default, it subsamples the chroma.
So if PC mode is full chroma 4:4:4, and everything but Game mode and PC is 4:2:2, then what pixel format is Game mode?

I just see tons of conflicting information, like this:

“Only use RGB 8-bit for everything on a PC, including HDR games and movies, even when connected to a HDR TV over HDMI. The GPU does dithering for 10-bit content and there will be no banding.

Almost everything you read on this topic is misinformation.”


https://www.google.com/amp/s/amp.red...12_bpclimited/

“A 10-bit signal to the display is only required when the source doesn't perform dithering (PS4 , Blu-ray player, etc.). If the PS4 did dithering, it could run RGB 8-bit 60 Hz instead of subsampling at YCbCr420 10-bit 60 Hz because there isn't enough bandwidth for RGB 10-bit 60 Hz over HDMI 2.0.

If the display is 8-bit + FRC, the 10-bit signal is dithered internally by the display anyway. A true 10-bit panel is pointless since the quantization noise on 8-bit + dithering is invisible.

On Windows, HDR apps render to a 10-bit surface and the GPU does dithering automatically if the signal is 8-bit. So you should just use 8-bit RGB for maximum quality.”
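Both quoted claims are easy to sanity-check. A sketch (NumPy), using a dark 10-bit gradient as an example: straight 8-bit quantization collapses it into a handful of flat bands, while ~1 LSB of dither noise lets a local average, like the one the eye performs, recover the in-between values. The bandwidth arithmetic works out too: 3840x2160 at 60 Hz and 30 bits per pixel is ~14.9 Gbit/s, just over the ~14.4 Gbit/s of video payload HDMI 2.0 can carry.

Code:
import numpy as np

ten_bit = np.linspace(64, 80, 3840)         # one row of a dark 10-bit gradient
plain = np.round(ten_bit / 4)               # straight 8-bit quantization
rng = np.random.default_rng(0)
dith = np.round((ten_bit + rng.uniform(-2, 2, 3840)) / 4)
print(np.unique(plain).size)                # 5 flat levels -> visible bands
recon = np.convolve(dith * 4, np.ones(32) / 32, mode="same")
print(np.abs(recon[16:-16] - ten_bit[16:-16]).mean())  # << 1 code: ramp kept
print(3840 * 2160 * 60 * 30 / 1e9)          # 14.93 Gbit/s for RGB 10-bit 60 Hz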
Old 24th November 2019, 05:19   #57856  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
Quote:
Originally Posted by tyguy View Post
So if PC mode is full chroma 4:4:4, and everything but Game mode and PC is 4:2:2, then what pixel format is Game mode?
a game mode should be 4:4:4 if you ask me, but that doesn't mean it is; it may just be low latency. Sony doesn't even have a PC mode; they just have Game mode and Graphics mode, both 4:4:4, with all the processing you want to ruin the image.

Quote:
I just see tons of conflicting information. Like this here:
Quote:
Limited. All movies and TV shows are transmitted/streamed in limited color range (including Blu-ray’s)
this obvious flaw didn't even get corrected. on reddit there are simply not enough people who understand this topic to run after those who spread blind misinformation, which is only wrong in this context.

sending limited or full range RGB should depend only on whether the TV accepts full or limited range. if the end device can be set up to accept full range error free, full range is better, because that is what the windows desktop runs at and what the video renderer outputs; otherwise limited range RGB at the GPU output would never ever be correct.

rendering 10 bit on a WFS surface with an Nvidia GPU set to 8 bit output has a history of terrible banding. it was such a bad setup that i hadn't rechecked whether the issue is still present, but i guess this could affect software other than madVR.
edit: Nvidia 441.08, Win 1809: issue still present. will test with 1909 tomorrow. edit2: same issue.

Last edited by huhn; 24th November 2019 at 17:56.
Old 24th November 2019, 08:15   #57857  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,406
Quote:
Originally Posted by tyguy View Post
“Only use RGB 8-bit for everything on a PC, including HDR games and movies, even when connected to a HDR TV over HDMI. The GPU does dithering for 10-bit content and there will be no banding.
Is this on a recent LG OLED? I would agree with that advice unless using a display that has banding when sent full range RGB but doesn't when sent limited range YCbCr.

I would read that as saying 10 bit is unimportant, arguing against sending YCbCr 4:2:2 10 bit instead of 4:4:4 or RGB 8 bit, which some think sounds good because they know what 10 bit means but not what YCbCr 4:2:2 means.

The topic is super complicated once you bring the failings of individual displays into it. It is important to differentiate advice about a particular display from general advice.
__________________
madVR options explained
Old 24th November 2019, 17:12   #57858  |  Link
Stef2
Registered User
 
Join Date: Jan 2018
Posts: 3
Hi. Could someone with an RTX 2080 card (no matter the version) and a projector (no matter which one) do this quick test for me: compare the rendering time of your usual, everyday settings for 4K UHD video with the anamorphic stretch disabled vs enabled (no need for an anamorphic lens in place). I would like to know how much of a jump the anamorphic stretch causes in the rendering time, everything else untouched.

Thank you!
Stef
Old 24th November 2019, 17:21   #57859  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
this depends on your up/downscale settings, so it ranges from close to nothing to impossible even for a 2080 Ti.
Old 24th November 2019, 23:05   #57860  |  Link
ashlar42
Registered User
 
Join Date: Jun 2007
Posts: 652
Quote:
Originally Posted by GTPVHD View Post
I'd wait and see if anyone tests the GTX 1650 Super launching today; at US$159 it's cheaper than the GTX 1060's original US$249 MSRP, and Turing is certainly better than Pascal at compute workloads.
The complete silence from Nvidia on HDMI 2.1 is quite unnerving.