Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion. Before you start posting please read the forum rules. By posting to this forum you agree to abide by the rules.
#57841
Registered User
Join Date: Oct 2012
Posts: 6,867
Quote:
Quote:
And G-SYNC for video is very complicated. It's not like a game, where A/V sync barely exists: audio that "needs" sync in games is usually only a couple of seconds long, so drift doesn't matter, and the sync of background music is irrelevant. But for movie playback you have to keep the video in sync with an audio stream an hour or longer; VRR wasn't built for that. To use it properly, madVR would need to measure the difference between the audio and video clocks and fix it using the system clock. Yeah, lots of fun.

Quote:
#57842
Registered User
Join Date: Oct 2019
Posts: 63
I read somewhere that if you set your Xbox to 4K 60 it will still do LFC on 30 fps titles that dip below 30 fps, since 25-30 doubles into Samsung's VRR range. Well, I'm assuming the 2020-and-beyond TV lineups will start to feature HDMI 2.1 from every manufacturer, so it won't just benefit LG OLEDs.
Last edited by tyguy; 22nd November 2019 at 20:34.
#57846
Registered User
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,137
Quote:
I still think VRR support in madVR would be useful. Playback wouldn't be as good as a tuned fixed refresh rate, but we wouldn't need to worry about refresh rates at all. madshi is less interested simply because it is going to be worse. I think the minor judder from imperfect presentation timing would be fine, way better than 23 fps at 60 Hz, and audio sync is pretty easy if having all frames presented for exactly the same amount of time is not considered too important: simply adjust the frame times a tiny bit as you go to maintain sync.

madVR is all about quality, so spending a bunch of work on VRR support for gaming monitors, only to get a slightly worse playback mode, was not reasonable. However, now that TVs are getting VRR too, it might be more reasonable to look into it? Probably not.
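A scheme like that — nudging each frame's duration slightly so the video clock tracks the audio clock — might look roughly like this. This is a toy sketch, not madVR code; the function name, the 10% correction gain, and the 0.5 ms cap are all my own assumptions:

```python
# Toy sketch of VRR frame pacing with audio-sync correction.
# Assumption: audio and video stream positions can be measured against
# the same system clock. None of this is real madVR code.

def corrected_frame_duration(nominal_s, audio_pos_s, video_pos_s,
                             max_adjust_s=0.0005):
    """Return a slightly adjusted duration (seconds) for the next frame.

    A positive error means video is running ahead of audio, so frames
    are held a little longer; negative means video lags, so frames are
    shortened. The per-frame correction is capped at 0.5 ms so no single
    frame is visibly early or late.
    """
    error_s = video_pos_s - audio_pos_s
    adjust_s = max(-max_adjust_s, min(max_adjust_s, error_s * 0.1))
    return nominal_s + adjust_s

# Example: at 24 fps, video has drifted 3 ms ahead of audio, so the
# next frame is held ~0.3 ms longer than nominal.
next_duration = corrected_frame_duration(1 / 24, audio_pos_s=10.000,
                                         video_pos_s=10.003)
```

Spread over a few dozen frames, even a large drift is absorbed without any single frame being more than half a millisecond off.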
__________________
madVR options explained
#57848
Registered User
Join Date: Oct 2019
Posts: 63
Quote:
I don't label my HDMI input anything; I just keep it at the default, HDMI 1. With 8 bit I haven't noticed any banding. I will be getting a new card when HDMI 2.1 cards come out though, so I'll be running my desktop at 4K 120 Hz RGB full 10/12 bit. Would there be any issues if I set madVR to 8 bit, but I'm using 10 or 12 bit in my Nvidia drivers?
#57849
Registered User
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,137
Quote:
Quote:
Ideally you want to send an LG C9 8 bit YCbCr limited range using the PC or game HDMI mode. Anything else either has banding or blurs the chroma. If you set the drivers to YCbCr limited, set madVR to full range output.

gradient-perceptual-v2.1 24fps.mkv
gradient-perceptual-colored-v2.1 24fps.mkv

I know 10 bit sounds cool, but it really is pointless at best on an LG OLED. I will be using 8 bit YCbCr limited range 3840x2160 @ 120 Hz on my LG C9 when I get an HDMI 2.1 GPU, unless I get a new TV that is not so bad with RGB full input.
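As an aside, for anyone wondering why 8 bit plus dithering holds up so well on gradient tests like these: quantizing with dither preserves the average level instead of snapping to the nearest 8-bit step. A toy illustration using plain uniform dither (madVR's actual ordered/error-diffusion dithering is higher quality than this):

```python
import random

def quantize_with_dither(v10, rng=random.random):
    """Quantize a 10-bit value (0-1023) to 8 bit (0-255) with random dither."""
    v8 = v10 / 4.0           # ideal 8-bit value, usually fractional
    return int(v8 + rng())   # add noise in [0, 1), then truncate

# Level 514/1023 is ~128.5 in 8 bit; with dither it comes out 128 about
# half the time and 129 the other half, so over many pixels/frames the
# eye averages it back to the original in-between level, with no banding.
```

Without the dither, every value near 128.5 would snap to the same 8-bit code and the gradient would show a visible step.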
__________________
madVR options explained

Last edited by Asmodian; 23rd November 2019 at 11:21.
#57852
Registered User
Join Date: Oct 2019
Posts: 63
Quote:
Where did you get the information about the TV internally using YCbCr 422? I thought in Windows you always want to use RGB, because Windows will just convert YCbCr to RGB? Also, why is my Apple TV able to do 10 bit without banding, but my Windows PC can only send 8 bit dithered or else I get banding? Finally, why would I set madVR to full if I'm using limited and have my TV set to low HDMI black level? I've seen a lot of conflicting information out there. It's hard to know what to believe.
#57853
Registered User
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,137
Quote:
Windows will always render in 8 bit RGB, but if you set the GPU to YCbCr the driver converts everything to YCbCr. madVR always outputs RGB too.

Quote:
Quote:
The reason madVR has a setting for limited range is that, if you only care about madVR and don't mind everything else in Windows being wrong, you can set madVR to limited and the GPU to full range. This results in the same image, but without the GPU converting ranges. I care about everything else too, and the drivers still need to convert the RGB to YCbCr anyway, so I use limited range in the GPU and full in madVR.

Also, what is ideal in principle and what is ideal in real life with an LG 2019 OLED are not the same thing.
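For reference, the range conversion in question is just a linear remap of the 8-bit code values. A rough sketch of the standard 16-235 video-range mapping (my own illustration, not any driver's actual code):

```python
def full_to_limited(v):
    """Map an 8-bit full-range value (0-255) into limited range (16-235)."""
    return round(16 + v * 219 / 255)

def limited_to_full(v):
    """Expand limited range back to full, clipping blacker-than-black
    (<16) and whiter-than-white (>235) codes."""
    v = min(max(v, 16), 235)
    return round((v - 16) * 255 / 219)

# The round trip is lossy at 8 bit: 256 input levels get squeezed into
# 220 codes, so neighboring full-range values can collapse onto the same
# limited code -- which is why converting ranges without dithering can
# introduce banding, and why skipping one conversion step is attractive.
```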
__________________
madVR options explained
#57854
Registered User
Join Date: Oct 2019
Posts: 63
madVR - high quality video renderer (GPU assisted)
Quote:
I don't think you can get wide color gamut from a console or Apple TV without sending the TV 10 bit, because they don't do dithering.

If Windows is converting YCbCr to RGB and madVR only outputs RGB... why not just use RGB 8 bit limited instead of YCbCr 422?

Last edited by tyguy; 23rd November 2019 at 11:47.
#57855
Registered User
Join Date: Oct 2016
Posts: 727
|
Because the LG's processing of RGB input is of lower quality.
__________________
HTPC: Windows 10 1909, MediaPortal 1, LAV Filters, ReClock, madVR. DVB-C TV, Panasonic GT60, 6.0 speakers Denon 2310, Core 2 Duo E7400, GeForce 1050 Ti
#57856
Registered User
Join Date: Dec 2018
Posts: 170
|
@Asmodian
And does output at 60Hz@RGB@PCMode not solve the issues on LG OLEDs?
__________________
R3 2200G / Vega8 / Samsung UE40NU7100 Win10Pro 1909 / 4K RGB 59Hz / AMD 20.1.3 MPC-HC 1.9.1 / madVR 0.92.17 / FSW / 10bit@59Hz

Last edited by DMU; 23rd November 2019 at 13:38.
#57857
Registered User
Join Date: Aug 2016
Posts: 1,228
|
Hi, this issue with LG OLEDs is quite confusing; there is a lot to take in here, and some of it conflicts with information I've been given in the past. First, does it extend to older models? I have a first-generation EF950V and I've honestly never noticed any banding. My current setup is as follows:

- HDMI 1 selected, left named HDMI 1
- Expert 1 profile, colour space set to HIGH
- AMD RX 580 running at 4:4:4 full RGB, set to 8 bit in the graphics settings
- madVR set to 0-255

Everything looks fine to me. However, I do have an issue if I use the actual "PC" HDMI input setting: SDR is fine, but no matter what I do, HDR is massively washed out and the gamma is all messed up, and nothing I do corrects it. I don't use PC mode anyway, though, as I use a tiny bit of smooth motion, which greatly improves bright panning shots without any artefacts or soap opera effect.

Are you suggesting I should be running my panel in PC mode and at YCbCr limited? And what should I actually see on that chroma resolution pattern? I see 422 clearly, but I can also see 444, albeit faintly. I don't see any banding in either of those movies; the gradients are pretty smooth.
__________________
OLED 4k HDR EF950-YAM RX-V685-RYZEN 3600 - 16GBRAM - WIN10 1909 444 RGB -MD RX 5700 8GB 20.1.3 KODI DS - MAD/LAV 92.17+ 113 beta - 0.74.1 - 3D MVC / FSE:on / MADVR 10bit
#57858
Registered User
Join Date: Jan 2018
Posts: 3
|
Anamorphic stretch in madvr
I would like to hear from madvr users with a projector and an anamorphic lens.
I just bought such a lens and when I enable the anamorphic option in madvr (4/3 vertical stretch in my case) the rendering time goes up quite a lot, increasing from 36ms average to 55+ms average after the stretch... Of course, that makes my 4K HDR 24p movie unwatchable ![]() The only way I gan get the rendering time back to below 40ms is by disabling completely any HDR processing in madvr, but that deteriorates the projected image a lot, of course. Decreasing luma and chroma upsampling quality by a lot is not enough. Decreasing dithering quality is not enough. Enabling every "trade quality for performance" option over all that that is not enough... My GPU is a GTX 1070. I do not mind upgrading it to a RTX 2080 if this is what I will need. Is there any anamorphic lens user around here? what do you observe when enabling this option? Any RTX 2080 user can help me by observing the difference in rendering time between no stretch and anamorphic stretch enabled in madvr. Any input is welcome! Thank you. |
#57859
Registered User
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,137
|
Quote:
Quote:
The GPU driver is converting the RGB from both Windows and madVR to YCbCr. Do NOT use YCbCr 422! That would subsample the chroma in the GPU driver instead of the TV.

If you are asking why the TV uses YCbCr 422 internally, it is because TV manufacturers are too cheap to use decent video processing chips. YCbCr 422 only takes 2/3 of the bandwidth of YCbCr 444, so they do not need hardware that is as capable.

In the Nvidia drivers I cannot set limited range RGB anymore, so I did not test it (limited range RGB isn't really a standard anyway); it is better to treat YCbCr as always limited and RGB as always full range. The TV is expecting YCbCr limited range input, because that is what an Apple TV, Blu-ray players, etc. will send it. LG got that working reasonably well, but they seem not to care about the quality of full range RGB input.

Quote:
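The 2/3 bandwidth figure follows from simply counting samples per pixel block; a quick sanity check (illustrative only):

```python
def samples_per_pixel(subsampling):
    """Average samples per pixel for a 2x2 block under common 4:x:x schemes."""
    per_2x2_block = {
        "4:4:4": 4 + 4 + 4,  # 4 Y, 4 Cb, 4 Cr
        "4:2:2": 4 + 2 + 2,  # chroma halved horizontally
        "4:2:0": 4 + 1 + 1,  # chroma halved both horizontally and vertically
    }
    return per_2x2_block[subsampling] / 4

ratio = samples_per_pixel("4:2:2") / samples_per_pixel("4:4:4")  # = 2/3
```

The same arithmetic shows why 4:2:0 (what most video is encoded in) needs only half the bandwidth of 4:4:4.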
__________________
madVR options explained
#57860
Registered User
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,137
|
Quote:
But it isn't that hard to decide what to do (coding it is another issue): simply wait 0.5 ms and then present the frame. I probably cannot tell if a frame is 0.5 ms early or late anyway, and the even smaller inaccuracies due to Windows scheduling matter even less. As long as the player presents frames close to when it should, audio sync is a non-issue. Whether this would be better than smooth motion at 60 Hz I don't know; I am pretty happy with smooth motion already, so I am not sure it is worth the effort. However, as it is now, I need to manually turn off VRR when I switch from gaming to watching video.

Nothing I said applies if you let the TV subsample chroma (e.g. if you do not use the PC HDMI setting). Also, use the banding test patterns to judge banding; I do not see obvious banding with normal content except very rarely. I also don't know about models that old, I have only tested the C7 and C9 myself. HDR tone mapping is obviously worse quality on my display in PC/Game mode; I really wish LG did not subsample chroma in their video processor. According to my calibration software the gamut is much smaller, which results in less saturated/washed out video, but the gamma is still reasonable. Not as good, but reasonable. I do switch to Home Theater when watching HDR (rare for me).
__________________
madVR options explained

Last edited by Asmodian; 23rd November 2019 at 22:52.
Tags |
direct compute, dithering, error diffusion, madvr, ngu, nnedi3, quality, renderer, scaling, uhd upscaling, upsampling |