Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion.

Old 27th September 2020, 20:33   #60221  |  Link
CZ Eddie
Registered User
 
Join Date: Aug 2020
Posts: 54
Quote:
Originally Posted by CZ Eddie View Post
Can anyone help with my broadcast TV setup?

Which is the best Lav video setting for watching nothing but 720P and 1080i broadcast TV on a 1080P TV with an older GTX 1060 GPU on an 8GB 3Ghz quad-core system?
Hmm, I just ran through most of the Lav decoder options while watching FOX football via JRiver.
And none seemed to have any effect on my madVR rendering ms.
Didn't really notice any obvious video quality differences either.
Though FOX 720P OTA football games are not a very good quality source to begin with.

The artifacts on my 2019 $700 cheapo LG 75" 4K football-watching TV are pretty crazy and I'm on a mission to clean them up if possible.
Old 28th September 2020, 12:29   #60222  |  Link
el Filou
Registered User
 
el Filou's Avatar
 
Join Date: Oct 2016
Posts: 896
DXVA2 Native gives blurred chroma on NVIDIA cards, so to get the best quality other decoding options are better.
If you only watch HD I would use DXVA2 Copyback (it gives deinterlacing artefacts with 576i on my setup, but not with 1080i), or even software. Try to compare both while watching CPU and GPU clocks and percentage utilization: software will use the CPU more, but H.264 L4.0 is really easy to decode on a modern CPU; DXVA2 Copyback will use the GPU slightly more with madVR, but with 720p/1080i it should be hardly noticeable.
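For anyone wondering what "blurred chroma" actually means here: 4:2:0 video stores chroma at half resolution, and the upsampling filter decides how sharp it comes back. A toy 1-D sketch (the `upsample_nearest`/`upsample_linear` helpers are hypothetical illustrations, not madVR's or the driver's actual filters):

```python
# Illustration of 4:2:0 chroma upsampling (hypothetical helpers, not real madVR code).
# Video stores chroma at half resolution; the renderer must upscale it back.

def upsample_nearest(chroma):
    """Double a 1-D chroma row by repeating each sample (keeps hard edges)."""
    out = []
    for c in chroma:
        out += [c, c]
    return out

def upsample_linear(chroma):
    """Double a 1-D chroma row with midpoint interpolation (smoother, softer)."""
    out = []
    for i, c in enumerate(chroma):
        out.append(c)
        nxt = chroma[i + 1] if i + 1 < len(chroma) else c
        out.append((c + nxt) // 2)
    return out

row = [100, 100, 200, 200]       # half-resolution chroma samples
print(upsample_nearest(row))     # hard transition preserved
print(upsample_linear(row))      # transition smoothed
```

When a low-quality upsample happens early in the chain (the complaint about DXVA2 Native), madVR only ever sees the already-softened chroma, which is why the choice of decode path matters at all.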

You say you're watching on a 1080p TV but then you talk about a 4K 75" one? Which is it?

madVR has "reduce compression artifacts" (in processing => artifact removal), which is self-explanatory. If you use it in combination with NGU sharp you also gain some processing time.
I've found that with HDTV, "reduce banding" is useful too. I use it at the low setting; previously, when my cable provider was stuffing too many channels on each mux, I sometimes had to push it to medium (I mapped it to a keyboard shortcut and only increase it when I notice too obvious banding in the content I'm watching).
__________________
HTPC: Windows 10 22H2, MediaPortal 1, LAV Filters/ReClock/madVR. DVB-C TV, Panasonic GT60, Denon 2310, Core 2 Duo E7400 oc'd, GeForce 1050 Ti 536.40
Old 28th September 2020, 16:04   #60223  |  Link
Alexkral
Registered User
 
Join Date: Oct 2018
Posts: 319
Quote:
Originally Posted by el Filou View Post
If you only watch HD I would use DXVA2 Copyback (it gives deinterlacing artefacts with 576i on my setup, but not with 1080i), or even software.
Is there any reason why you don't even consider CUVID?
__________________
AviSynth AiUpscale
Old 28th September 2020, 16:30   #60224  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
currently it doesn't add anything over DXVA2 copyback, and it forces your GPU to its maximum performance pstate.

there was even a consideration to remove it a long time ago. a new mode was shared as an idea, but i have no clue what is currently planned, so don't take this as up to date.
Old 28th September 2020, 18:10   #60225  |  Link
CZ Eddie
Registered User
 
Join Date: Aug 2020
Posts: 54
Quote:
Originally Posted by el Filou View Post
If you only watch HD I would use DXVA2 Copyback (it gives deinterlacing artefacts with 576i on my setup, but not with 1080i), or even software.
Hmm, okay thanks. I will focus my attention on testing those two specifically.

Quote:
Originally Posted by el Filou View Post
Try to compare both while watching CPU and GPU clocks and percentage utilization: software will use the CPU more, but H.264 L4.0 is really easy to decode on a modern CPU; DXVA2 Copyback will use the GPU slightly more with madVR, but with 720p/1080i it should be hardly noticeable.
I was only looking at rendering ms but this is a good suggestion.
Huhn recommended GPU-Z to somebody in another thread. Maybe I'll try that software.


Quote:
Originally Posted by el Filou View Post
You say you're watching on a 1080p TV but then you talk about a 4K 75" one? Which is it?
The 4K TV is my primary TV but I am running my HTPC at 1080P, not 2160P.
I'd rather spend all my processing power on cleaning up the crappy OTA image than upconverting to 4K.
Sometimes OTA looks really good and clean, but much of the time it's full of various artifacts.

Quote:
Originally Posted by el Filou View Post
madVR has "reduce compression artifacts" (in processing => artifact removal), which is self-explanatory.
Yep, was hoping for some guidance on those options (and where they can help) and anything else that might help.

Quote:
Originally Posted by el Filou View Post
If you use it in combination with NGU sharp you also gain some processing time.
What does "gain some processing time" mean? Are you saying it consumes more GPU processing power?

Quote:
Originally Posted by el Filou View Post
I've found that with HDTV, "reduce banding" is useful too. I use it at the low setting; previously, when my cable provider was stuffing too many channels on each mux, I sometimes had to push it to medium
Thanks for the tip!

Quote:
Originally Posted by el Filou View Post
I mapped it to a keyboard shortcut and only increase it when I notice too obvious banding in the content I'm watching.
I really need to sit down and configure keyboard shortcuts to my remote control.

Last edited by CZ Eddie; 28th September 2020 at 18:13.
Old 28th September 2020, 22:10   #60226  |  Link
Ilovetv9
Registered User
 
Join Date: Aug 2019
Posts: 14
Quote:
Originally Posted by huhn View Post
the trick is simple: you make sure that lavfilter settings are left at default so it can say "hey look madVR, this file is interlaced". this works as long as d3d11 native is not used; madVR will fully automatically deint every video that is flagged as interlaced.

For the best possible video quality when using madVR + LAV Filters, which of the 5 options in my hardware decoder list should I choose?
- nvidia cuvid, intel quicksync, dxva2 copy back, dxva2 native or d3d11

Are you saying madVR cannot automatically deinterlace if I choose d3d11? I thought it's always better not to deinterlace automatically, because sometimes videos are incorrectly flagged as interlaced and it's best to use your own eyes to determine. So I always try to keep the interlace settings on progressive unless I know the video needs to be deinterlaced; then I turn on auto and enable the madVR deinterlacer. Am I correct?
Old 28th September 2020, 22:35   #60227  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
just leave it at auto. d3d11 will never do deint with madVR, it can't, madshi never implemented it.

you can just toggle deint with control+alt+shift+d no need to change lavfilter settings.

just stick to DXVA2 copyback it really doesn't matter that much.
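For context on what "deint" is doing here: an interlaced frame interleaves two half-height fields captured at different moments, and the deinterlacer rebuilds full progressive frames from them. A toy "bob" (line-doubling) sketch, purely illustrative and nothing like the vendor deinterlacers madVR actually calls:

```python
# Toy "bob" deinterlacer (illustration only; real DXVA deinterlacers
# are far more sophisticated, with motion-adaptive interpolation).
# An interlaced frame interleaves two fields: even rows are field 0,
# odd rows are field 1, captured at different times.

def bob_deinterlace(frame):
    """Split an interlaced frame (a list of rows) into two progressive
    frames, line-doubling each field to restore full height."""
    top = frame[0::2]     # even lines = first field
    bottom = frame[1::2]  # odd lines = second field
    def line_double(field):
        out = []
        for row in field:
            out += [row, row]   # repeat each line to fill the gaps
        return out
    return line_double(top), line_double(bottom)

frame = ["a0", "b1", "c0", "d1"]   # rows tagged by their field
first, second = bob_deinterlace(frame)
print(first)    # ['a0', 'a0', 'c0', 'c0']
print(second)   # ['b1', 'b1', 'd1', 'd1']
```

This also shows why doubling the frame rate comes for free with bob: each interlaced frame yields two output frames, one per field.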
Old 29th September 2020, 16:52   #60228  |  Link
el Filou
Registered User
 
el Filou's Avatar
 
Join Date: Oct 2016
Posts: 896
Quote:
Originally Posted by CZ Eddie View Post
Yep, was hoping for some guidance on those options (and where they can help) and anything else that might help.
Can't really help with RCA, sorry, as I don't use it on my HTPC (the plasma already masks some artefacts up to a point), and I don't enable it on my desktop for other reasons. I suggest experimenting: start with a low level and increase progressively until you find the artefacts don't really bother you anymore.
Quote:
What does "gain some processing time" mean? Are you saying it consumes more GPU processing power?
No, the opposite actually, but only if you also use NGU Sharp at the same time. IIRC, due to the way the two algorithms work, madVR can combine steps used in both, so NGU Sharp + RCA doesn't cost much more processing power than NGU Sharp alone.
__________________
HTPC: Windows 10 22H2, MediaPortal 1, LAV Filters/ReClock/madVR. DVB-C TV, Panasonic GT60, Denon 2310, Core 2 Duo E7400 oc'd, GeForce 1050 Ti 536.40
Old 29th September 2020, 17:06   #60229  |  Link
Damien147
Registered User
 
Join Date: Mar 2011
Posts: 380
I've read that my TV has an 8 bit panel, but rtings says this:
Quote:
Color Depth 10 bit

The Samsung KU6300 can display our gradient test image fairly well. On our test picture, the gradation is smooth overall in the light shades with some small anomalies in the darker shades, especially in the green color. But it should not be an issue in regular content.

Update 10/26/2016: Our original test was showing 8 bit gradations due to incorrect drivers on our system. After some correction to our test apparatus, we have retested the color depth and found that it is able to display a 10 bit gradient smoothly.
What's the proper bit depth to put in madVR properties? 8 bit or 10 bit?
Old 29th September 2020, 17:14   #60230  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,406
8 bit is safer. Test it, and if you cannot tell the difference use 8 bit. madVR has very good dithering so having it dither to 8 bit is often better than any dithering the panel, display, or GPU might do when sending 10 bit from madVR.

I like these gradient test patterns:
gradient-perceptual-colored-v2.1 24fps.mkv
gradient-perceptual-v2.1 24fps.mkv
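The idea behind such patterns is easy to reproduce: a slow 10-bit ramp has four codes per 8-bit step, so truncating it to 8 bits with no dithering collapses it into flat bands. A rough sketch of the effect (not of how those test files were actually authored):

```python
# Why gradients reveal effective bit depth (illustration only; the linked
# test patterns are far more carefully constructed than this).
# A slow 10-bit ramp has 4 codes per 8-bit step, so plain truncation
# produces flat "bands" 4 samples wide instead of a smooth slope.

ramp10 = list(range(512, 544))       # a slow 10-bit gradient segment (32 codes)
trunc8 = [v >> 2 for v in ramp10]    # naive 10-bit -> 8-bit truncation

bands = len(set(trunc8))             # distinct 8-bit levels remaining
print(ramp10[:8])   # smooth: 512, 513, 514, ...
print(trunc8[:8])   # steppy: 128, 128, 128, 128, 129, ...
print(bands)        # only 8 distinct levels survive from 32 input codes
```

Good dithering (madVR's, or the panel's FRC) hides those steps as noise, which is exactly what these gradient files let you judge by eye.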
__________________
madVR options explained
Old 29th September 2020, 18:17   #60231  |  Link
Damien147
Registered User
 
Join Date: Mar 2011
Posts: 380
It says 10 bit (8 bit + FRC). You made a point and I don't think I need to search more; 8 bit it is. Thank you.
Old 29th September 2020, 19:08   #60232  |  Link
SirMaster
Registered User
 
Join Date: Feb 2019
Posts: 231
Quote:
Originally Posted by Damien147 View Post
It says 10 bit (8 bit + FRC). You made a point and I don't think I need to search more; 8 bit it is. Thank you.
Yeah, FRC is a type of dithering, but I would reckon madVR's error-diffusion dithering is higher quality than your display's FRC algorithm.
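To make the two concrete: error diffusion trades precision for spatial noise, while FRC trades it for temporal flicker between adjacent codes. Toy 1-D sketches of each (hypothetical helpers, not madVR's or any panel's real algorithm):

```python
# Toy comparison of error diffusion vs FRC-style temporal dithering
# (illustrations only; real implementations are far more elaborate).

def error_diffusion(samples, shift=2):
    """Quantize 10-bit samples to 8-bit, pushing each rounding error
    into the next sample (a 1-D Floyd-Steinberg-like scheme)."""
    out, err = [], 0
    for s in samples:
        v = s + err
        q = v >> shift           # 8-bit output code
        err = v - (q << shift)   # carry the lost low bits forward
        out.append(q)
    return out

def frc_average(level10, frames=4):
    """FRC idea: alternate nearby 8-bit codes over successive frames so
    the temporal average approximates the 10-bit level."""
    base, frac = level10 >> 2, level10 & 3
    shown = [base + (1 if f < frac else 0) for f in range(frames)]
    return shown, sum(shown) / frames   # average, in 8-bit units

print(error_diffusion([514] * 8))   # [128, 129, 128, 129, ...] averaging 128.5
print(frc_average(514))             # ([129, 129, 128, 128], 128.5)
```

Both land on the same 128.5 average for the 10-bit code 514; the difference is where the noise goes, and error diffusion can shape it more freely than a fixed frame-rate-control pattern can.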
Old 29th September 2020, 20:08   #60233  |  Link
jandari
Registered User
 
Join Date: Sep 2018
Posts: 9
I'm using Kodi DSPlayer 17.6 and madVR. The last working nvidia driver was 446.14; since then I've had a lot of bugs with HDR improperly switching on/off and with the GUI. With the newer version 456.38 almost all were fixed, but one issue remains: when stopping DSPlayer playback and getting back to the Kodi GUI, the GUI remains extremely dark. You have to restart Kodi to return to normal brightness (the Windows GUI is normal color; no resolution switch happens). Rolling back to the 446.14 driver, everything works fine. I think it must be an nvidia driver issue.
Old 29th September 2020, 20:43   #60234  |  Link
mclingo
Registered User
 
Join Date: Aug 2016
Posts: 1,348
Quote:
Originally Posted by jandari View Post
I'm using Kodi DSPlayer 17.6 and madVR. The last working nvidia driver was 446.14; since then I've had a lot of bugs with HDR improperly switching on/off and with the GUI. With the newer version 456.38 almost all were fixed, but one issue remains: when stopping DSPlayer playback and getting back to the Kodi GUI, the GUI remains extremely dark. You have to restart Kodi to return to normal brightness (the Windows GUI is normal color; no resolution switch happens). Rolling back to the 446.14 driver, everything works fine. I think it must be an nvidia driver issue.
https://forum.kodi.tv/showthread.php?tid=351534

have you tried this build of DS?
__________________
LG OLED EF950-YAM RX-V685-RYZEN 3600 - 16GBRAM - WIN10 RX 5700 - https://www.videohelp.com/software/madVR/old-versions
Old 29th September 2020, 22:17   #60235  |  Link
Damien147
Registered User
 
Join Date: Mar 2011
Posts: 380
Quote:
Originally Posted by SirMaster View Post
Yeah, FRC is a type of dithering, but I would reckon madVR's error-diffusion dithering is higher quality than your display's FRC algorithm.
Yeah, that's what I understood from Asmodian's words and slight googling. Thanks for the confirmation.
Old 30th September 2020, 01:22   #60236  |  Link
cosmitz
Registered User
 
Join Date: Jan 2013
Posts: 8
Hey there. So I've had a bit of an audio sync issue for forever now, but I've only just settled on debugging it tonight since there were /so/ many moving parts. After testing, I've traced it back to madVR exclusive mode. I'm running madVR (latest; upgrading didn't solve it) on MPC-HC, W7, and here's the debug stats imgur.

The issue is that while in exclusive fullscreen mode (in which, btw, the 3 second delay setting does nothing), I have a measure of audio lag/desync which stays consistent and is not stuttery or anything, just always a step behind the video. Not really sure how much, but it feels like 100-200 ms. Also, not using exclusive creates some random video judder which I'd rather avoid. Any idea where to start looking?

Last edited by cosmitz; 30th September 2020 at 01:34.
Old 30th September 2020, 04:25   #60237  |  Link
VBB
Registered User
 
VBB's Avatar
 
Join Date: May 2016
Location: Long Beach, CA, USA
Posts: 620
What display, video card, receiver do you have? I switch between FS exclusive and windowed all the time, and I've never noticed any difference in audio delay.
__________________
Henry | LG OLED65C7P | Denon AVR-X3500H | ELAC Uni-Fi x7 | ELAC Debut 2.0 SUB3030 x2 | NVIDIA SHIELD TV Pro 2019 | Plex
Old 30th September 2020, 11:25   #60238  |  Link
senzaparole
Registered User
 
Join Date: Jan 2020
Location: Italia
Posts: 55
Hi guys, I use Madvr with MPC-HC on a PC with RX 5700 XT video card.
In the Lav Video I have selected:

- Hardware Decoder to use: D3D11;
- Active Decoder: inactive;
- Active Hardware Acceleration: none;
- Hardware Device to use: Automatic (native).

My doubt is whether it is correct that Active Decoder is set to "inactive" and Active Hardware Acceleration is set to "none".
__________________
TV:LG B9 65 Amplifier: Denon X4500H Front Speakers: n. 3 XTZ Cinema M6 Subwoofer: XTZ 10.17 EDGE Atmos Speakers:n. 4 XTZ S2 Surround Speakers: n. 2 XTZ Spirit 2 - HTPC: Fractal Meshify C-RX 5700 XT
Old 30th September 2020, 14:28   #60239  |  Link
jandari
Registered User
 
Join Date: Sep 2018
Posts: 9
Yes, no difference. It's obviously an nvidia driver problem, because it works flawlessly with the older driver version. The newer version includes some fixes for madVR, and a lot of the bugs that appeared in versions in between were fixed; only this one is left. Maybe madshi can contact nvidia.
Old 30th September 2020, 17:46   #60240  |  Link
VBB
Registered User
 
VBB's Avatar
 
Join Date: May 2016
Location: Long Beach, CA, USA
Posts: 620
@senzaparole - Yes, this is correct when looking at it while not playing anything. Your settings are OK.
__________________
Henry | LG OLED65C7P | Denon AVR-X3500H | ELAC Uni-Fi x7 | ELAC Debut 2.0 SUB3030 x2 | NVIDIA SHIELD TV Pro 2019 | Plex
Tags
direct compute, dithering, error diffusion, madvr, ngu, nnedi3, quality, renderer, scaling, uhd upscaling, upsampling
