Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion.

Old 18th May 2022, 15:39   #63121  |  Link
flossy_cake
Registered User
 
Join Date: Aug 2016
Posts: 156
Quote:
Originally Posted by MadVR's tips.txt file

You can force madVR to apply specific properties by adding tags to your filenames.

deint=On|Off|Video|Film|ivtc
Is there any difference between "Film" and "ivtc"? I can't seem to tell any difference but just wanted to make sure. Asmodian's "madVR options explained" doesn't mention any difference, and neither does JRiver's wiki.

On another note, I am a bit disappointed with MadVR's "Video" mode deinterlacing -- it seems to need a lot of frames before it realises it can switch to weaving fields for static, unchanging pixels, and once it does, it is super sensitive to the slightest bit of pixel movement, which trips it out of weaving (such as a small amount of camera movement, or older content with that constant film reel jitter). Even a very low-end TV has better video mode deinterlacing than this. It's problematic because for content with mixed field cadences and/or "bad edits" (dynamic field order changes), we need good video mode deinterlacing. I have a vague memory of reading that per-pixel video mode deinterlacing is implemented very simply and cheaply on Broadcom STB chipsets, so I think there is a more efficient and effective way to do it.
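What I mean by per-pixel motion-adaptive deinterlacing, as a toy sketch (this is only to illustrate the weave-vs-bob decision -- the threshold and the wrap-around edge handling are placeholders, and it's not how madVR, the GPU driver, or the Broadcom chips actually implement it):

```python
import numpy as np

def motion_adaptive_deinterlace(top, bottom, prev_top, thresh=10):
    """Toy per-pixel motion-adaptive deinterlacer.

    top/bottom: the two fields of the current frame (each H/2 x W),
    prev_top: the top field of the previous frame (same parity as top).
    Static pixels keep the real bottom-field line (weave); moving pixels
    fall back to interpolating from the top field (bob)."""
    h, w = top.shape
    out = np.empty((h * 2, w), dtype=top.dtype)
    out[0::2] = top  # top-field lines are always real samples
    # per-pixel motion mask: compare same-parity fields one frame apart
    motion = np.abs(top.astype(np.int16) - prev_top.astype(np.int16)) > thresh
    # bob estimate: average of the lines above and below (wraps at the
    # bottom edge -- a placeholder, not proper edge handling)
    bob = ((top.astype(np.uint16) + np.roll(top, -1, axis=0)) // 2).astype(top.dtype)
    out[1::2] = np.where(motion, bob, bottom)
    return out
```

The point is that the weave/bob choice is made per pixel from a cheap difference against the previous frame, rather than per frame -- which is why a long "settling" delay before weaving kicks in shouldn't be necessary.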

Interlaced clip 1
Video mode deinterlacing works quite well with this clip, but only because the static unmoving parts of the image are perfectly still. But the alternating white and black 1px lines reveal a long ~500ms delay before weaving actually kicks in.

Interlaced clip 2
Video mode deinterlacing works quite badly with this clip, as there are constant small amounts of frame movement which keep kicking it out of weave deinterlacing and dropping back to effectively 240p resolution. Film deinterlacing works much better, but the content uses multiple cadences and some elements don't render properly, e.g. the 60fps credits.

Ctrl+Alt+Shift+T to toggle between deinterlacing modes.

Last edited by flossy_cake; 21st May 2022 at 12:23.
Old 18th May 2022, 15:50   #63122  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,399
film mode is IVTC.
madVR doesn't have a deinterlacer; it just uses the GPU-provided deinterlacer. if that's bad you get bad results, if it's good you get good results.

a good GPU driver deinterlacer is currently the best deint we have available for real-time usage.
Old 18th May 2022, 16:28   #63123  |  Link
el Filou
Registered User
 
el Filou's Avatar
 
Join Date: Oct 2016
Posts: 861
Quote:
Originally Posted by huhn View Post
you want a 4:4:4 source?
here we go:

taking the HUD instead of the game content didn't work out as planned.

i wonder why the colors are messed up. LAV Filters was used to subsample, BTW.
(apologies for the late answer)
Thanks, I know it's easy to get 4:4:4 by using video games as a source, but I hardly watch video game content. That's why I said it's important to test using the type of content you watch the most, as different chroma upsampling algos can give different results depending on whether you're watching video game/animation/live content.
It's pretty hard to find 4:4:4 live content samples.
Quote:
Originally Posted by Siso View Post
What about the increased rendering times with 113, and the broken SM?
Just FYI (based on madshi's explanation), the rendering times reported in v113 are actually closer to the truth than those in v112 and before.
Earlier versions showed lower rendering times in the madVR stats, but would start dropping frames at lower render times relative to the frame time. v113 reports comparatively higher times, but can get closer to the frame time before it starts dropping frames, because madshi optimised some things in the rendering.
It's a shame about the smooth motion crash for people who use it; otherwise v113 would be a nice test build for a lot of people.
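To put numbers on the render time vs. frame time comparison: the budget per frame is just 1000/fps milliseconds, and the reported render time has to stay under that (on average) to avoid drops. A quick illustration (plain arithmetic, not madVR code):

```python
def frame_budget_ms(fps):
    """Time available to render one frame; rendering must stay below this
    (on average) or the renderer starts dropping frames."""
    return 1000.0 / fps

# typical source frame rates
for fps in (23.976, 25.0, 29.97, 59.94):
    print(f"{fps:>6} fps -> {frame_budget_ms(fps):5.2f} ms per frame")
```

So whether v112 reports 30 ms and drops at 35, or v113 reports 35 ms and drops at 41, what matters is the headroom left before the budget, not the absolute number shown in the stats.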
__________________
HTPC: Windows 10 21H2, MediaPortal 1, LAV Filters/ReClock/madVR. DVB-C TV, Panasonic GT60, Denon 2310, Core 2 Duo E7400 oc'd, GeForce 1050 Ti 512.95
Old 18th May 2022, 16:43   #63124  |  Link
flossy_cake
Registered User
 
Join Date: Aug 2016
Posts: 156
Quote:
Originally Posted by huhn View Post
madVR doesn't have a deinterlacer; it just uses the GPU-provided deinterlacer. if that's bad you get bad results, if it's good you get good results.
a good GPU driver deinterlacer is currently the best deint we have available for real-time usage.
Thanks for clearing that up -- I had mistakenly thought MadVR was using its own custom pixel shader code to deinterlace.

I see there are options for software deinterlacing in LAV Video Decoder, so I can use those instead.

Curiously, LAV's video mode deinterlacing seems to work well, except on the black and white alternating 1px line patterns. Other 1px patterns work well though, just something about the black & white must be pushing it past some contrast threshold and tripping it out of weave. In video content such a pattern would typically not exist so it's ok I guess.

edit: to clarify, it seems LAV's "Weston" deinterlacing is just crude field interpolation at all times (aka bob deinterlacing). The "Yadif" one, on the other hand, does seem to intelligently switch between weave and bob for different zones of the raster, but it's still bobbing totally static pixels when it shouldn't be. And its "film" mode option isn't resolving the alternating 1px black and white pattern either. Ctrl+arrows to step through frames in MPC-HC.

Last edited by flossy_cake; 18th May 2022 at 16:57.
Old 18th May 2022, 18:22   #63125  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,399
Quote:
Originally Posted by el Filou View Post
(apologies for the late answer)
Thanks, I know it's easy to get 4:4:4 by using video games as a source, but I hardly watch video game content. That's why I said it's important to test using the type of content you watch the most, as different chroma upsampling algos can give different results depending on whether you're watching video game/animation/live content.
It's pretty hard to find 4:4:4 live content samples.
what do you apologise for? my new test is bad.
computer games like this are very similar to anime, at least.
we can also take source images from bbb and such.
the next issue is that chroma usually doesn't matter as much with non-computer/drawn sources.

i was even doing more tests with this: https://www.youtube.com/watch?v=ESx_hy1n7HA
where chroma scalers are day-and-night different, it's ridiculous. i even found a PNG render of part of the image as a reference thanks to some simple internet searching, but not a simple wallpaper of this...
Old 24th May 2022, 03:50   #63126  |  Link
tyner
Registered User
 
Join Date: Aug 2011
Posts: 5
Best of these Mobile Graphics for DVD Scaling & Efficiency via madVR?

Among the laptops I'm looking to buy, some will have these graphics:

https://www.notebookcheck.net/GeFor.....247598.0.html

I think the 1650 Ti may be the oldest of these mobile versions, but a user of the same said he could use a number of the better settings in madVR to scale DVDs to 1080p. That's what I want to do.

But of the above three, how do they compare at running madVR for quality 1080p scaling of commercially issued (Warners, CBS Paramount, Universal, Fox) movies and ~ 60 minute TV show DVDs?

And with least power consumption, heat and fan noise?

That is, which one may have the best of both, power and efficiency, for this task?

From what I can make of the Cinebench and power consumption scores at the above link, the p2000 seems like the best choice. Yes?

Last edited by tyner; 24th May 2022 at 19:43.
Old 24th May 2022, 15:50   #63127  |  Link
flossy_cake
Registered User
 
Join Date: Aug 2016
Posts: 156
Quote:
Originally Posted by huhn View Post
madVR doesn't have a deinterlacer; it just uses the GPU-provided deinterlacer. if that's bad you get bad results, if it's good you get good results.
Are you definitely sure about this?

I'm asking because on that Cowboy Bebop clip with [deint=Film], the Ctrl+J screen is actually reporting the cadence changes in real time as I'm playing the intro sequence, as if to imply MadVR is detecting the cadence and is somehow involved in deinterlacing. Otherwise there would have to be some NVidia API where it queries the cadence from the GPU driver?
Old 24th May 2022, 17:59   #63128  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,399
yes i am, because that is inverse telecine, not deinterlacing.
and it does not work on interlaced sources, only telecined sources.

yes, madVR has its own field matching algorithm for IVTC.

a modern deinterlacer in a TV does both at the same time BTW.
but interlaced output is not really possible with madVR, and in general kind of special to do on a PC. interlaced-mode chroma scaling (there is a more official name for that, i don't know it) and image scaling is not used anywhere as far as i know.
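a field matching pass looks roughly like this (toy numpy sketch, not madVR's actual algorithm -- the combing metric and the two-candidate search are simplifications for illustration):

```python
import numpy as np

def combing_metric(top, bottom):
    """Weave the two fields and measure combing: large differences between
    each line and the average of its neighbours suggest the two fields
    belong to different film frames."""
    frame = np.empty((top.shape[0] * 2, top.shape[1]), dtype=np.int32)
    frame[0::2] = top
    frame[1::2] = bottom
    d = frame[1:-1] - (frame[:-2] + frame[2:]) // 2
    return np.abs(d).mean()

def match_field(top, cur_bottom, prev_bottom):
    """Toy IVTC field matching: pair the current top field with whichever
    candidate bottom field (current or previous frame) combs least."""
    candidates = {"current": cur_bottom, "previous": prev_bottom}
    return min(candidates, key=lambda k: combing_metric(top, candidates[k]))
```

that is also why it can report the cadence: which candidate wins, frame after frame, follows the pulldown pattern of the source.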
Old 24th May 2022, 19:41   #63129  |  Link
tyner
Registered User
 
Join Date: Aug 2011
Posts: 5
Would someone kindly reply to my post? As this is the madVR forum I'd expect it to be the best place for answers to my related hardware questions.

Last edited by tyner; 24th May 2022 at 19:51.
Old 24th May 2022, 20:34   #63130  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,399
your link is dead.
laptop hardware is not easy to compare because the name of the GPU says nothing about how it is configured.

you care about noise and power consumption, but madVR is not power efficient, and it's hard if not impossible to judge the cooling system of a laptop; they are not built to be silent under load.
Old 24th May 2022, 20:50   #63131  |  Link
Sunspark
Registered User
 
Join Date: Nov 2015
Posts: 177
tyner, this is a hobbyist forum and non-official support. This product isn't publicly developed anymore, and the developer hasn't posted here in years as he is working on another project now.

For the purpose you indicated of scaling interlaced 480i DVD to 1080p, any device, even a little ARM device, can do it. Even your monitor can do it. At higher quality settings, sure, the CPU and GPU play a role, but your source is only 480i and you'll be able to do a lot with any machine these days, so focus on the other factors that might be good for the other uses of the device: screen, keyboard, etc.
Old 24th May 2022, 21:35   #63132  |  Link
tyner
Registered User
 
Join Date: Aug 2011
Posts: 5
Quote:
Originally Posted by huhn View Post
your link is dead.
laptop hardware is not easy to compare because the name of the GPU says nothing about how it is configured.

you care about noise and power consumption, but madVR is not power efficient, and it's hard if not impossible to judge the cooling system of a laptop; they are not built to be silent under load.
FWIW, does this work? https://www.notebookcheck.net/GeForc....247598.0.html
Old 24th May 2022, 21:53   #63133  |  Link
tyner
Registered User
 
Join Date: Aug 2011
Posts: 5
Quote:
Originally Posted by huhn View Post
laptop hardware is not easy to compare because the name of the GPU says nothing about how it is configured.

you care about noise and power consumption, but madVR is not power efficient, and it's hard if not impossible to judge the cooling system of a laptop; they are not built to be silent under load.
To quote the user:

"A very humble PC can play back DVD to 1080p. Coffee Lake iGPU, from what I read, should max out at about 4K UHD @ 24fps. I played a DVD on a laptop that has an NVidia GTX 1650, at 1080p over HDMI, using MadVR. I couldn't quickly find a setting that was too demanding. It was able to use Jinc to upscale chroma and image, with anti-ringing filter. A similar card would be around $220, and is possibly overkill at that.

This may really be a question of tradeoffs among size, noise, performance, and maintainability (e.g. the ability to open the case). Even with all that in mind, the Celeron in the little industrial PC is good enough that I don't bother to play back DVDs on the much-more-powerful laptop. Depends how far you want to take it, in which direction.
"

Considering the rest of his apparent hardware besides the 1650 Ti, it seems he was able to use at least some of the upper madVR algorithms, and yet didn't mention issues with fan noise and heat. My laptop will likely be an HP 15" ZBook, SSD, 35 watt Comet Lake Xeon CPU. So I thought, why not experiment between using the iGPU and the best of those three dedicated graphics? If the results balance well enough between scaling quality and heat/noise with either the iGPU or the discrete GPU, then fine. Alternatively, maybe the upgraded JRVR scaler in the latest JRiver version?
Old 24th May 2022, 22:04   #63134  |  Link
tyner
Registered User
 
Join Date: Aug 2011
Posts: 5
Quote:
Originally Posted by Sunspark View Post
tyner, this is a hobbyist forum and non-official support. This product isn't publicly developed anymore and the developer hasn't posted here in years as he is working on another project now.

For the purpose you indicated of scaling interlaced 480 DVD to 1080p, any device even a little ARM device can do it. Even your monitor can do it. Higher quality settings, sure, cpu and gpu plays a role there, but your source file is only 480i and you'll be able to do a lot with any machine these days, so focus on the other factors that might be good for the other uses of the device. Screen, keyboard, etc.
Agreed, but as with audio, everything counts. Thus, IF the processor in this year's Sony 48" to 55" OLED TV I may go with leaves my best DVDs looking too soft (which may not be surprising, given it has to interpolate loads of missing data from a 480 image to intelligibly fill a 4K screen), then might feeding it a signal that's first scaled to 1080p yield an overall better-looking image?

I'm not, of course, expecting perfection, but I just thought that as I'm looking to buy a fairly muscular laptop for the long term (see last post), why not go for the discrete graphics device that strikes the best balance between scaling performance and power draw?
Old 24th May 2022, 22:09   #63135  |  Link
lvqcl
Registered User
 
Join Date: Aug 2015
Posts: 236
Quote:
Originally Posted by tyner View Post
Agreed, but like with audio, everything counts.
What do you mean?
Old 25th May 2022, 19:29   #63136  |  Link
el Filou
Registered User
 
el Filou's Avatar
 
Join Date: Oct 2016
Posts: 861
Quote:
Originally Posted by tyner View Post
From those three, the 1650 Ti is better on paper, because if you have more shader cores you can run the same workload at lower clocks for better efficiency and lower heat/noise, or run a heavier workload; but as huhn said, with laptops you need to see how the GPU is configured.
You could have one laptop with a 1024-core part but badly designed cooling, and it could get noisier than another with 768 cores running at higher clocks. Some laptop manufacturers also limit the TDP and therefore the performance.

I have a 1050 Ti that is passively cooled in my HTPC, and I can use NGU High to upscale DVDs and even the older HDR tone mapping to watch UHD BDs, but its cooler is massive, so I have no idea how that would translate to a laptop's performance (something lighter like Jinc or Lanczos would be easy, I think).

Last edited by el Filou; 25th May 2022 at 19:39.
Old 27th May 2022, 10:01   #63137  |  Link
flossy_cake
Registered User
 
Join Date: Aug 2016
Posts: 156
Quote:
Originally Posted by tyner View Post
Agreed, but as with audio, everything counts. Thus, IF the processor in this year's Sony 48" to 55" OLED TV I may go with leaves my best DVDs looking too soft (which may not be surprising, given it has to interpolate loads of missing data from a 480 image to intelligibly fill a 4K screen), then might feeding it a signal that's first scaled to 1080p yield an overall better-looking image?
imo the low resolution of DVD seems to benefit a lot from MadVR's sharpening options (processing > image enhancements). I mentioned some sharpening presets I was working on here -- AdaptiveSharpen medium (strength = 0.4) seems to be my current preference if the source is a bit soft.

But the sharpening algorithm of my TV is interacting with it as well, since I don't run 0 sharpness on my TV, so what I'm seeing isn't necessarily what you'll see.

Another factor is whether your PC is outputting 480p or 1080p/2160p. If the former, your TV will probably apply pre-resize sharpening to the 480p image, and those sharpening artefacts will get enlarged when the TV upscales, which tends to make the resulting image look much sharper -- you might not feel the need to add any sharpening in MadVR.

Deinterlacing options will be important as well -- probably best to tick "automatically activate when needed" and untick both "disable automatic source type detection" and "only look at pixels in frame center". This is based on my assumption that MadVR's cadence detection is competent, but I haven't really checked it with enough sources to confirm that yet. DVDs and old content can sometimes have all kinds of janky cadence changes which can trip it up. I only have a few interlaced sources in my collection and I know in advance whether they are film- or video-based, so I add the [deint=Film/Video] tag to the file/folder to force the best deinterlacing mode. By the sounds of it you have a large DVD collection, and tagging them all manually could be too labour-intensive, hence my recommendation to use MadVR's auto detection.

Last edited by flossy_cake; 27th May 2022 at 10:07.
Old 27th May 2022, 19:41   #63138  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,399
auto detection doesn't work. if you have doubts you should use the deinterlacer.
Old 28th May 2022, 08:59   #63139  |  Link
flossy_cake
Registered User
 
Join Date: Aug 2016
Posts: 156
Quote:
Originally Posted by huhn View Post
auto detection doesn't work.
I just checked it again now and you're right -- it doesn't work. It seems to just use video mode deinterlacing the whole time regardless of the source cadence.

Not sure why I didn't detect this earlier. I must have been looking for interlacing "mice teeth", not seeing them, and thinking that meant it was working, when in reality it was just video mode deinterlacing.

So manually tagging parent folders or individual files with [deint=film/video] will be necessary to get optimal deinterlacing. Although this is labour-intensive, it's still much better than having to remember to manually change the setting in the GUI for every file, and it only needs to be done once per file. The fact that it works for parent folders is also a huge time saver. Most content tends to be film-based, so tagging the top-level folder with [deint=film] should suffice. However, for those exceptions where the content is video-based, it is somewhat annoying that tagging the file itself does not override the parent folder's tag, although this can be worked around by placing all video-based files in a separately tagged folder.
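Since it only needs doing once, the bulk tagging can be scripted. A rough sketch of what I'd run (the helper and its extension list are my own invention, not a madVR tool, and I'd check the dry-run output before letting it rename anything):

```python
from pathlib import Path

def tag_for_madvr(root, tag="[deint=film]", dry_run=True):
    """Append a madVR filename tag before the extension of every video file
    under root, e.g. "ep1.mkv" -> "ep1 [deint=film].mkv".
    With dry_run=True nothing is renamed; the planned renames are returned."""
    exts = {".mkv", ".mp4", ".avi", ".ts", ".m2ts", ".vob"}
    renames = []
    for p in sorted(Path(root).rglob("*")):
        if p.suffix.lower() in exts and tag not in p.stem:
            target = p.with_name(f"{p.stem} {tag}{p.suffix}")
            renames.append((p, target))
            if not dry_run:
                p.rename(target)
    return renames
```

Run it once with the defaults to see the planned renames, then again with dry_run=False; for the video-based exceptions, run it on their separate folder with tag="[deint=video]".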

Last edited by flossy_cake; 28th May 2022 at 09:13.
Old 31st May 2022, 15:14   #63140  |  Link
flossy_cake
Registered User
 
Join Date: Aug 2016
Posts: 156
The answer to this is most likely "no", but is it possible to have MadVR forcibly downscale the video resolution?

I've got some old TV shows which are HD 1080p remasters, but the shows were made before HD was a thing, and I'm not sure the artistic intent is quite right in HD. It feels like I'm seeing too much. I suppose an analogy might be running PS1 games in HD -- technically better, but also different. I'd like to try downscaling these shows to around 480p or 576p internally if possible.
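In case it helps show what I mean: the core operation I'm after is just averaging the image back down. A toy numpy sketch (not madVR code -- real scalers use better kernels than a plain box filter, and this would have to happen before madVR's upscaling):

```python
import numpy as np

def box_downscale(img, factor):
    """Downscale a 2D image by an integer factor using block averaging
    (a plain box filter -- real scalers use better kernels)."""
    h, w = img.shape
    h2, w2 = h - h % factor, w - w % factor   # crop to a multiple of factor
    cropped = img[:h2, :w2].astype(np.float64)
    blocks = cropped.reshape(h2 // factor, factor, w2 // factor, factor)
    return blocks.mean(axis=(1, 3))
```

e.g. factor 2 would take 1080p down to 540p, which is in the ballpark of the 480p/576p look I'm after.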
