Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion.

Before you start posting please read the forum rules. By posting to this forum you agree to abide by the rules.

 


 
Old 23rd November 2017, 14:15   #47281  |  Link
foozoor
Registered User
 
 
Join Date: Feb 2012
Posts: 116
It seems that FSRCNNX is better than NGU Standard/Sharp now.
foozoor is offline   Reply With Quote
Old 23rd November 2017, 17:37   #47282  |  Link
njfoses
Registered User
 
Join Date: Feb 2012
Posts: 44
Quote:
Originally Posted by j82k View Post
Isn't there some registry hack or something like that to prevent Windows HDR from automatically turning on, so we can use newer Nvidia drivers?
You can sue the newer drivers but will have to manually turn off HDR after viewing.
Old 23rd November 2017, 17:51   #47283  |  Link
Frexxia
Registered User
 
Join Date: Jan 2017
Posts: 8
Quote:
Originally Posted by foozoor View Post
It seems that FSRCNNX is better than NGU Standard/Sharp now.
What is FSRCNNX? The only thing I can find with google is a 4chan thread about mpv.
Old 23rd November 2017, 17:53   #47284  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,340
Quote:
Originally Posted by Frexxia View Post
What is FSRCNNX? The only thing I can find with google is a 4chan thread about mpv.
https://github.com/igv/FSRCNN-TensorFlow
__________________
LAV Filters - open source ffmpeg based media splitter and decoders
Old 23rd November 2017, 17:59   #47285  |  Link
Xorp
Registered User
 
Join Date: Jan 2009
Posts: 56
Is a dedicated graphics card pretty much always required for NGU upscaling? I'm a fan of the Intel NUCs, but would they prevent me from using NGU Sharp to upscale Blu-rays to 4K, along with a few other features like debanding? Does a Celeron or i3 with an Nvidia 1060 outperform an i7 NUC with no dedicated GPU? I'd like to buy a new NUC for my next HTPC, but if an i7 version is required for NGU and other features, or if it isn't possible with any NUC, I'd like some suggestions.

Last edited by Xorp; 23rd November 2017 at 18:02.
Old 23rd November 2017, 18:18   #47286  |  Link
sat4all
Registered User
 
Join Date: Apr 2015
Posts: 62
It's not possible with any current NUC.
I ditched mine and got a Zotac ZBOX EN1060K; NGU Sharp, HDR, etc. work flawlessly.
__________________
ZOTAC MAGNUS EN1060K: Win 10 x64 + Kodi DSPlayer x64
LG OLED65C8 / Denon AVR-X3200W / KEF E305+ONKYO SKH-410 / Synology DS2415+ / Logitech Harmony 950
Old 23rd November 2017, 20:08   #47287  |  Link
cccleaner
Registered User
 
Join Date: Oct 2017
Posts: 1
388.13 back to 385.69

Quote:
Originally Posted by clsid View Post
I have heard that this problem is going to be fixed in the next Win10 cumulative update (next week).

Until then you can fix it by "disabling fullscreen optimizations" in the Windows compatibility settings of your player.
Is that working? I just downgraded the Nvidia drivers once again (388.13 to 385.69) to use HDR in my MPC-BE, LAV, madVR setup. With anything since 385.69 I have had no success getting 4K HDR content to run in fullscreen with this setup on Windows 10 64-bit with an Nvidia 1070. Windows crashes incredibly hard when going fullscreen; only a forced Task Manager (somehow) brings the OS back alive, otherwise it's a reset and hardware reboot.

Any feedback before I start another driver odyssey would be much appreciated :-)
Old 23rd November 2017, 20:50   #47288  |  Link
Ver Greeneyes
Registered User
 
Join Date: May 2012
Posts: 447
Quote:
Originally Posted by cccleaner View Post
Any feedback before I start another driver odyssey would be much appreciated :-)
Not sure, as I don't use fullscreen myself, but the cumulative update notes do say "Addressed issue that causes a black screen to appear when you switch between windowed and full-screen modes when playing some Microsoft DirectX games."
Old 23rd November 2017, 22:32   #47289  |  Link
j82k
Registered User
 
Join Date: Jun 2017
Posts: 155
Quote:
Originally Posted by njfoses View Post
You can sue the newer drivers but will have to manually turn off HDR after viewing.
The problem is I can't get HDR working properly at all with any driver above 385.xx.
All I get is Windows HDR automatically turning on, which results in messed-up colors.
Old 23rd November 2017, 22:33   #47290  |  Link
dRumMzZ
Registered User
 
Join Date: Nov 2017
Posts: 5
Quote:
Originally Posted by Oguignant View Post
Is there any way to block the automatic Nvidia driver updates that Windows Update installs? They drive me crazy with the new version that doesn't work with madVR/HDR.
What is the last version that you think works with madVR? And what exactly stopped working for you?
I'm having a problem getting madVR to play a Blu-ray in the "top and bottom" format.
Old 23rd November 2017, 23:11   #47291  |  Link
ashlar42
Registered User
 
Join Date: Jun 2007
Posts: 652
Quote:
Originally Posted by njfoses View Post
You can sue the newer drivers but will have to manually turn off HDR after viewing.
If only we could sue them...

(Joking, no grammar nazi, just joking)
Old 23rd November 2017, 23:30   #47292  |  Link
Polopretress
Registered User
 
Join Date: Sep 2017
Posts: 46
Quote:
Originally Posted by Soulnight View Post
Hello all,

Same question here.
I created a *.bat file that calls my Harmony hub to control my projector.
The *.bat works on its own, but I can't seem to make it work with this madVR function.

The idea is genius though...

Any ideas how to make it work?
Is the functionality even "functional" ?

Thank you to anyone able to give a hand on this topic.
Florian
I agree. It would be great to make the "command line to execute when this profile is activated/deactivated" option work.
Old 24th November 2017, 00:33   #47293  |  Link
steakhutzeee
Registered User
 
 
Join Date: May 2015
Posts: 225
I know this is nooby, but I have two questions. I use MPC-BE, LAV and madVR.

Now, I think this involves some sort of zoom setting in madVR. I'm watching a movie right now on a 16:9 monitor, and the movie is 1.85:1, so I have small black bars on the top and bottom, just a few pixels.

1- Some scenes show footage from an old camera: just a vertical image with black bars on the left and right. During these scenes the video zooms to fit the screen at exactly 16:9, then switches back when the scene ends. Why? Which options control this?

2- In madVR's zoom control, can you explain the option "disable scaling if image size changes by only:"? I'm a newbie :/

Thanks!
__________________
Intel i5-4590 - MSI R9 270X 2GB - 8GB RAM
Old 24th November 2017, 02:47   #47294  |  Link
heiseikiseki
Registered User
 
Join Date: Jan 2015
Posts: 37
Quote:
Originally Posted by Xorp View Post
Is a dedicated graphics card pretty much always required for NGU upscaling? I’m a fan of the Intel NUCs, but would they prevent me from using NGU Sharp to upscale Blu-rays to 4K, with a few other features like debanding? Does a Celeron or i3 with a Nvidia 1060 out perform a i7 NUC with no dedicated GPU? I’d like to buy a new NUC for my next HTPC, but if a i7 version is required to use NGU and other features, or if it’s not possible with any NUC, I’d like some suggestions.
Wait for the Intel i7-8809G.
According to benchmarks, it is as powerful as a GTX 1050 Ti.
And I think Intel is bound to use it in the new NUC.
Old 24th November 2017, 03:10   #47295  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
which is rumoured to be a Polaris part, so you should wait and see if it has the same problems as Polaris.

and to be totally honest, I'm not sure it's possible to cool these chips silently in a NUC form factor case. The GPU alone will have a TDP of 60-100 watts; add the CPU to that and you have quite a hot chip...

a G4560 and a 1050 Ti is a cheap solution for a working HTPC.
Old 25th November 2017, 13:56   #47296  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,140
Quote:
Originally Posted by amayra View Post
I have a problem: madVR performance drops every time I turn on "use Direct3D 11 for presentation (Windows 7 and newer)". I get glitches and repeating frames, and to solve this I need to reduce quality options, while the same settings work flawlessly with D3D9.
This happens on an HD 4000 with an i7-3770.
Quote:
Originally Posted by amayra View Post
Weird, it works fine in MPDN; this happens only with madVR. Some old version worked fine for me, but I don't remember which one.
madVR currently always renders in D3D9. So if you tell it to use D3D11 for presentation, it has to use 2 different D3D devices (one v9 and one v11) and the devices have to share their textures. This works well for AMD and Nvidia, but the Intel driver doesn't seem to like it very much. I'm not sure what MPDN does internally, maybe it also renders with D3D11, then it doesn't have to do the texture sharing.

In some future version I'm going to switch to D3D11 everywhere; then I won't have to share textures anymore, which will probably improve the situation for Intel GPU users. However, this will mean losing compatibility with XP and Vista.

Quote:
Originally Posted by mzso View Post
Will you still look into the freeze reports, or is it forgotten already?
Have you read the first sentence in the v0.92.9 and v0.92.8 release notes? It seems you haven't.

Quote:
Originally Posted by oldpainlesskodi View Post
Unless I am going blind, I can see other changes in the PQ beyond the new RCA....
There shouldn't be any changes except RCA.

Quote:
Originally Posted by Cinemancave View Post
On another note - I saw some comparison screenshots of blu-ray upscaled with NGU compared to UHD bluray that only has 2K Digital Intermediate - and the NGU ones were clearly better. So congrats Madshi - you officially beat Hollywood.


It's not a big surprise, though. I've been told that often UHD Blu-Rays are simply upscaled using Catmull-Rom. It's not really hard to beat that.
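For context, the Catmull-Rom upscaling madshi mentions is a plain bicubic kernel (the B=0, C=0.5 case of the Mitchell-Netravali family). A minimal sketch of how such a resampler works, illustrative only and not madVR's implementation:

```python
def catmull_rom(x):
    """Catmull-Rom weight for a sample at distance x (bicubic, B=0, C=0.5)."""
    x = abs(x)
    if x < 1.0:
        return 1.5 * x**3 - 2.5 * x**2 + 1.0
    if x < 2.0:
        return -0.5 * x**3 + 2.5 * x**2 - 4.0 * x + 2.0
    return 0.0

def upscale_1d(samples, factor):
    """Upscale a 1D signal by an integer factor using Catmull-Rom weights."""
    out = []
    for i in range(len(samples) * factor):
        src = i / factor  # position in source coordinates
        acc = 0.0
        for j in range(int(src) - 1, int(src) + 3):  # 4-tap support
            s = samples[min(max(j, 0), len(samples) - 1)]  # clamp at edges
            acc += s * catmull_rom(src - j)
        out.append(acc)
    return out
```

At integer source positions the weights collapse to (0, 1, 0, 0), so original samples pass through unchanged; everything in between is just a weighted blend of 4 neighbours, which is why it cannot reconstruct detail the way NGU does.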

Quote:
Originally Posted by scollaco View Post
Now I also use madVR for all my personal movie playback, and I'm wondering what output to set. I have an Intel NUC7i5BNK with Windows 10; I set the graphics card to "Full Range", madVR output to RGB (0-255), and my projector to Auto. Everything looks correct to me in terms of black levels etc., but I'm wondering if what I have set is the best possible setting to avoid banding and any other issues. Currently my projector reports

3840x2160/24p
RGB
BT.709


I also have the option on the graphics card to set YCbCr. What is the BEST output to the projector through madVR with my current HTPC?
Definitely RGB. Make sure you DON'T disable dithering in the madVR settings. Also, if I may suggest, run madLevelsTweaker to force your GPU into 0-255 mode. The manual option does work, but on my PC at least there's still some banding; it goes away completely for me when forcing the GPU into 0-255 through madLevelsTweaker. This only applies to Intel GPUs, though.
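On why the dithering advice matters: quantizing to fewer bits without dithering rounds neighbouring gradient levels to the same output value, which shows up as banding; dithering trades that for fine noise that averages out to the right level. A toy illustration using a 2x2 Bayer matrix (madVR's actual ordered-dithering and error-diffusion algorithms are more sophisticated than this):

```python
BAYER_2X2 = [[0, 2],
             [3, 1]]  # classic 2x2 ordered-dither thresholds

def quantize(img, bits, dither=True):
    """Reduce 8-bit pixel values to `bits` bits, optionally with ordered dithering."""
    step = 255 / (2**bits - 1)  # distance between output levels
    out = []
    for y, row in enumerate(img):
        out_row = []
        for x, v in enumerate(row):
            if dither:
                # add a position-dependent offset in [0, step) before truncating
                v = v + (BAYER_2X2[y % 2][x % 2] + 0.5) / 4 * step
            level = min(int(v / step), 2**bits - 1)
            out_row.append(round(level * step))
        out.append(out_row)
    return out
```

Without dithering, a flat patch of value 100 quantized to 2 bits becomes a uniform 85; with dithering, some pixels land on 170 instead, so the local average stays close to 100 even though each pixel is coarser.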

Quote:
Originally Posted by Rippner View Post
Hi! Which GPU do I need to watch UHD movies with madVR default settings? I have an i5-3450.
Is there a big difference between the minimum and maximum settings of madVR?
First of all you need a GPU which can decode HEVC in hardware, which means Polaris or newer, or Pascal or newer. I think Kaby Lake also works. This is just for *decoding*. In LAV Video Decoder choose DXVA copy-back, DXVA native, or D3D11 (requires a nightly LAV build).

Next step is rendering: You can make a Kaby Lake GPU render UHD movies smoothly, but I'm not sure if it works with the default settings, you may have to tweak it a little to get it smooth. Any Polaris or Pascal GPU should have no problem with the default settings.

Difference between min and max settings in madVR is relatively small if you don't need to upscale, and if you don't have a need to use algorithms like "reduce compression artifacts". When doing upscaling, or when playing heavily compressed videos, higher madVR settings can bring a very noticeable improvement.

Quote:
Originally Posted by Ed Riffles View Post
I have been experiencing an issue with madVR where the render queues and present queues do not fill when using madVR to upscale to 3840x2160p, which leads to frame drops.
I have tried reinstalling LAV Filters, MPC-HC, and madVR multiple times to fix the issue. When that did not work, I tried DDU-ing my display drivers and installing an older version; that did not work either.
The only time I was able to get the render and present queues to fill was when I disconnected my second display from my 980 Ti. The queues then went up to 8/8 and there were no more frame drops. But I do not want to have to disable my second monitor in order to upscale movies and television with madVR.
I then tried running the second monitor off the integrated GPU (an HD 630); with that, the render and present queues reached 3-5 out of 8 while upscaling, but frames were still dropping. That was not satisfactory. As of right now I am unable to fix this.
OS: Windows 10 Pro, latest. Hardware: Intel Core i5-7600K at 4.0 GHz, 980 Ti at stock clocks, 16 GB of RAM.
Software: Madvr version 0.92.7 (settings https://i.imgur.com/17BC4YZ.png https://i.imgur.com/nt4rruu.png https://i.imgur.com/BYsdkBp.png https://i.imgur.com/PDNgB44.png https://i.imgur.com/0b0sWfH.png https://i.imgur.com/knvDYar.png https://i.imgur.com/4xbll0A.png https://i.imgur.com/3D267hA.png ), Lav Filters 0.71.0 (video decoder settings https://i.imgur.com/z98BYcH.png), MPC-HC 1.7.13.112
As ryrynz suggested, try using Adaptive power settings. If that doesn't help, your comments with the secondary monitor make me think that maybe enabling automatic fullscreen exclusive mode might fix the problem for you.

Quote:
Originally Posted by TheProfosist View Post
@madshi I found a bug with the screenshots. I have 2x render resolution set for screenshots and crop enabled. The source is a BD with 4:3 video, pillarboxed at 1080p. So the video is 1920x1080 and it's then cropped to ~1440x1080, but the screenshots get saved as 3840x2160. Since the image is cropped, it ends up being stretched abnormally. Let me know if you need/want an example screenshot.
Yes, I'm aware of the problem. To be fixed in a future version. You can work around it for now by disabling the "crop black bars" option.

Quote:
Originally Posted by Cinemancave View Post
"Out of Memory bug"
On another note, I wanted to play a 3D movie yesterday, and when I went back to watching 2D I got the red "Out of Memory" message when entering fullscreen. I've searched and found that this is a problem people have been having for years - but I cannot find a satisfactory solution? If I turn off stereoscopic 3D under Nvidia CP, the bug goes away. But then I have to always run the 3D Wizard before I play a 3D movie, and I would really like to avoid having to do that. Btw, I have 24 GB RAM and 11 GB VRAM, so that can't be a problem. I am running Win 10 with Creator's update. Does anyone know how to fix this? There is already an old bug report about this, so I don't want to create a new one if this has already been fixed.
Have you tried disabling 3D in the Nvidia CP, and telling madVR to auto enable/disable 3D for you for 3D movies?

Quote:
Originally Posted by VAMET View Post
Nope, I don't think it is an HDMI-related issue. I have good quality cables and devices. And it only happens when 10-bit is set in madVR and 12-bit in the NVIDIA Control Panel (there is no 10-bit option there).
If it only happens when you activate 12-bit in the Nvidia Control Panel, then that's a very strong indication that it's most probably not madVR's fault. It could be anything, from your HDMI cable, the HDMI output port of your GPU, or the input port on your TV or receiver not being able to handle the higher bandwidth reliably, to the pixel clock happening to have a frequency which makes your TV think it should resync once in a while (though I'm not sure why that would only apply to 12-bit). Do you see any frame drops/repeats or presentation glitches in the madVR OSD at the moment this happens (or directly afterwards)?

Quote:
Originally Posted by Anima123 View Post
RCA medium plus NGU Standard or Sharp produces a very pleasant visual effect when playing good 720p material, which is amazing.

I used to avoid NGU Standard or Sharp due to the unnatural, overly sharpened edge effect.
Glad to hear that!

Quote:
Originally Posted by jkauff View Post
I'm watching one of my noisiest Blu-ray movies, Red River. Render times got too high using both RCA and RRN at 3, and I didn't see very much effect anyway (rig is GTX 1080 driving a 4K monitor), so I disabled RCA and upped RRN to 6. The effect is amazing, almost all the noise is gone and the picture looks great (maybe a slight loss in detail, haven't played around with that yet).

Excellent work, madshi! It's not quite up to NLMeans denoising, but it's in real time! madVR just gets better and better. Congratulations on yet another great algorithm.

EDIT: Applying Crispen Edges and Enhance Detail while doubling restored the detail.
Glad to hear you like it! Would you mind posting a little comparison (maybe one or two images) that show how NLMeans is better than RRN? Just for my interest.

You may want to try using RCA instead of RRN, it seems users like RCA better for some reason. But if your main goal is to reduce grain, then I suppose RRN might be the better choice.

Quote:
Originally Posted by Neo-XP View Post
I tested AS again, with and without LL, and for films at least LL is not good: for the same sharpness effect there is a lot more ringing and dark halos with LL on, and no other improvements. For instance, I had to increase the AS value to 0.6 with LL on to get (almost) the same sharpness as AS 0.2 without LL.
Well, LL on/off sharpen different kinds of edges with different strengths, so in some parts of the image LL off will sharpen more than LL on, and vice versa. I'm surprised you got halos, though. Do you happen to have a couple good test images where I can see why you prefer LL off? That might help with development.

Quote:
Originally Posted by Neo-XP View Post
Ok so I did some tests with a true UHD source (Passengers). I have found the UHD image here : http://madvr.com/doom9/passengers/PassengersUHD.png

When comparing the two upscaled images to the UHD one, the old NGU Sharp image is closer to the UHD one because the little details are better processed and there are fewer artifacts (ringing, dark halos, aliasing, etc.) around the edges.
Of course, you need to zoom quite a lot to see this.

Also, there is a 5.28% difference between the UHD image and the image upscaled by the old NGU Sharp, and 5.40% with the new NGU Sharp. Tested here: https://huddle.github.io/Resemble.js/
For comparison, there is a 6.11% difference between the UHD image and the FHD image upscaled to UHD by Lanczos3 AR.

Now I guess the goal is to find the settings that match the UHD image with the smallest % difference, across a lot of images.
I do consider going back to the old NGU Sharp. FWIW, a lot depends on whether the studio downscaled the 4K master to 1080p in linear light or gamma light. It seems that for Passengers, the studio used gamma light. But that might not be the case for all 1080p Blu-Rays. Which is the key reason why I changed NGU Sharp.
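For anyone wanting to reproduce Neo-XP's numbers: Resemble.js reports a percentage of mismatching pixels. A rough stand-in for that kind of metric (an assumption on my part; Resemble.js compares per-channel with its own tolerance, so the figures won't match its output exactly):

```python
def percent_difference(img_a, img_b, tolerance=16):
    """Percentage of pixels whose values differ by more than `tolerance` (0-255 scale)."""
    if len(img_a) != len(img_b) or len(img_a[0]) != len(img_b[0]):
        raise ValueError("images must have the same dimensions")
    total = mismatched = 0
    for row_a, row_b in zip(img_a, img_b):
        for a, b in zip(row_a, row_b):
            total += 1
            if abs(a - b) > tolerance:
                mismatched += 1
    return 100.0 * mismatched / total
```

Running a metric like this over many frames, rather than a single screenshot, would give a fairer picture of which NGU variant tracks the UHD master more closely.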

Quote:
Originally Posted by x7007 View Post
When do I need to use P010 10-bit in LAV Filters?

Are there specific movies? Do I need to unselect all the 8-bit output formats in LAV Video Decoder?
You should leave everything checked, which is the default LAV setting. LAV and madVR will communicate and automatically pick the best format.

Quote:
Originally Posted by Ver Greeneyes View Post
I know you posted this a while ago, but I'm curious about something: have you considered adding some sort of (opt-in) pixel art resolution detection? Gameplay videos for older games usually aren't released in the original resolution, but rather 720p or 1080p, with nearest neighbor upscaling on each pixel already encoded into the video. For NGU AA upscaling to do anything you'd first have to determine the original resolution and 'downscale' to that resolution. I think it would be very interesting to see how something like that would look. I guess one could use an AviSynth script to do this as a preprocessing step, but you'd have to either tailor it to the video or do the detection yourself.
Technically possible, but only useful for max 1% of the madVR users (probably far less), so not something I plan to invest time into at this time.

Quote:
Originally Posted by scollaco View Post
I'm having trouble getting HDR to kick in through madVR.

- I have a Sony 4K Projector (does HDR10)
- An Intel NUC7i5BNK (I installed the latest video drivers, released 2 days ago)
- I set "HDR and Advanced color" to ON in Display settings (since I don't have an Nvidia graphics card)

In madVR, I've tried every setting you can imagine. HDR is set to pass-through to the display, with the metadata checkbox turned on. In general settings, FSE mode ends up showing black. If FSE is turned off, the video is just washed out. I've tried using D3D11 and no change.

NOW...If I load the movie through Windows 10 movie app, HDR works instantly and looks great! What setting in madVR have I not tried?
I also set LAV filters to the default settings...(Ordered Dithering, bit untouched as input, DXVA etc....)

Thanks for any help provided. It's great that HDR is working through the windows app but I want it through the amazing madVR
Does your projector report if it's in SDR or HDR mode? Does it clearly say it's in HDR mode when you use the Windows 10 movie app? I'd recommend that you switch madVR to "let madVR decide" instead of "passthrough".

Quote:
Originally Posted by Paul Tronc View Post
My new projector is not good at switching between SDR and HDR. Is it possible to have madVR "upscale" all content to 4:4:4 10-bit HDR? I couldn't manage this; I couldn't even upscale from 4:2:0 to 4:4:4. My projector can handle 4:4:4 10-bit HDR content, and I'm using the latest Windows 10 update with the HDR feature (I don't know if that changes anything).
You can set the OS switch "HDR and Advanced Color" to "on". That way your projector will always receive HDR. It's not my cup of tea, though; it's not really good for SDR playback quality.

Quote:
Originally Posted by maiden View Post
Please excuse me if I am in the wrong area.

I am needing help with Madvr settings.
I use JRiver on a Windows 10 machine: Intel Core i5-4590 CPU (Haswell, 3.3 GHz), Asus Z97-Deluxe ATX motherboard, Nvidia GeForce GTX 750 Ti, Onkyo TX-NR925 receiver, and an LG LF6300 55" smart TV.

I have madVR at defaults, but I am getting an odd twitch in 23.98 material.
Can someone help with settings to try to eliminate this annoyance?
Can you describe the "odd twitch" in more detail, please? And does the madVR OSD (Ctrl+J) show any increased frame drop/repeats or presentation glitches in the moment when this "odd twitch" happens?

Quote:
Originally Posted by MokrySedeS View Post
@madshi, can you improve black bar detection to include vertical videos?
Spotify started releasing them recently and madVR isn't detecting black bars at all for me.
Here's an example: https://www.youtube.com/watch?v=pz95u3UVpaM
And here's a 30s sample: http://www13.zippyshare.com/v/PtlV6yqH/file.html
Argh. It's possible, and not hard to do, but madVR would have to scan a much larger area of the video to detect this, which means reduced performance. I'm not sure if it's worth it at this point in time where the detection still runs on the CPU. Maybe when I move the algo to the GPU?
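For the curious, black bar detection boils down to scanning rows (and, for vertical videos like these, columns) of the luma plane for near-black content, which is why a larger scan area costs performance. A simplified sketch of the horizontal-bars case, not madVR's actual detector:

```python
def detect_black_bars(luma, threshold=18):
    """Return (top, bottom) counts of black-bar rows in a 2D luma plane.

    A row counts as 'black' if every sample is at or below `threshold`
    (16 is video black, so allow a little headroom for noise).
    """
    def is_black(row):
        return max(row) <= threshold

    top = 0
    while top < len(luma) and is_black(luma[top]):
        top += 1
    bottom = 0
    while bottom < len(luma) - top and is_black(luma[-1 - bottom]):
        bottom += 1
    return top, bottom
```

Handling vertical videos would mean running the same scan over columns as well, roughly doubling the area that has to be read per frame.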
Old 25th November 2017, 14:00   #47297  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,140
Quote:
Originally Posted by ABDO View Post
After a quick look, I think RCA high preserves some image details a little more noticeably than RCA medium, and the quality difference becomes noticeably bigger if we add some sharpening after image upscaling.

RCA high is definitely a nice quality option for anyone who puts quality before anything else.

edit:
On a very bad real-world source, RCA high treats edge lines better than RCA medium.
Quote:
Originally Posted by ryrynz View Post
Didn't see much of a difference at low strengths.
Quote:
Originally Posted by Neo-XP View Post
Option "high" looks a little better, but needs more GPU resources than I have to spare for >= 720p sources.
Quote:
Originally Posted by Werewolfy View Post
At low strengths I can't see a difference. At high strengths, I can see a difference but I can't determine if it's really better. At very high strengths it's a little bit better.
Thanks for the feedback, it matches my own test results. I wonder if I should maybe remove the "quality" option and auto pick, based on the strength?

Quote:
Originally Posted by Gopa View Post
RCA: I almost always end up using strength 10 / high quality. Lower or higher strength, or medium quality, is usually not very effective for most of my anime (mostly average quality). The slower render time is almost always worth it with the high quality setting.
RCA strength 10 / high quality (more important to me than high luma settings). Thank you!
You will love the next madVR version!

Quote:
Originally Posted by d3rd3vil View Post
So what's the most realistic outcome atm regarding Dolby Vision?
As long as no splitter/decoder delivers the DV information to madVR, the definite outcome is that it will never be supported. So it doesn't make any sense to bug me in this thread about it. If you want DV to be supported, first a splitter/decoder would have to deliver the information to madVR!

Quote:
Originally Posted by ryrynz View Post
When the average madVR user starts using AviSynth processing filters and madshi adds built-in madVR configs for them for ease of use, things are gonna go nuts.
Just using the GPU sometimes won't be enough; we're already seeing the best cards taxed to the limit with UHD content. This will let people get the most from their systems' combined processing power.
FWIW, CPUs are great at a lot of things, but they're not great at image processing. Most image processing algos that a CPU would run at near 100% usage in AviSynth would be a complete piece of cake for any budget GPU. Because of that, adding AviSynth support would IMHO not really help much with performance. It would help with getting access to more algorithms, though. I simply can't port all AviSynth algos to madVR!

Quote:
Originally Posted by rancorx2 View Post
Does anyone know how to keep the video from freezing after pausing for a while and then trying to resume?

It doesn't happen with EVR, but whenever I pause a video with madVR as the renderer, playback does not resume, or it takes a few minutes for the video to continue; the audio plays but the video doesn't start right away. Most of the time I have to close MPC and reopen the video.
Haven't seen this yet, but then I don't usually keep video paused for a long time. I'm aware of problems if the PC goes to sleep/suspend, but not when pausing a long time. Anyway, it doesn't really sound like a showstopper problem, so it's not high priority for now.

Quote:
Originally Posted by yukinok25 View Post
Hi madshi, I keep having crashes every time I move the seekbar back and forth; it started happening around 0.92.3/4, if I recall correctly.

When you have time, can you please check my crash report?:

http://www.mediafire.com/file/t7ahmp...ash+report.txt

Thank you very much!
It looks like a crash in the Nvidia OpenCL driver. Try using NGU-Anti-Alias instead of NNEDI3. That should fix the problem. NGU AA looks better than NNEDI3 IMHO, anyway, and is faster at the same time.

Quote:
Originally Posted by sat4all View Post
Any plans in the near future to move black bar detection from the CPU to the GPU? With HDR content, copy-back decoding puts too much load on the CPU: around 70% in my case (i5-7500T).
It's planned for the future, I don't really give out ETAs.

I can see copy-back being problematic for 4Kp60, but shouldn't it still work ok for 4Kp24?

Quote:
Originally Posted by nevcairiel View Post
That's mostly Windows' fault; stick to 8.1 and everything works perfectly, even with brand-new drivers.
YES!

Quote:
Originally Posted by Epedemic View Post
That aside, I have a question: can anyone point me in the direction of a guide/some settings for madVR/LAV for passing through 2160p/HDR content with as few GPU-taxing enhancements and features as possible? The GT 1030 is *barely* able to play the content without dropping frames, but it is VERY sensitive to any background activity on the PC; 60fps content especially will drop frames now and then. (I use MediaPortal 1.18 as my preferred player. Kodi DSPlayer seems a little better in performance, but is not really what I'm looking for, as I have a lot of stuff, especially TV, which works much better on MediaPortal.)
Have you tried setting LAV (nightly) to "D3D11" decoder? Maybe it helps.

Quote:
Originally Posted by mclingo View Post
I've switched to an NVIDIA 1050; no crashes, but I can't get rid of frame drops every 3-4 minutes, whatever I do.
Please upload a screenshot of the madVR OSD (Ctrl+J) which shows the frame drops to some image sharing site, then link to it here. Maybe we can see something.

Quote:
Originally Posted by Fabulist View Post
Hello people. Is there a consensus regarding the anti-bloating filters on image enhancements/sharpeners? For example, does 100% anti-bloating mean 100% elimination of any possible bloating, 150% mean it could over-compensate, and 50% mean half the anti-bloating required? Is there a safe value?

Is that how it works, or what is the logic behind it? I'm asking because I'm noticing an inconsistency between sharpeners, especially with sharpen edges: the filter does not seem to bloat at all, even at 4.0 on HD and above sources, so activating anti-bloating simply kills quality without any apparent bloating to repair.
Some sharpeners bloat more than others. E.g. "crispen edges" doesn't bloat by design, but LumaSharpen does. Using anti-bloating removes the bloating, but that also reduces the overall sharpening effect, so you may have to increase the sharpening strength when using anti-bloating. Using 100% seems like a safe choice, but you can pick any value that looks good to your eyes.

Quote:
Originally Posted by mark0077 View Post
I just dug up some old music DVDs and have been trying to play them, but menu navigation is crashing MPC, even just doing previous/next chapter (currently testing with the Queen Greatest Hits I DVD). When I switch to another video renderer like VMR9 it's fine and I can't reproduce the crashes. I'm using MPC-BE, LAV Filters, ffdshow video + AviSynth + SVP. With madVR as the video renderer, MPC crashes and Windows starts reporting the issue to Microsoft.
Do you have a secondary PC to test this on, just to rule out some sort of misconfiguration or corrupted installation on your PC? You could try uploading the files responsible for the DVD menus; then maybe other users (and nevcairiel and I) can try to reproduce it.

Quote:
Originally Posted by Patrik G View Post
Does madVR downconvert HDR to 8bit and also to the sRGB colorspace when you select "convert HDR content to SDR"?

When I compare the same HDR content with madVR to my K8500 UHD player, I can see that colors are more saturated with madVR, but I can also see that the content has the usual 8bit banding in skies and such.

My TV covers 89% of the DCI-P3 colorspace.
Isn't it possible to set custom color points x and y for HDR that madVR can convert to, instead of the smaller sRGB colorspace?
That way I could use the full color performance of the TV with HDR.
Also, leaving the content's 10bit precision alone would be a great idea.
madVR does what you ask it to do. If you use "convert HDR content to SDR by using pixel shader math", and if you also set the "calibration" tab to "this display is already calibrated", then madVR will convert the HDR content to the gamut and transfer function you selected in the calibration tab. That can be BT.709 or DCI-P3 or BT.2020, whatever you choose.

Banding should not occur, as long as you don't disable madVR's dithering (which you should NEVER ever do). If you still see banding then that would indicate that something goes wrong either during decoding or scaling. I'd suggest to try disabling anything DXVA in that case, as a first test.
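To illustrate why disabling dithering produces banding, here is a minimal sketch of quantizing a shallow 10-bit gradient down to 8-bit. The plain random dither used here is an assumption for clarity; madVR's dithering (ordered / error diffusion) is far more sophisticated:

```python
import numpy as np

def quantize_10_to_8(values_10bit, dither=True, rng=None):
    """Reduce 10-bit code values to 8-bit, optionally with random dithering.

    Rounding alone turns a smooth gradient into a few hard steps (banding).
    Adding noise of +/- half a step before rounding trades the banding for
    fine-grained noise while preserving the average level, which is the job
    of a renderer's dithering stage.
    """
    scaled = values_10bit / 4.0                      # 10-bit -> 8-bit scale
    if dither:
        rng = rng or np.random.default_rng(0)
        scaled = scaled + rng.uniform(-0.5, 0.5, size=scaled.shape)
    return np.clip(np.round(scaled), 0, 255).astype(np.uint8)

# a shallow 10-bit gradient that spans only two 8-bit steps
gradient = np.linspace(400, 408, 10000)
banded = quantize_10_to_8(gradient, dither=False)
dithered = quantize_10_to_8(gradient, dither=True)

# without dithering: just 3 flat levels with 2 hard transitions (bands)
print(np.unique(banded), np.count_nonzero(np.diff(banded.astype(int))))
# with dithering: the same levels, but mixed so the mean tracks the gradient
print(abs(dithered.astype(float).mean() - gradient.mean() / 4.0))
```

The dithered output uses no extra levels; it only distributes them so the local average follows the original gradient, which is why turning dithering off immediately reveals 8-bit steps.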

Quote:
Originally Posted by ipanema View Post
This is probably a stupid question. I know that madVR isn't so difficult to install, but is there a particular reason why it doesn't have an installer (such as NSIS) that might make it even easier for novices to install/uninstall?
I don't like installers. I will offer one when madVR reaches v1.0, but probably not before.

Quote:
Originally Posted by Braum View Post
I've tried with DXVA native and also with bilinear chroma and bicubic luma, same results. The stutter seems to be shorter with bilinear/bicubic, but I'm not sure.

Nice catch, the refresh rate shouldn't be at 68Hz, it should be a steady 71.928Hz. I use CRU to overclock my screen to 72Hz; I've tried putting it back to 60Hz and the problem persists.

The GPU is between 47 and 49% during playback and the GPU temperature is a constant 65°C (fan speed at 25%). Judging by these stats, the problem shouldn't be due to a lack of GPU power.
The decoder queue gets empty, which shouldn't have anything to do with GPU *shader* power. Can you also make a screenshot when this problem occurs with DXVA native? Is it again the decoder queue that gets empty in that situation?

You can try LAV nightly with "D3D11" decoding as an alternative.

Quote:
Originally Posted by Anima123 View Post
I am still confused about the IVTC in madVR, which I would like to use to convert a 59.976 fps progressive video into a 23.976 fps sequence.

Since the Ctrl+J screen doesn't show the output status, all I know is that IVTC is activated and the cadence it detects is 2:2. If the source is 59.976 fps progressive video, what exactly do I get in the end?

If it doesn't work as I expected, is there a way (a how-to maybe?) to get there?

Edit: In other words: can madVR's IVTC treat 59.976 fps progressive video as 30i sequences?
If it detects 2:2 that means that madVR wasn't able to detect a 3:2 cadence in the video. So it seems to be truly 59p and not 23p -> 59p. Are you sure it's really 23p with duplicate frames to move it to 59p?
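A rough sketch of what such a cadence check can look for: if 59p footage is really 23p with 3:2-repeated frames, most frame-to-frame differences are near zero. This is a hypothetical toy detector, not madVR's implementation:

```python
import numpy as np

def detect_frame_repeats(frames, threshold=1e-6):
    """Return a list of booleans: True where a frame duplicates its predecessor.

    A 23.976p source carried in a 59.94p stream repeats frames in a 3:2
    pattern (AAABB CCCDD ...), so roughly 60% of frame-to-frame differences
    are near zero. A true 59p source shows motion on almost every frame.
    (Illustrative sketch only; real cadence detectors must tolerate noise,
    compression artifacts and broken cadences.)
    """
    return [np.abs(frames[i] - frames[i - 1]).mean() < threshold
            for i in range(1, len(frames))]

rng = np.random.default_rng(0)

# synthesize 23p "content" pulled down to 59p with a 3:2 repeat pattern
content = [rng.random((4, 4)) for _ in range(8)]          # unique source frames
pulldown = []
for i, frame in enumerate(content):
    pulldown.extend([frame] * (3 if i % 2 == 0 else 2))   # 3,2,3,2,... copies

# a genuine 59p clip: every frame is different
truly_59p = [rng.random((4, 4)) for _ in range(20)]

repeats = detect_frame_repeats(pulldown)
print("pulldown repeats:", sum(repeats), "of", len(repeats))  # -> 12 of 19
print("true 59p repeats:", sum(detect_frame_repeats(truly_59p)))  # -> 0
```

A 2:2 detection, by contrast, means no such repeat pattern was found, which is why madVR then treats the stream as genuinely 59p.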

Quote:
Originally Posted by actarusfleed View Post
I don't think so... because one year ago I discovered it on my OLED TV (Metz).
Today I performed this test on my JVC DLA-X7500 projector (precalibrated with my DVDO TPG) and the problem is the same as one year ago.

Unfortunately not.

One minute ago I did another test...
I checked the secondary colors without using madTPG. I used a Blu-ray test pattern disc reproduced through madVR... the secondary colors are OK.
Cyan is perfect.

So I'm starting to think that the madVR TPG has some problems when it has to work with an Nvidia card outputting 2160p + 12bit...

Can you do some tests, please?
If 12bit causes problems, why not stick to 8bit instead? It's not a dramatic loss; madVR has very high dithering quality.

It's always somewhat problematic if the GPU driver does some modifications behind madVR's back. Direct3D doesn't support 12bit rendering. I can only render at 8bit or 10bit, but not at 12bit. Now if the Nvidia GPU driver stretches madVR's pixels from 8bit or 10bit to 12bit, that *usually* shouldn't produce problems, but who knows?

Hmmmm... Have you tried switching your projector (and madVR) to 0-255 instead of 16-235? Maybe it works better that way?

Quote:
Originally Posted by Blackwalker View Post
after more test same situation...
looks like nobody here have an answer for me or maybe there isnt
My impression is that you have a tendency to ignore (or accidentally miss) what people say. It has been suggested that you use an older Nvidia driver (e.g. 385.2x), but the last time you reported your Nvidia driver version, it was still 387.xx or 388.xx. Also make sure the OS option "HDR and Advanced Color" is turned off.

Quote:
Originally Posted by ryrynz View Post
I got annoyed enough at the issue of playback not continuing properly (picture freezes and audio continues) when the screen is blanked by Windows power settings that I decided to look into it.
I found that 0.91.11 has no issues with this and 0.92.1 is where this problem begins. At first I thought it was direct3d11 related but it still happens with dxva2 and cb also.
If I switch LAV to software decoding there are no issues with 0.92.9. GTX960 using 388.31 on W10.
Good testing. But annoying test results, because the source code changes between 0.91.11 and 0.92.1 are pretty large.

What is the easiest way to reproduce this problem?

Quote:
Originally Posted by Polopretress View Post
The goal is to be able to select the right lens memory of the projector depending on the aspect ratio of the movie (2.40 or 16:9).

Using a profile group in the "devices" section, "screen config" subsection, it works to automatically switch the projector to the right memory, as long as the JVC X500 projector is connected and recognized by its IP address in madVR's "devices/xx/properties" section.

It does not work with an Epson EH-LS10000.
I've implemented this functionality for Sony and JVC projectors, but not yet for Epson, because I don't have an Epson projector and there didn't seem to be any Epson projector user interested in this feature. If you have an LS10000 and would like to have this feature, I might be willing to implement it, but you'd have to be available for tests, ideally via email, to make things faster/easier. I don't know when I'll find the time for this, though, could be a while.

Quote:
Originally Posted by foozoor View Post
It seems that FSRCNNX is better than NGU Standard/Sharp now.
Is that your personal opinion, after having carefully compared the two algorithms with various different test videos? Or are you just reposting what some random guy on an anonymous forum posted, without providing any comparison screenshots?

Quote:
Originally Posted by Xorp View Post
Is a dedicated graphics card pretty much always required for NGU upscaling? I'm a fan of the Intel NUCs, but would they prevent me from using NGU Sharp to upscale Blu-rays to 4K, with a few other features like debanding? Does a Celeron or i3 with an Nvidia 1060 outperform an i7 NUC with no dedicated GPU?
Current Intel GPUs can't hold a candle to any decent dedicated AMD/Nvidia GPU. The Nvidia 1060 runs circles (lots and lots of them) around any Intel GPU. I don't think any Intel GPU can do NGU Sharp to upscale Blu-Rays to 4K, probably not even close. If you want to use NGU Sharp, it's really recommended to get an Nvidia 1050Ti (or higher/faster), or a comparable AMD GPU.

As mentioned by heiseikiseki, Intel seems to plan to replace their internal GPUs with AMD soon, so maybe Intel NUCs might do the job then. But the GPU chip alone won't be enough, we also need fast VRAM, and ideally 4GB+ for 4K.

Quote:
Originally Posted by cccleaner View Post
I have had no success since driver 385.69 getting 4K HDR content to run in fullscreen with this setup on Windows 10 64-bit with an Nvidia 1070. Windows crashes incredibly hard when going fullscreen; only a forced Task Manager (somehow) brings the OS back alive, otherwise it takes a reset and hardware reboot.
Might be an OS bug that could be fixed in a future update. Maybe disabling fullscreen exclusive mode works around it?

As I keep saying: Guys, stick to Windows 8.1, if you can.
madshi is offline   Reply With Quote
Old 25th November 2017, 14:26   #47298  |  Link
MokrySedeS
I am the one who knocks
 
MokrySedeS's Avatar
 
Join Date: Aug 2009
Posts: 104
Quote:
Originally Posted by madshi View Post
Argh. It's possible, and not hard to do, but madVR would have to scan a much larger area of the video to detect this, which means reduced performance. I'm not sure if it's worth it at this point in time where the detection still runs on the CPU. Maybe when I move the algo to the GPU?
If it's going to hurt performance noticeably then it's probably not worth it.
Although maybe a "trade quality for performance" option would be a solution?
MokrySedeS is offline   Reply With Quote
Old 25th November 2017, 14:41   #47299  |  Link
iSeries
Registered User
 
Join Date: Jan 2009
Posts: 625
This will probably seem like a silly question, but how do I make proper screenshots with madVR/MPC-HC? I've tried the F5 function in MPC-HC, but they come out much darker than the actual video when viewed in e.g. Paint. A screenshot taken using an HDR to SDR 3DLUT is especially obvious: the saved image is much, much darker than the video.
iSeries is offline   Reply With Quote
Old 25th November 2017, 15:03   #47300  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,140
Quote:
Originally Posted by leeperry View Post
Looks great, love it! even with RCA@1 + quad NGU Sharp + SSIM 2D + SR
JFMI: Is that 4x NGU Sharp in one step, followed by SR? Or is it 2X NGU Sharp + SR + 2x NGU Sharp + SR?

Quote:
Originally Posted by MokrySedeS View Post
If it's going to hurt performance noticeably then it's probably not worth it.
Although maybe a "trade quality for performance" option would be a solution?
I think I'd rather delay this until the algo is moved to the GPU.

Quote:
Originally Posted by iSeries View Post
This will probably seem like a silly question, but how do i make proper screenshots with madvr/mpc-hc? I've tried the f5 function in mpc-hc but they are coming out much darker than the actual video when viewed in eg paint. A screenshot taken using an hdr to sdr 3dlut especially obvious, the saved image is much much darker than the video
Screenshots are always done in 0-255 levels, because that's how BMP/PNG files are always stored. If you have your GPU set to 0-255 and your display to 16-235, then you'll get correct results for video playback, but incorrect levels for the desktop and when watching BMP/PNG files.

HDR -> SDR is a complex topic. I'd suggest to first sort out simple SDR screenshots. Only once that's done and properly working, looking at HDR -> SDR screenshots makes sense.

For SDR screenshots, please compare F5 with PrintScreen. Is there a difference? You can configure F5 behaviour in the new madVR "screenshots" settings page.
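The levels mismatch described above can be sketched in a few lines: expanding limited-range (16-235) codes to the full 0-255 range that BMP/PNG viewers assume. This is a hypothetical helper for illustration, not madVR's actual code:

```python
import numpy as np

def limited_to_full(values_8bit):
    """Expand limited-range (16-235) luma codes to full range (0-255).

    A screenshot saved as BMP/PNG is interpreted as full range by desktop
    viewers. If the source pixels were limited range, black (16) looks grey
    and white (235) looks dim unless the levels are expanded first.
    """
    v = values_8bit.astype(np.float64)
    expanded = (v - 16.0) * 255.0 / (235.0 - 16.0)
    return np.clip(np.round(expanded), 0, 255).astype(np.uint8)

limited = np.array([16, 126, 235], dtype=np.uint8)  # black, mid-grey, white
print(limited_to_full(limited))  # -> [  0 128 255]
```

If the screenshot path skips this expansion while the playback path applies it (or the display does), the saved image ends up visibly darker than the video, exactly as reported.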
madshi is offline   Reply With Quote