Old 16th August 2017, 10:09   #44681  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,344
Quote:
Originally Posted by 70MM View Post
Is there anything that can ever revert it back to default again?
Updating drivers sometimes does it - definitely does it if you do a clean install or use DDU.
__________________
LAV Filters - open source ffmpeg based media splitter and decoders
nevcairiel is offline   Reply With Quote
Old 16th August 2017, 10:25   #44682  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
The reason it works with NNEDI3 is that OpenCL forces your GPU into a high power mode, while the "Optimal power" setting and madVR are not good friends.

I don't know if madVR can tell the GPU to use a performance mode, but this is a general problem when using madVR: the default Nvidia power settings don't work properly with madVR, so this should be made very clear for the moment.
huhn is offline   Reply With Quote
Old 16th August 2017, 10:29   #44683  |  Link
Siso
Soul Seeker
 
Siso's Avatar
 
Join Date: Sep 2013
Posts: 711
Quote:
Originally Posted by huhn View Post
The reason it works with NNEDI3 is that OpenCL forces your GPU into a high power mode, while the "Optimal power" setting and madVR are not good friends.

I don't know if madVR can tell the GPU to use a performance mode, but this is a general problem when using madVR: the default Nvidia power settings don't work properly with madVR, so this should be made very clear for the moment.
Well said. In my config I always add the player in the Nvidia Control Panel and choose "Prefer maximum performance". If I select "Adaptive", my rendering times keep changing; with maximum performance they stay low and stable.
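
For anyone who would rather script that control panel change than click through it, here is a rough sketch using NVAPI's driver settings (DRS) interface. The setting and value names (PREFERRED_PSTATE_ID, PREFERRED_PSTATE_PREFER_MAX, PREFERRED_PSTATE_ADAPTIVE) are the ones from NvApiDriverSettings.h as I remember them, so verify them against your NVAPI SDK version; this sketch only touches the global base profile, not a per-application one.

Code:
// Sketch: set the global "Power management mode" via NVAPI DRS.
// Requires the NVAPI SDK (nvapi.h, NvApiDriverSettings.h); link nvapi64.lib.
// Error handling trimmed to the bare minimum.
#include <nvapi.h>
#include <NvApiDriverSettings.h>

bool setPowerManagementMode(NvU32 value)   // e.g. PREFERRED_PSTATE_PREFER_MAX
{
    if (NvAPI_Initialize() != NVAPI_OK)
        return false;

    NvDRSSessionHandle session = 0;
    if (NvAPI_DRS_CreateSession(&session) != NVAPI_OK)
        return false;
    NvAPI_DRS_LoadSettings(session);              // read the current driver settings

    NvDRSProfileHandle profile = 0;
    NvAPI_DRS_GetBaseProfile(session, &profile);  // the "Global Settings" profile

    NVDRS_SETTING setting = {};
    setting.version         = NVDRS_SETTING_VER;
    setting.settingId       = PREFERRED_PSTATE_ID;   // "Power management mode"
    setting.settingType     = NVDRS_DWORD_TYPE;
    setting.u32CurrentValue = value;                 // PREFER_MAX or ADAPTIVE

    bool ok = NvAPI_DRS_SetSetting(session, profile, &setting) == NVAPI_OK
           && NvAPI_DRS_SaveSettings(session) == NVAPI_OK;   // persist the change

    NvAPI_DRS_DestroySession(session);
    NvAPI_Unload();
    return ok;
}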
Siso is offline   Reply With Quote
Old 16th August 2017, 10:40   #44684  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,344
Quote:
Originally Posted by Siso View Post
Well said. In my config I always add the player in the Nvidia Control Panel and choose "Prefer maximum performance". If I select "Adaptive", my rendering times keep changing; with maximum performance they stay low and stable.
Rendering times changing or being stable makes no difference, as long as you don't drop frames. Adaptive is usually just fine, as the GPU then adapts to the load conditions.
__________________
LAV Filters - open source ffmpeg based media splitter and decoders
nevcairiel is offline   Reply With Quote
Old 16th August 2017, 11:38   #44685  |  Link
Siso
Soul Seeker
 
Siso's Avatar
 
Join Date: Sep 2013
Posts: 711
Quote:
Originally Posted by nevcairiel View Post
Rendering times changing or being stable makes no difference, as long as you don't drop frames. Adaptive is usually just fine, as the GPU then adapts to the load conditions.
So Adaptive is the better choice, rather than "Prefer maximum performance"?
Siso is offline   Reply With Quote
Old 16th August 2017, 11:41   #44686  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
Yeah, high performance is a waste of power.
Feel free to use high performance if you run into real trouble with Adaptive.
huhn is offline   Reply With Quote
Old 16th August 2017, 11:41   #44687  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,344
Quote:
Originally Posted by Siso View Post
So Adaptive is the better choice, rather than "Prefer maximum performance"?
It can save on heat output (and perhaps noise) as the GPU would only run as fast as it needs to.
__________________
LAV Filters - open source ffmpeg based media splitter and decoders
nevcairiel is offline   Reply With Quote
Old 16th August 2017, 11:42   #44688  |  Link
Siso
Soul Seeker
 
Siso's Avatar
 
Join Date: Sep 2013
Posts: 711
Quote:
Originally Posted by nevcairiel View Post
It can save on heat output (and perhaps noise) as the GPU would only run as fast as it needs to.
Will give it a try tonight to see if there are any frame drops or frame delays.
Siso is offline   Reply With Quote
Old 16th August 2017, 15:53   #44689  |  Link
dvd1
Registered User
 
Join Date: Aug 2017
Posts: 89
Why do videos with the same profile and the same filters not all play back well?

Last edited by dvd1; 16th August 2017 at 15:55.
dvd1 is offline   Reply With Quote
Old 17th August 2017, 00:16   #44690  |  Link
dvd1
Registered User
 
Join Date: Aug 2017
Posts: 89
With my 2GB GT 730 graphics card, is it worthwhile to use madVR, or is the graphics card too weak to see any improvement?
dvd1 is offline   Reply With Quote
Old 17th August 2017, 14:01   #44691  |  Link
mrmarioman
Registered User
 
mrmarioman's Avatar
 
Join Date: Jul 2017
Posts: 39
Yesterday I downloaded the latest Nvidia drivers, 385.28, and now 4K HDR videos stutter when playing in full screen. It was fine last week... 980 Ti here. Anyone else?
mrmarioman is offline   Reply With Quote
Old 17th August 2017, 14:36   #44692  |  Link
FDisk80
Registered User
 
Join Date: Mar 2005
Location: Israel
Posts: 162
Quote:
Originally Posted by mrmarioman View Post
Yesterday I downloaded the latest Nvidia drivers, 385.28, and now 4K HDR videos stutter when playing in full screen. It was fine last week... 980 Ti here. Anyone else?
In Nvidia Control Panel set "Power management mode" to "Adaptive"

Last edited by FDisk80; 17th August 2017 at 14:39.
FDisk80 is offline   Reply With Quote
Old 17th August 2017, 15:38   #44693  |  Link
mrmarioman
Registered User
 
mrmarioman's Avatar
 
Join Date: Jul 2017
Posts: 39
Quote:
Originally Posted by FDisk80 View Post
In Nvidia Control Panel set "Power management mode" to "Adaptive"
Thanks. That did the trick.

Unfortunately, Nvidia's HDR still looks lame to me. :/
mrmarioman is offline   Reply With Quote
Old 17th August 2017, 16:59   #44694  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,140
Quote:
Originally Posted by Siso View Post
Here is a small sample where the video changes briefly from 2.39:1 to 2.85:1. The option used is "zoom small black bars away". http://www3.zippyshare.com/v/BIJ9EDmS/file.html
Argh, I just wanted to download it, but the link doesn't work anymore...

Quote:
Originally Posted by clsid View Post
madshi, could you add a function to the settings API for resetting settings? Preferably with a path value, so specific subsections can be reset.
I'll add it to the list. I'm not sure right now how difficult it will be to add, though, which directly influences how quickly I'd implement it.

Quote:
Originally Posted by HomeY_ View Post
Also, when I enable HDR on the TV's HDMI port, I get a lot of flickering pixel 'snow', and the TV seems to switch between HDR and standard picture mode. Although the HDMI cable claims it can do 2160p@60Hz, I'm wondering if it's HDR capable. I've got a new cable on its way which should be delivered in a few hours, so hopefully that fixes those issues.
That does sound like either a cable issue, or one of the HDMI ports involved (in or out) is shaky.

I'm usually using extended, but dual monitor setups can be tricky.

Quote:
Originally Posted by ABDO View Post
waifu2x on 2D anime can totally clean up low-bitrate compression artifacts, but it is much, much slower than NGU.
Please take a look at some rough comparison images here:
http://www.mediafire.com/file/zkq3jw...creenshots.rar
I may look into compression artifact removal at some point in the future, but probably not too soon.

Quote:
Originally Posted by TheShadowRunner View Post
A quick word to tell you I'm still looking forward to the test-build madshi (with "* last video frame is now remembered for 2 seconds when stopping graph" disabled or optional, to test if it's the change responsible for the DVD playback crash on XP).
(no hurry, just worried you'll forget ^^)
Yes, yes...

Quote:
Originally Posted by krmit View Post
That's a crash in XySubFilter. Not much I can do about it, unfortunately. You can try a different subtitle renderer.

Quote:
Originally Posted by ogakul View Post
I get dropping and repeating frames if I use custom display modes like 2160p23.
Try optimized custom modes, created by the next madVR build. If that doesn't help, upload screenshots of your Ctrl+J OSD somewhere, so we can look at the stats, maybe that helps figuring out why you get drops and repeats.

Quote:
Originally Posted by jmonier View Post
For me under Win 8.1 64 things are a little different. It switches into HDR fine, INCLUDING under Full Screen Exclusive with the latest (384) Nvidia drivers.

However, it will NEVER switch back to SDR. My LG OLED will switch to SDR when I switch inputs but it will go back to HDR when I switch back to the PC. The only way I can get back to SDR with the PC is to re-boot.
The only thing that makes sense to me is if your media player somehow crashed but is still lurking in the background. madVR automatically restores SDR mode when your media player closes - or when madVR is properly freed. If you can reproduce this problem, close your media player and double check that it's not listed in the task manager anymore. If it still is, terminate it. Does that help?

Quote:
Originally Posted by mv View Post
Can you please comment on how HDR metadata is used when playing an HDR video? Are the HDR mastering primaries/luminance from the video sent to the display, or are some static values used?
The video metadata is sent to the display.

Quote:
Originally Posted by mv View Post
Any chance we'll see HDR support in madTPG in near future?
Yes.

Quote:
Originally Posted by nlnl View Post
Is there any reason to upgrade from Windows 7 to Windows 8.1 for using madVR?
madshi used to say that 8.1 is better than 10. And what about 7?
Windows 8.1 has many advantages over Windows 7:

1) Much better desktop composition implementation.
2) Much better multi monitor handling.
3) APIs for 3D Blu-Ray playback.
4) APIs for Direct3D DXVA decoding & processing (see next madVR build).
5) Faster OS kernel.

Windows 10 is currently not as stable as Windows 8.1. The only advantage that I see with Windows 10 over Windows 8.1 is that fullscreen windowed mode supports 10bit output, if the GPU driver supports it.

Quote:
Originally Posted by njfoses View Post
Has anything changed lately with madTPG?
No.

Quote:
Originally Posted by P.J View Post
It's the latest driver version for Win10 x64. madVR is so resource hungry.
With default settings, it blows up my GTX960 while playing 4K 10bit HDR.
24fps or 60fps content? That's an important question because 60fps requires 2.5x as much GPU power as 24fps, obviously. Anyway, try the next madVR build with a nightly LAV build for improved HEVC/HDR decoding. That might help.

Generally, a blue screen of death indicates a problem with either the hardware or a driver. A simple piece of user-mode software like madVR should never be able to produce a blue screen. Even if madVR drives your CPU and GPU at max level for hours, your PC should still run stable and without a blue screen. It might stutter, though.

Quote:
Originally Posted by DragonQ View Post
Since I swapped my TV for a different model, I can't get the composition rate to match the display's refresh rate when using MadVR (in both MPC-HC and Kodi).
Which is one of the reasons why I recommend upgrading to Windows 8.1. Windows 7 has big problems with desktop composition. You should be able to use FSE mode to work around the problem, though. I'd suggest you use the madVR refresh rate changer instead of the MPC-HC/Kodi one.

Quote:
Originally Posted by edwdevel View Post
I am evaluating (benchmarking) madVR with the lowest Nvidia 10-series GPU, the 1030, and ran into a problem trying to get the DXVA downscaling option to work using a 4K HEVC demo clip, "LG Chess". Here is a screenshot of the clip using bilinear downsizing:

http://jpegshare.net/f1/d5/f1d5dda1f...94679.jpg.html

The same clip, when using DXVA downsizing:

http://jpegshare.net/48/cd/48cd09006...fab88.jpg.html

As you can see, the DXVA clip does show that downsizing occurs, but there seems to be a problem with the YCbCr <-> RGB conversion at output. You can see the madVR downsizing screen on the right, and a GPUShark window below. Perhaps I haven't set something correctly to get this to work, or it may be a bug. I'm hoping someone may know what the problem is at first glance.

I'm under Windows 7, with (near) latest LAV Filters, Nvidia drivers, madVR, and MPC-BE.

(Also, as you can see, using DXVA downsizing is hugely more efficient than even simple bilinear downsizing.)
I'll have to double check this with some non-HDR content. I'm not sure if it's a GPU driver issue, or a madVR bug.

Generally, D3D9 DXVA is rather limited. It doesn't understand HDR, and it might also generally have some trouble with 10bit processing. nevcairiel and I have been working on adding support for D3D11 DXVA, but I haven't implemented D3D11 DXVA scaling yet, and it will also require Windows 8.1 or newer.

Of course testing with 60fps HDR demos is extra hard on the GPU. Real content is more likely to be 24fps, which is 2.5x less demanding on the GPU.

Quote:
Originally Posted by austinminton View Post
I tried using the madVR toggles in the rendering -> stereo 3D tab for 2D/3D content. The problem is that the OS 3D setting does not correctly mirror the Nvidia stereo 3D setting. OS 3D does disable Nvidia 3D, but Nvidia 3D does not get enabled when OS 3D is enabled. This might be new behaviour in Windows 10 (Creators Update).
This sounds like a driver bug to me. I'd suggest that you report this to Nvidia. It used to work with older drivers.

Quote:
Originally Posted by austinminton View Post
I have scripts to enable stereo 3d in nvidia, and I tried using profiles to enable/disable 3d, but it seems running batch files on profile enable is broken?
It's not implemented yet... It's on my to do list...

Quote:
Originally Posted by andybkma View Post
Since there is no "OFF" option (or at least one that I can find) for madVR image downscaling, chroma upscaling & image upscaling, can you please tell me which option(s) for these three sharpening functions would most closely simulate OFF? I am using madVR with a low-power GPU on a laptop, and I am soon going to add a Darbee to the mix because I need to reduce the heavy GPU/CPU load I have been putting on my laptops with a projector. But of course I still want to use madVR because of smooth motion and the various other awesome features, so when I add the Darbee I would like to start from a "normal", non-madVR sharpening baseline (for want of a better term) and increase sharpening slowly, step by step, from that point and see how it goes...
The other users already wrote many comments, but let me (try to) be extra clear here:

You need to very strictly distinguish between 1) sharpening, 2) chroma upscaling, and 3) image up/downscaling, because all the Darbee does is 1). The Darbee does not do 2) or 3), while madVR by default does 2) and 3) but not 1).

Generally, chroma upscaling is always needed (for typical video sources), but it's not very important, so if you lack GPU power, I'd suggest switching chroma upscaling to "Bilinear". Image up/downscaling is very important, but it's only active if the video resolution differs from your TV resolution. If you play a 1080p movie on a 1080p display, no image up/downscaling is used, regardless of madVR settings (let's ignore exotic settings like supersampling).

If you want to save GPU power and do processing externally, you'd have to configure madVR to always output each video in its native resolution, so no image up/downscaling is needed. E.g. you'd switch to a 1080p display mode for 1080p movies, and to a 720p display mode for 720p movies etc. This becomes problematic for SD movies, though, because GPUs often don't support such low resolutions as output modes. Anyway, if you want to go this way, a Darbee will not help. You'll need something which is able to do image up/downscaling. So you could get a Lumagen instead.
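
If someone does go the "output native resolution, let an external scaler do the work" route, the mode switch itself is plain Win32; a minimal sketch (primary display only, integer refresh rates, error handling trimmed):

Code:
// Sketch: switch the primary display to the video's native resolution so an
// external scaler (Lumagen etc.) does the up/downscaling instead of madVR.
#include <windows.h>

bool switchDisplayMode(DWORD width, DWORD height, DWORD hz)
{
    DEVMODE dm = {};
    dm.dmSize = sizeof(dm);
    dm.dmPelsWidth        = width;
    dm.dmPelsHeight       = height;
    dm.dmDisplayFrequency = hz;      // integer Hz only; 23.976 needs a custom mode
    dm.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT | DM_DISPLAYFREQUENCY;

    // CDS_FULLSCREEN = temporary change, reverted when the process exits.
    return ChangeDisplaySettingsEx(NULL, &dm, NULL, CDS_FULLSCREEN, NULL)
        == DISP_CHANGE_SUCCESSFUL;
}

// e.g. switchDisplayMode(1280, 720, 24) before starting a 720p movie.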

Quote:
Originally Posted by Sunset1982 View Post
I created some profiles and rules for madvr. Now I want to create an extra profile for low quality sources, which I can switch to with a keyboard shortcut.

Is there a way to show the active/chosen profile name in madvr's stats window? If not, will it be integrated in a future version?
I think if you press a keyboard shortcut, madVR shows the profile name for 3 seconds. I don't currently plan to add the active profile names in the stats window, because there could be many many such names, and it would make the OSD much larger.

Quote:
Originally Posted by MrNuka View Post
MPC-HC suddenly takes 20+ seconds to open a file; the status is "opening...". It's fine if I don't use madVR as the renderer.
This happened literally overnight.
I already updated my GPU driver, reinstalled madVR, installed an older version, and reset all settings. Could it be my HDD? I've got a GTX 1070 and an i7 6700K.
Something must have changed? Not sure what. I don't recall anybody else reporting this specific issue, so I don't really have a good idea how to debug it. Well, one trick might be to wait until it's stuck for 5 seconds, then press Ctrl+Alt+Shift+Pause/Break. Maybe this will create a freeze report on your desktop? If so, you can upload it somewhere for me to look at.

Quote:
Originally Posted by oddball View Post
I made an inquiry with AMD regarding their HDR switching method. Apparently you CAN use their API for on-the-fly HDR switching.

'Thanks for the email.

We do provide the AMD GPU Services (AGS) library on GPUOpen that ISVs can use to switch HDR displays into HDR mode on-the-fly from their application.

This does not rely on the Windows 10 Creators Update “HDR and advanced color” toggle in Display Settings - we’ve supported this since last year from Windows 7 and onwards with the 16.40+ drivers.

http://gpuopen.com/gaming-product/am...s-ags-library/
Great find - thank you!

I really wonder why this is in AGS instead of ADL. IMHO it belongs in ADL. But anyway, I'm just glad there is such an API. It doesn't matter too much where it is.

Quote:
Originally Posted by GITY6 View Post
For example, once an italicized or bolded subtitle (from an .srt file) appears, the subtitle position is reverted to the active video area, and all subsequent subtitles (regardless of formatting) are positioned there until the application is restarted.

I'm using MPC-HC, and my subtitle renderer is XySubFilter. If anyone has dealt with this issue, advice would be appreciated
It's on my to do list to look into, but it's pretty hard for me to do because it's an issue with XySubFilter...
madshi is offline   Reply With Quote
Old 17th August 2017, 17:04   #44695  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,140
Quote:
Originally Posted by PurpleMan View Post
Can anyone help explain why I'm getting dramatically different performance with kodi-dsplayer+madvr vs mpc-hc+madvr ?

kodi-dsplayer+madvr+lavfilters (external) = >50ms rendering time
mpc-hc+madvr+lavfilters (external) = ~15ms rendering time
In situations like these it's always helpful to see screenshots of the madVR OSD (Ctrl+J) from both players, ideally both with the same window size (e.g. fullscreen). You may have to disable FSE mode to make screenshotting possible in fullscreen state.

Quote:
Originally Posted by BaBydnT View Post
Hi, I recently installed LAV Filters to improve the quality of my anime videos (720p/1080p) and I have these issues:

High GPU load: https://puu.sh/wQioD/ebe35e8db9.png
Dropped frames, about 10/s at 1080p; in just 2 minutes I lose 987 frames: https://puu.sh/wQipg/7d83ecf087.jpg

When I play 720p with chroma set to NGU AA High I drop 3-5 frames/s; I don't get any dropped frames if I use chroma Bicubic 75.

My PC specs: AMD FX8350 - GTX960 4GB - BenQ GL2460 - 8GB RAM 1600

Thanks, and sorry for my English.
Your rendering times are very high at 44.73ms. And this is with a GTX960? Seems weird to me. Are you using some fancy settings that aren't visible in the OSD, e.g. error diffusion dithering?

Quote:
Originally Posted by ryrynz View Post
Madshi, chances of SDR to HDR upconversion?
The whitepaper is MIA, wonder if this can be done in real time, likely not yet.. Maybe some years off.
Not planned for now.

Quote:
Originally Posted by bcec View Post
Anybody having issues with 3D playback? When fullscreen, my 1080@24p frame-packed playback is playing as if contrast is cranked up 1000%. Only chroma upscaling is enabled.

EDIT: Turning off OS level HDR setting solved the issue. For some reason, Full Screen exclusive 3D (1080@24) + OS HDR turned on causes extreme contrast and clipped highlights and shadows with madVR. Maybe a bug?
The OS level HDR setting should always be set to off. It's a bad setting - even for HDR content.

Quote:
Originally Posted by Dorohedoro View Post
I'm using a custom 23.976Hz refresh rate, but with my projector (X5000) I can still see a distracting flicker/judder in bright scenes when there is panning.
FWIW, have you tried a custom 48/1.001Hz or 72/1.001Hz mode? I'm not sure if the JVC supports it, but it might be worth a try...

Quote:
Originally Posted by Oguignant View Post
Hi, I need a little help here ...
This has happened to me several times. If I skip around in the movie many times, the rendering times increase (from 30ms to 75 or 90ms) and never go back to what they were at the beginning (for example, the normal rendering time for this movie is 24ms in fullscreen, and it increases to 90ms or even more than 100ms).

This happens with luma quadrupling set to direct quadruple -> very high for 720p movies:

I tried other versions of madVR and reinstalled the Nvidia drivers; nothing works. The only fix is to restart Windows or disable luma quadrupling. Does anyone have an idea what might be going on? I've already checked the basic things, but I can't find the problem. I'm not sure I'm explaining it well. Any ideas?
Weird. When you have this situation again, can you please not restart Windows, but instead create a debug log and upload it somewhere for me to look at? Ideally create the debug log only in that situation when the problem occurs, and try to keep it short. Thanks.

Btw, have you checked if maybe the GPU clocks are changing? In the Nvidia Control Panel try setting "Power management mode" to "Adaptive".

Quote:
Originally Posted by clsid View Post
What does "reset gpu" do?
It restarts the GPU driver, which is one way to make sure newly added/modified custom resolutions/timings actually become effective.

Quote:
Originally Posted by sauma144 View Post
I thought this would be a one-click and forget setup automatically done by madVR itself. Maybe a video tutorial would be useful for noobs.
Unfortunately a completely automatic solution isn't possible, but I'm trying to make it as easy as possible.

Quote:
Originally Posted by Fullmetal Encoder View Post
Forgive me, but how would this ability help those of us with monitors that can only operate at 60hz? I'm a little worried that this might fry my monitor.
Frying should not occur. Most monitors check for valid ranges and refuse to show a signal they don't like. Also, monitors should communicate their max pixel clock ability via EDID and madVR tries to not exceed that.

There's still a potential gain to be had for 60Hz-only monitors. E.g. my very old computer LCD monitor only supports 60Hz in theory. But VESA DMT asks for monitors to support a 5% pixel clock range. So anything between 57Hz and 63Hz should work. Practically that means you should at least be able to get perfect 59.940Hz and 60.000Hz custom timings for 59fps and 60fps content. That won't help with 24fps content, of course. But still, it's better than nothing.

FWIW, my old LCD monitor even supports a custom 72Hz mode. However, even though it does, it doesn't play 24fps content smoothly. It seems the LCD monitor internally still works at 60Hz. So it's always important to test not only whether you get a proper image, but also whether playback is really smooth.
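
To put numbers on that 5% rule: the refresh rate is just the pixel clock divided by the total (active + blanking) pixel count per frame, so you can check up front whether a wanted rate falls inside a monitor's tolerance window. A small sketch; the 2200x1125 totals are the standard 1080p CEA timing, used purely for illustration:

Code:
// Sketch: refresh rate from pixel clock, plus a +/-5% check around nominal 60 Hz
// (the VESA DMT tolerance mentioned above).
#include <cmath>
#include <cstdio>

double refreshHz(double pixelClockHz, int hTotal, int vTotal)
{
    return pixelClockHz / (double(hTotal) * double(vTotal));
}

bool withinTolerance(double wantedHz, double nominalHz = 60.0, double tol = 0.05)
{
    return std::fabs(wantedHz - nominalHz) <= nominalHz * tol;   // 57..63 for 60 Hz
}

int main()
{
    // Standard 1080p60 CEA-861 timing: 2200 x 1125 total pixels, 148.5 MHz clock.
    double hz60  = refreshHz(148500000.0, 2200, 1125);                   // 60.000
    double hz599 = refreshHz(148500000.0 * 1000.0 / 1001.0, 2200, 1125); // 59.940

    std::printf("%.3f Hz ok: %d\n", hz60,  withinTolerance(hz60));        // yes
    std::printf("%.3f Hz ok: %d\n", hz599, withinTolerance(hz599));       // yes
    std::printf("23.976 Hz ok: %d\n", withinTolerance(24000.0 / 1001.0)); // no
}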

Quote:
Originally Posted by leeperry View Post
I guess next step is an audio renderer?
No.

Quote:
Originally Posted by SweetLow View Post
Intel HD4000, Win 7x64, latest driver (10.18.10.4653)
It's working, but the old driver problem is not solved - there is no way to set different DTDs for 23/24, 29/30 and 59/60. The driver uses only one of them and makes some weird pixel clock calculations.
Yes, these are some of the GPU driver bugs I stumbled over. Obviously my tool can't fix driver bugs. I do have an Intel contact, though, to whom I will report these issues. I *hope* they will get fixed at some time in the future, but I can't promise it, obviously.

Quote:
Originally Posted by SweetLow View Post
What is the source of the pixel clock in the editor for "standard mode, unknown timing details" when the EDID parameters for this resolution and rate are known? In some cases it isn't the EDID pixel clock for either rate of the pair (59/60, for example).
Pixel clocks are never 100% reliable; they vary due to hardware tolerances and temperatures. As mentioned earlier, the VESA DMT standard asks monitors to support a whopping 5% pixel clock tolerance. So madVR interprets EDID pixel clock data loosely, and may make (very small) adjustments to improve video playback. Generally, EDID usually contains a pixel clock for 60, but not for 59. So the pixel clock for 59 must be calculated from the 60 pixel clock. But it's not always this way. E.g. my old LCD monitor rather seems to have 59 timings but no 60 timings in the EDID, which is rather weird.
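
For reference, the pixel clock being talked about here lives in the first two bytes of an 18-byte EDID detailed timing descriptor, little-endian, in units of 10 kHz. A small sketch reading it and deriving the /1.001 sibling the way described above; the descriptor offset and the 1080p totals are just illustrative values:

Code:
// Sketch: read the pixel clock from an EDID detailed timing descriptor (DTD)
// and derive the 59.94 Hz ("/1.001") variant of a 60 Hz mode from it.
#include <cstdint>
#include <cstdio>

// The first DTD of a 128-byte EDID base block starts at offset 54.
double dtdPixelClockHz(const uint8_t* edid, int dtdOffset = 54)
{
    // Bytes 0..1 of the DTD: pixel clock in 10 kHz units, little-endian.
    unsigned raw = edid[dtdOffset] | (edid[dtdOffset + 1] << 8);
    return raw * 10000.0;
}

int main()
{
    uint8_t edid[128] = {};            // placeholder: fill with a real EDID dump
    edid[54] = 0x02; edid[55] = 0x3A;  // 0x3A02 = 14850 -> 148.50 MHz (1080p60)

    double clk60   = dtdPixelClockHz(edid);
    double clk5994 = clk60 * 1000.0 / 1001.0;   // the 59.94 mode isn't stored in EDID

    const double totalPixels = 2200.0 * 1125.0; // 1080p CEA totals, for illustration
    std::printf("60.000 mode: %.4f Hz\n", clk60   / totalPixels);
    std::printf("59.940 mode: %.4f Hz\n", clk5994 / totalPixels);
}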

Quote:
Originally Posted by zapatista View Post
With my current PC specs, and using madVR, should I be able to downscale a 4K (YCbCr) video file to 2K @ 10bit RGB with decent framerates? (I am sending it via HDMI to my TV, which can accept 1080p RGB @ 10 and 12 bit.) Or is my PC's graphics card not able to cope?
Please make a screenshot of your OSD (Ctrl+J) and upload it somewhere (don't attach to this forum) for us to look at. Then maybe we can make some suggestions. Are you using madVR default settings? Or did you change anything?

Quote:
Originally Posted by mparade View Post
Is it possible for madVR to automatically recognize my Full SBS 3D content (3840x1040p, 3840x800p etc.) as 3D in some way?
It's on my to do list. FWIW, do those MKV files have a 3D marker in the MKV header somewhere? Or am I supposed to guess the 3D based on the resolution alone?

Quote:
Originally Posted by mrmarioman View Post
I'm using MPC-HC with the latest MadVR. I have a 980Ti, so I'm using Nvidia's HDR API.
I disabled HDR on Windows 10 and FSE mode in madVR. And it works, kinda. The TV detects the HDR signal and switches dynamically between HDR and SDR when opening/closing MPC-HC. But the quality of the HDR is underwhelming -- it looks better than SDR but not close to what I see when using the TV's own player. I have my PC set to 10 bit 4:2:2 and MPC-HC as well. Not sure if I'm missing something.
You should set your GPU control panel to RGB Full/0-255. Then in madVR you can choose 0-255 vs 16-235. Maybe that helps? Generally, in passthrough mode playback quality should be more or less identical to a standalone UHD Blu-Ray player. There will be some differences, due to chroma upscaling, and sending images in RGB instead of YCbCr, but the visible differences should be small.
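
On the 0-255 vs 16-235 point, the conversion itself is just a linear stretch: full = (limited - 16) * 255 / 219 for 8-bit, and the inverse going the other way. A tiny sketch only to show why having the expansion done twice (GPU and renderer) or not at all ruins black levels:

Code:
// Sketch: 8-bit limited-range (16-235) <-> full-range (0-255) conversion.
// If this step happens twice, or not at all, blacks get crushed or washed out.
#include <algorithm>
#include <cstdio>

int limitedToFull(int y) { return std::clamp((y - 16) * 255 / 219, 0, 255); }
int fullToLimited(int y) { return std::clamp(y * 219 / 255 + 16, 16, 235); }

int main()
{
    std::printf("limited 16/235 -> full %d/%d\n",
                limitedToFull(16), limitedToFull(235));        // 0 / 255
    // Expansion applied twice (GPU *and* renderer): near-black detail collapses.
    std::printf("limited 30 expanded twice -> %d (shadow detail crushed)\n",
                limitedToFull(limitedToFull(30)));             // 30 -> 16 -> 0
}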

Quote:
Originally Posted by leeperry View Post
Actually, deringer removes so many artifacts from the original picture that the "add grain" option does come in very handy, but even a strength of 1 is already pretty strong. Any chance you could halve the strength of every step and go up to 8 instead of 4? I don't see the current 4 being useful for anything IME; 0.5 or 0.66 would hit the spot.
I'm working on some pretty big changes atm, so there's really no time for such small tweaks right now.

Quote:
Originally Posted by Manni View Post
- It only accepted the EDID values for three rates (23, 25 and 50). It rejected all the others (error saying GPU didn't accept for unknown reasons). That's not too bad for me, as 99% of my content is Bluray/PAL DVD, but would be a problem for people with lots of NTSC or TV content.
Currently, my Nvidia driver rejects virtually everything. Very annoying. There's nothing I can do about it; it's a driver issue. I've reported it to an Nvidia engineer, though, so hopefully he'll get it fixed in a future driver version.

Quote:
Originally Posted by Manni View Post
I tested before/after with 23p (98% of my content) and the improvement is significant. I go from one frame repeated every 5 minutes to one frame dropped every 50 minutes.
I'm glad to hear that. But to be honest, there wasn't even supposed to be any improvement at all! The first step of the whole process (which you just took) is only meant to define a known set of timing parameters. The improvement is supposed to come after actually measuring video playback based on the new timings, and then creating an optimized new timing set from those measurements.

Basically what this means is that Nvidia's interpretation of the EDID information is pretty bad (IMHO).
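
The intervals Manni quotes map directly onto how far the display clock sits from the video clock: roughly one dropped or repeated frame every 1 / |f_display - f_video| seconds. A small sketch with made-up display clocks, just to show the scale involved:

Code:
// Sketch: how a tiny refresh rate error turns into drop/repeat intervals.
// One frame is dropped or repeated about every 1 / |f_display - f_video| seconds.
#include <cmath>
#include <cstdio>

double secondsPerDropOrRepeat(double displayHz, double videoFps)
{
    return 1.0 / std::fabs(displayHz - videoFps);
}

int main()
{
    const double videoFps = 24000.0 / 1001.0;   // 23.976... fps

    // Illustrative display clocks (not measured on any real GPU):
    double coarse    = secondsPerDropOrRepeat(23.97918, videoFps);   // ~5 min
    double better    = secondsPerDropOrRepeat(23.97636, videoFps);   // ~50 min
    double optimized = secondsPerDropOrRepeat(23.976032, videoFps);  // ~1.4 days

    std::printf("%.1f min, %.1f min, %.1f hours\n",
                coarse / 60.0, better / 60.0, optimized / 3600.0);
}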

Quote:
Originally Posted by sauma144 View Post
If automating custom resolution creation and optimization is possible, why not do the same for everything else, with a global "let madVR decide" based on the current GPU specs and whether a laptop is on battery or not?
Because that's not easy to do, and my day only has 24 hours.

Quote:
Originally Posted by foozoor View Post
Hello madshi, I am wondering if you have checked on the mpv scene these days.

There have been a lot of huge improvements:

- NNEDI3 shaders based on the MPDN ones are much faster than madVR's.
When MPDN introduced those, they were sometimes faster and sometimes slower, depending on GPU model and neuron count, so I didn't bother implementing them yet. Maybe this changed now and newer GPU models might prefer the MPDN shaders? I might look into this in the future.

Quote:
Originally Posted by foozoor View Post
Igv's single-pass AdaptiveSharpen variant is really better and faster than madVR's.
I had a quick look at Igv's AdaptiveSharpen variant a couple weeks ago and liked it, but I planned to do some deeper comparisons before deciding to switch to it.

Quote:
Originally Posted by foozoor View Post
Igv's SSimSuperRes variant, based on the MPDN one, totally destroys the SuperRes upscale enhancer in madVR.
Hmmm... When Shiandow introduced SSimSuperRes, I compared it to my SuperRes algorithm, and I clearly preferred mine. It's possible that Shiandow's latest SSimSuperRes algo has improved noticeably, maybe also Igv did some further improvements. I'll double check that. But I will check with real world content before making any decisions. Last time I checked, SSimSuperRes looked better in artificial test patterns and in some images, but it produced clearly more bloated images, which is a look I totally hate. My SuperRes variant rings a bit more than SSimSuperRes, but it doesn't bloat. So I will have to compare carefully before I make up my mind.

Quote:
Originally Posted by foozoor View Post
Bjin's RAVU prescaler/doubler is easily 4x faster (test/ravu-r3-smoothtest1.hook) than NGU-AA High with same/better result.
I'm occasionally browsing the mpv threads. In those I've seen a grand total of one image where RAVU looked better than NGU-AA, and I'm not even sure if that one image was exactly the same frame. Generally, it seems RAVU was heavily optimized for low-res, low-quality anime content, which of course means it shines there. NGU-AA is not optimized for any specific kind of content, so it might not look as good with this one very specific type of content, but it looks quite nice with all sorts of content. I've seen some RAVU images which seem to suggest that RAVU has rather high ringing artifacts with "normal" content. So I'm not convinced yet that RAVU is overall a competitor to NGU-AA and NNEDI3. But we'll see. I'm not opposed to adding RAVU to madVR (if the license allows it), provided it adds real value. Right now I wonder if it wouldn't make more sense to create an NGU variant which is optimized for the same type of content RAVU was optimized for, to make it a fair comparison.

NGU and NGU-AA "High" and "Very High" quality levels are not fully optimized for speed yet. There's some potential to make them run faster. I don't expect major performance improvements for "Low" and "Medium" quality, though. Except maybe for Polaris which seems to generally dislike NGU for some reason. Maybe I can workaround that, but I'm not sure.

Quote:
Originally Posted by Dorohedoro View Post
My god, this is a dream come true, please show me a donation link, I mean it.
Quote:
Originally Posted by cork_OS View Post
Maybe if madshi would accept donations for madVR, he could devote more time to its development.
Thanks for the kind offer, guys! As I mentioned before, I want to wait for madVR v1.0 before accepting any kind of donations. I'd be thankful if you could remember your willingness to donate at that time...
madshi is offline   Reply With Quote
Old 17th August 2017, 18:04   #44696  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,140
madVR v0.92.0 released

http://madshi.net/madVR.zip

Code:
* added new "display modes" -> "custom modes" settings tab
* added support for native D3D11 DXVA hardware decoding (needs nightly LAV)
* added support for outputting 10bit in fullscreen windowed mode (win10)
* added optimized "let madVR decide" HDR configuration option
* added support for AMD's private HDR switching API
* added workaround to make Nvidia's private HDR switching API work better
* added full-size EDID block reading (256 bytes instead of just 128)
* added extended EDID parsing
* improved frame drop/repeat estimates for xx/1.001 hz modes
* fixed: deinterlacing of P010 software decoded videos was broken
1) new "custom modes" feature:

This feature allows you to create custom modes and timing overrides with Nvidia, AMD and Intel GPUs. I'm using private APIs from all 3 GPU manufacturers, so we don't have to use EDID overrides. This feature is meant to replace Reclock (or similar other audio renderers). Which means: Perfectly smooth video playback, without audio quality loss (due to resampling)! Unfortunately, the GPU drivers from all 3 GPU manufacturers have bugs in this functionality area, so it's not all working perfectly yet. Hopefully, drivers will improve in the future.

Thanks to @hannes69 for his help in developing this feature.

Here's a custom mode tutorial:

http://madvr.com/crt/CustomResTutorial.html

2) native D3D11 DXVA hardware decoding:

This is a new feature nevcairiel (thanks!) and I have been working on. Basically it "replaces" the old native D3D9 DXVA hardware decoding with a new D3D11 solution. (Well, D3D9 DXVA is still available.)

- image quality is now always identical to software decoding (unlike D3D9 DXVA)
- now AMD 10bit HEVC hardware decoding works, too
- might be ever so slightly faster (with potential for future improvements)
- self-made decoder<->renderer interface should be more stable
- probably requires Windows 8.1 or Windows 10
- I've not implemented DXVA scaling yet
- I've not implemented DXVA deinterlacing yet

Software decoding still has a small advantage, though: The madVR features "forced film mode" and "zoom control" currently only work with software decoding (or copyback).
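
If you're curious what the new D3D11 path can even ask your driver for, enumerating the decoder profiles is only a few lines of the standard D3D11 video API; a sketch with no error handling (which GUIDs show up depends entirely on GPU and driver):

Code:
// Sketch: list the D3D11 DXVA decoder profiles the driver exposes.
// HEVC Main10 only shows up here if GPU + driver actually support it.
#include <d3d11.h>
#include <cstdio>
#pragma comment(lib, "d3d11.lib")

int main()
{
    ID3D11Device* device = nullptr;
    D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                      nullptr, 0, D3D11_SDK_VERSION, &device, nullptr, nullptr);

    ID3D11VideoDevice* videoDevice = nullptr;
    device->QueryInterface(__uuidof(ID3D11VideoDevice), (void**)&videoDevice);

    UINT count = videoDevice->GetVideoDecoderProfileCount();
    for (UINT i = 0; i < count; i++)
    {
        GUID profile;
        videoDevice->GetVideoDecoderProfile(i, &profile);
        std::printf("decoder profile %u: %08lX-...\n", i, (unsigned long)profile.Data1);
    }

    videoDevice->Release();
    device->Release();
}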

3) 10bit in fullscreen windowed mode

Microsoft told me that if video playback (or games) run in borderless fullscreen, DWM is bypassed in Windows 10. Microsoft calls this "direct scanout". This might be almost as good as true FSE mode, without the typical FSE disadvantages. During "direct scanout", 10bit is supported even in windowed mode, even if "HDR and Advanced Color" is turned off.

madVR 0.92.0 now automatically switches between 8bit and 10bit in windowed playback, depending on whether playback is borderless fullscreen or not. This feature requires Windows 10.
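
For context, the setup that makes this "direct scanout" possible is a normal DXGI flip-model swap chain with a 10-bit back buffer; a trimmed sketch of that part (this is generic D3D11/DXGI code, not madVR's actual implementation, and it assumes you already have a device and an HWND):

Code:
// Sketch: create a 10-bit flip-model swap chain, the kind of setup that lets
// Windows 10 hand a borderless-fullscreen window directly to the display
// ("direct scanout") and thereby pass 10 bit through even in windowed mode.
#include <d3d11.h>
#include <dxgi1_2.h>
#pragma comment(lib, "dxgi.lib")

IDXGISwapChain1* create10BitSwapChain(ID3D11Device* device, HWND hwnd)
{
    IDXGIFactory2* factory = nullptr;
    CreateDXGIFactory1(__uuidof(IDXGIFactory2), (void**)&factory);

    DXGI_SWAP_CHAIN_DESC1 desc = {};
    desc.Format           = DXGI_FORMAT_R10G10B10A2_UNORM;   // 10 bit per channel
    desc.SampleDesc.Count = 1;
    desc.BufferUsage      = DXGI_USAGE_RENDER_TARGET_OUTPUT;
    desc.BufferCount      = 2;
    desc.SwapEffect       = DXGI_SWAP_EFFECT_FLIP_DISCARD;   // flip model (Win10)

    IDXGISwapChain1* swapChain = nullptr;
    factory->CreateSwapChainForHwnd(device, hwnd, &desc, nullptr, nullptr, &swapChain);
    factory->Release();
    return swapChain;   // caller releases; nullptr on failure (error handling trimmed)
}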

4) "let madVR decide" HDR configuration

The new default HDR configuration is to "let madVR decide". Practically this means:

- If "HDR and Advanced Color" is on, your TV will always be driven in HDR mode. Consequently, madVR leaves HDR content untouched and just passes the HDR metadata to the display. SDR content will be upconverted by the OS/GPU to HDR. I don't recommend to turn "HDR and Advanced Color" on - ever!
- If "HDR and Advanced Color" is off, but if your GPU drivers and display support HDR, then madVR will use private GPU APIs (AMD and Nvidia only) to switch your TV to HDR dynamically.
- In all other situations, madVR will convert HDR content to SDR, using reasonable settings which aren't too demanding but still produce fair image quality.

Basically, the "let madVR decide" option should always produce correct looking SDR and HDR images, and tends to prefer HDR passthrough, where possible.

5) support for AMD's private HDR switching API


AMD's private HDR switching API seems to work well, but it's more limited compared to Nvidia's API. AMD's API only works if there's active fullscreen D3D11 video playback (or gaming), which also has to be 10bit! Otherwise, the AMD API will do nothing. Consequently, madVR will switch to passthrough only in the moment when you go fullscreen, and only if you configured madVR to use 10bit D3D11.

Nvidia's API also works in windowed mode, and also with 8bit D3D11, even with D3D9.

Besides, there are 2 new DLLs now: "amd_ags_86/64.dll". These are needed for AMD's HDR dynamic switching. If you don't need this feature, you can delete the DLLs.
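
For the curious, the AGS entry point involved looks roughly like the sketch below. The function, struct and field names are recalled from the public amd_ags.h on GPUOpen and have changed between AGS releases, so treat every identifier here as an assumption and check it against the header you actually build with.

Code:
// Very rough sketch of switching a display into HDR10 via AMD AGS (GPUOpen).
// All identifiers recalled from amd_ags.h and may differ in your AGS version.
#include <amd_ags.h>

bool enableHdr10(int deviceIndex, int displayIndex, double maxNits, double minNits)
{
    AGSContext* context = nullptr;
    if (agsInit(&context, nullptr, nullptr) != AGS_SUCCESS)
        return false;

    AGSDisplaySettings settings = {};
    settings.mode         = AGSDisplaySettings::Mode_HDR10_PQ;  // HDR10 / ST.2084
    settings.maxLuminance = maxNits;   // mastering metadata from the video
    settings.minLuminance = minNits;
    // (chromaticity fields omitted; they'd be filled from the mastering metadata too)

    bool ok = agsSetDisplayMode(context, deviceIndex, displayIndex, &settings)
           == AGS_SUCCESS;

    // In a real player you'd keep the context alive for the whole playback session.
    agsDeInit(context);
    return ok;
}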

Last edited by madshi; 17th August 2017 at 22:20.
madshi is offline   Reply With Quote
Old 17th August 2017, 18:08   #44697  |  Link
dvd1
Registered User
 
Join Date: Aug 2017
Posts: 89
Quote:
Originally Posted by dvd1 View Post
With my 2GB GT 730 graphics card, is it worthwhile to use madVR, or is the graphics card too weak to see any improvement?
Does nobody have any idea?
Thank you.
dvd1 is offline   Reply With Quote
Old 17th August 2017, 18:09   #44698  |  Link
Manni
Registered User
 
Join Date: Jul 2014
Posts: 942
Quote:
Originally Posted by madshi View Post
Currently, my Nvidia driver rejects virtually everything. Very annoying. There's nothing I can do about it; it's a driver issue. I've reported it to an Nvidia engineer, though, so hopefully he'll get it fixed in a future driver version.

I'm glad to hear that. But to be honest, there wasn't even supposed to be any improvement at all! The first step of the whole process (which you just took) is only meant to define a known set of timing parameters. The improvement is supposed to come after actually measuring video playback based on the new timings, and then creating an optimized new timing set from those measurements.

Basically what this means is that Nvidia's interpretation of the EDID information is pretty bad (IMHO).
Glad to hear I got lucky for once!

It's entirely repeatable, though. I did some tests with 378.92 and 385.28 (clean install), and every time with standard resolutions I get 3-5 minutes before a repeat, while after simply applying the EDID values I get up to 2+ hours before a drop.

I was wondering why the custom resolutions in the tool are for 4K/UHD only. Although I play 3D at 23p as well, the benefit from the 4K/UHD@23 custom resolution doesn't carry over, and I get a repeated frame every 3 minutes in 3D. It would be great if you could add 1080p resolutions too, so that we could reap the benefits of these custom res/timings in 3D as well.

Apart from 3D settings that aren't sticky (madVR doesn't always switch to 1080p23FP from 4K23 because either the OS or the driver 3D is disabled, so it ends up in 1080p23/2D), everything works pretty well and I find the upgrade from the HD7870 to the 1080 Ti to be very beneficial, especially for MadVR.
__________________
Win11 Pro x64 b23H2
Ryzen 5950X@4.5Ghz 32Gb@3600 Zotac 3090 24Gb 551.33
madVR/LAV/jRiver/MyMovies/CMC
Denon X8500HA>HD Fury VRRoom>TCL 55C805K

Last edited by Manni; 17th August 2017 at 18:29.
Manni is offline   Reply With Quote
Old 17th August 2017, 18:19   #44699  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,140
Quote:
Originally Posted by dvd1 View Post
With my 2GB GT 730 graphics card, is it worthwhile to use madVR, or is the graphics card too weak to see any improvement?
The GT 730 is rather old and only has DDR3 memory. It seems like a fairly slow GPU, by today's standards. madVR might still run smoothly, maybe you need to tune down some settings a bit, but it depends a lot on which movie resolution and frame rate we're talking about, whether you need upscaling or not etc.

Quote:
Originally Posted by Manni View Post
Glad to hear I got lucky for once!

It's entirely repeatable, though. I did some tests with 378.92 and 385.28 (clean install), and every time with standard resolutions I get 3-5 minutes before a repeat, while after simply applying the EDID values I get up to 2+ hours before a drop.
I'm not sure what Nvidia is doing. Maybe they follow the EDID values to the letter, while madVR is recalculating the pixel clock to the most accurate value possible.

Anyway, with v0.92.0 you can try actually measuring and optimizing the timings even further. That should easily get you into the "one frame drop/repeat every 1.x days" range, at least. Tutorial to follow soon.

Quote:
Originally Posted by Manni View Post
I was wondering why the custom resolutions in the tool are for 4K/UHD only.
Have you tried unchecking the "show native res modes, only" checkbox?
madshi is offline   Reply With Quote
Old 17th August 2017, 18:24   #44700  |  Link
MS-DOS
Registered User
 
Join Date: Sep 2012
Posts: 77
Quote:
Originally Posted by madshi View Post
madVR v0.92.0 released
I'm getting an error message when starting any video now, and then MPC closes. Also, I'm getting two identical "display modes" items under devices -> *my monitor*; changing anything and applying there causes the madVR settings to hang permanently. I'm using the latest nightlies of MPC-HC and LAV.
MS-DOS is offline   Reply With Quote