Old 12th July 2015, 19:55   #31701  |  Link
aufkrawall
Registered User
 
Join Date: Dec 2011
Posts: 1,812
Mostly agreed.
I really like the effect of adaptive sharpen (AS) as an upscaling refinement with low values (e.g. 0.2) for NNEDI3. Everything is still well reconstructed, but much of that softness disappears, especially if you sharpen after each 2x upscaling step. Thanks to its adaptive nature, it doesn't oversharpen already-sharpened areas, and very little aliasing is introduced.
Maybe this works a bit better with naturalistic content than with artificial content, due to the thickening of black lines. Without sharpening, though, those lines are very soft with NNEDI3.
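To illustrate the basic idea of "adaptive" sharpening for anyone who hasn't tried it: the unsharp mask strength gets scaled down where the image is already edgy. The following is only a rough single-plane C++ sketch of that principle, with made-up constants; it is not AdaptiveSharpen's actual HLSL shader, which uses a more elaborate edge metric.

Code:
#include <algorithm>
#include <cmath>
#include <vector>

// Sketch: unsharp masking whose strength adapts to local edge energy.
// Flat areas get the full sharpen amount; strong (already sharp) edges
// get progressively less, which also keeps aliasing and halos down.
void adaptiveSharpen(std::vector<float>& luma, int w, int h, float strength /* e.g. 0.2f */)
{
    const std::vector<float> src = luma;   // read from an unmodified copy
    auto at = [&](int x, int y) {
        return src[std::clamp(y, 0, h - 1) * w + std::clamp(x, 0, w - 1)];
    };
    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x) {
            float c      = at(x, y);
            float blur   = (at(x - 1, y) + at(x + 1, y) + at(x, y - 1) + at(x, y + 1) + c) / 5.0f;
            float detail = c - blur;       // high-pass term of the unsharp mask
            float gx     = at(x + 1, y) - at(x - 1, y);
            float gy     = at(x, y + 1) - at(x, y - 1);
            float edge   = std::sqrt(gx * gx + gy * gy);
            float adapt  = strength / (1.0f + 4.0f * edge); // 4.0f is an illustrative falloff
            luma[y * w + x] = std::clamp(c + adapt * detail, 0.0f, 1.0f);
        }
}

Applied with a low strength after each 2x doubling step, the edge term is what keeps already sharpened areas from being boosted a second time.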
aufkrawall is offline   Reply With Quote
Old 12th July 2015, 23:33   #31702  |  Link
XMonarchY
Guest
 
Posts: n/a
I still get a ton of presentation glitches on my 120Hz monitor, even if I set it to 60Hz. The exact same settings work perfectly fine on my TV @ 23Hz. I don't get it. Average rendering times are identical on both TV and monitor, about 27-28ms (same as the max stats). I can reduce presentation glitches on the 120Hz monitor by enabling "present a frame for every VSync", but they still occur (about 1 every 10 seconds). However, enabling this option during playback on my TV @ 23Hz creates dropped frames and presentation glitches. Setting "present several frames in advance" to 1 also helped to reduce presentation glitches (1 every 15-30 seconds). Using less GPU-taxing settings made no difference.

The second issue, on my TV, is that at times when I pause playback and resume it, I get severe stuttering, which I can fix by switching from fullscreen exclusive mode to windowed mode and then going back to fullscreen exclusive mode. That does not happen on my 120Hz monitor, though...

I reset the madVR settings to default, uninstalled it as admin, restarted, installed madVR as admin, reset the settings back to default, and applied the settings I use. That did not help.

EDIT: Could these issues be because I set "Maximum Pre-Rendered Frames" to 1 in the NVIDIA Control Panel? I set it that way because it reduces stuttering associated with using Borderless Window mode in PC games.

Last edited by XMonarchY; 12th July 2015 at 23:45.
  Reply With Quote
Old 12th July 2015, 23:42   #31703  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,336
High refresh rate screens can be a bit fiddly. You may need to increase the queue sizes quite a bit to accommodate the higher refresh rate, which tends to help me reduce the glitches, and "present a frame for every VSync" is pretty much mandatory.
Personally, it's not like I actually see the glitches; they just pile up on the counter.

PS:
There will never be settings that work 100% perfectly on every system, so having to change a setting here or there depending on whether you're playing 23.976 on a 120Hz screen or playing it 1:1 on a 23p TV is not uncommon.
__________________
LAV Filters - open source ffmpeg based media splitter and decoders
nevcairiel is offline   Reply With Quote
Old 13th July 2015, 00:18   #31704  |  Link
kasper93
MPC-HC Developer
 
Join Date: May 2010
Location: Poland
Posts: 586
@madshi: Something weird is going on. If I make the MPC-HC window size equal to the screen size, madVR goes into exclusive mode and positions the window in the center, even though it wasn't centered before. If I open the context menu and disable exclusive mode, the window goes back to its original place. The seek bar also doesn't work in this "exclusive mode"; it is probably not reported correctly.

To reproduce, open any file in MPC-HC and stretch the window to the display size (not the work area size). Basically, this means making MPC-HC maximum size; to do that you need to manually grab the window borders and stretch it.

I will look into the details tomorrow. But it looks like madVR has "if (windowSize == displaySize) GoToRetardedExclusiveMode();" logic.
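If it really is just a size comparison, something along these lines would be less fragile; this is only a Win32 sketch of a more conservative heuristic, not madVR's actual code:

Code:
#include <windows.h>

// Sketch: treat a window as "fullscreen" only if it is borderless AND its
// window rect exactly covers the monitor it sits on. A resizable player
// window that merely happens to match the display size keeps its caption
// and frame, so the style test filters the case described above out.
bool LooksLikeFullscreen(HWND hwnd)
{
    HMONITOR mon = MonitorFromWindow(hwnd, MONITOR_DEFAULTTONEAREST);
    MONITORINFO mi = { sizeof(mi) };
    if (!GetMonitorInfo(mon, &mi))
        return false;
    RECT wr;
    GetWindowRect(hwnd, &wr);
    bool coversMonitor = EqualRect(&wr, &mi.rcMonitor) != FALSE;
    LONG_PTR style = GetWindowLongPtr(hwnd, GWL_STYLE);
    bool borderless = (style & (WS_CAPTION | WS_THICKFRAME)) == 0;
    return coversMonitor && borderless;
}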
kasper93 is offline   Reply With Quote
Old 13th July 2015, 00:24   #31705  |  Link
XMonarchY
Guest
 
Posts: n/a
Quote:
Originally Posted by nevcairiel View Post
High refresh rate screens can be a bit fiddly. You may need to increase the queue sizes quite a bit to accommodate the higher refresh rate, which tends to help me reduce the glitches, and "present a frame for every VSync" is pretty much mandatory.
Personally, it's not like I actually see the glitches; they just pile up on the counter.

PS:
There will never be settings that work 100% perfectly on every system, so having to change a setting here or there depending on whether you're playing 23.976 on a 120Hz screen or playing it 1:1 on a 23p TV is not uncommon.
Erm, but enabling "present several frames in advance" and setting "how many video frames shall be presented in advance" to 1 helped a lot. Anything higher than 1 and I get more frequent presentation glitches.
  Reply With Quote
Old 13th July 2015, 00:31   #31706  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,336
Quote:
Originally Posted by XMonarchY View Post
Erm, but enabling "present several frames in advance" and setting "how many video frames shall be presented in advance" to 1 helped a lot. Anything higher than 1 and I get more frequent presentation glitches.
Don't set pre-rendered frames in the NVIDIA control panel then; it messes madVR up quite considerably.
Luckily the NVIDIA panel has per-app settings!
__________________
LAV Filters - open source ffmpeg based media splitter and decoders
nevcairiel is offline   Reply With Quote
Old 13th July 2015, 00:42   #31707  |  Link
Magik Mark
Registered User
 
Join Date: Dec 2014
Posts: 666
Quote:
Originally Posted by nevcairiel View Post
Don't set pre-rendered frames in the NVIDIA control panel then; it messes madVR up quite considerably.
Luckily the NVIDIA panel has per-app settings!
Thanks for this tip. Are there any other settings in the NVIDIA control panel that would degrade the performance of madVR or other renderers? Is it better to just tick the "Let the 3D application decide" option?
Magik Mark is offline   Reply With Quote
Old 13th July 2015, 01:05   #31708  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,889
Quote:
Originally Posted by Magik Mark View Post
Thanks for this tip. Are there any other settings in the NVIDIA control panel that would degrade the performance of madVR or other renderers? Is it better to just tick the "Let the 3D application decide" option?
Just leave them at default and you're fine.
huhn is offline   Reply With Quote
Old 13th July 2015, 01:05   #31709  |  Link
aufkrawall
Registered User
 
Join Date: Dec 2011
Posts: 1,812
VSync, the pre-render limit, power management mode and Optimus settings are the only driver settings that should affect madVR. Everything should be kept at the driver default (app-controlled) in the profile of your media player.
If you change settings globally, you have to set them back for each profile.
aufkrawall is offline   Reply With Quote
Old 13th July 2015, 03:10   #31710  |  Link
Anime Viewer
Troubleshooter
 
Anime Viewer's Avatar
 
Join Date: Feb 2014
Posts: 339
Quote:
Originally Posted by leeperry View Post
If you were using it in combination with any kind of ED, run a gray ramp and you'll be both horrified and happy to have it unchecked now
I realized why I had it checked, and it wasn't because of the render rate stats, but the present stats. With "don't use linear light for dithering" checked and the video full screen, my present times are in the 3ms area, but with it unchecked and the video full screen, they are in the 11ms area.

Edit: Windowed Full Screen (D3D11) runs with the 11ms presents, but if I run in D3D11 Exclusive (10 bit), it's back down to 3ms again. Edit #2: Unchecking "present a frame for every VSync" also seems to be a way to bring it back down to 3ms while in (D3D11) Windowed Full Screen mode.
__________________
System specs: Sager NP9150 SE with i7-3630QM 2.40GHz, 16 GB RAM, 64-bit Windows 10 Pro, NVidia GTX 680M/Intel 4000 HD optimus dual GPU system. Video viewed on LG notebook screen and LG 3D passive TV.

Last edited by Anime Viewer; 13th July 2015 at 03:30. Reason: added details about D3D11 mode effects
Anime Viewer is offline   Reply With Quote
Old 13th July 2015, 03:46   #31711  |  Link
6ari8
Registered User
 
Join Date: Jun 2012
Posts: 43
Quote:
Originally Posted by Anime Viewer View Post
I realized why I had it checked, and it wasn't because of the render rate stats, but the present stats. With "don't use linear light for dithering" checked and the video full screen, my present times are in the 3ms area, but with it unchecked and the video full screen, they are in the 11ms area.

Edit: Windowed Full Screen (D3D11) runs with the 11ms presents, but if I run in D3D11 Exclusive (10 bit), it's back down to 3ms again. Edit #2: Unchecking "present a frame for every VSync" also seems to be a way to bring it back down to 3ms while in (D3D11) Windowed Full Screen mode.
Are you using an HDMI port? I think you can probably bring the present stats down to below 1ms by using DisplayPort (though that depends on the laptop). You can see which port is connected to the discrete GPU in NVIDIA's control panel. Mine has the mDP only, but there are different cases for laptops like these:


6ari8 is offline   Reply With Quote
Old 13th July 2015, 03:48   #31712  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,127
Today, I attempted to embrace image sharpening as a tool to improve an image rather than enhance its artifacts. I found the following settings beneficial in improving the quality of high-definition video:

1080p -> 1080p
Image Enhancements - FineSharp (strength 0.6)

720p -> 1080p
Upscaling Refinement - SuperRes (medium: strength=1.0; passes=1)

SuperRes is my favourite of the two. Its ringing is not obvious or bothersome compared to other sharpeners. I'm curious whether the luma and chroma are doubled together, or just the luma, as with the other shaders?

An AR filter for FineSharp would be great.

Last edited by Warner306; 13th July 2015 at 07:34.
Warner306 is offline   Reply With Quote
Old 13th July 2015, 05:10   #31713  |  Link
Anime Viewer
Troubleshooter
 
Anime Viewer's Avatar
 
Join Date: Feb 2014
Posts: 339
Quote:
Originally Posted by 6ari8 View Post
Are you using an HDMI port? I think you can probably bring the present stats down to below 1ms by using DisplayPort (though that depends on the laptop). You can see which port is connected to the discrete GPU in NVIDIA's control panel. Mine has the mDP only, but there are different cases for laptops like these:
Yes, I have an HDMI port that I connect my TV to, but both the TV and the laptop screen are shown as going through the Intel GPU, like in the second image you posted. Here is the PhysX display diagram on my system:
__________________
System specs: Sager NP9150 SE with i7-3630QM 2.40GHz, 16 GB RAM, 64-bit Windows 10 Pro, NVidia GTX 680M/Intel 4000 HD optimus dual GPU system. Video viewed on LG notebook screen and LG 3D passive TV.
Anime Viewer is offline   Reply With Quote
Old 13th July 2015, 05:42   #31714  |  Link
Nachbar
Registered User
 
Join Date: Jun 2012
Posts: 33
Does madVR support 36-bit deep color output, or does it cap out at 30-bit?

I ask because I believe my receiver is taking the 30-bit output and down-converting it to 24-bit. It is a Pioneer VSX-1021-K, whose specs say it supports 36-bit deep color, with no mention of anything else.

If I connect my computer directly to my TV, everything is fine and the test picture I am viewing shows no banding, but if I go through the receiver, it shows banding.

I have made sure the display properties in madVR are set to 10-bit and higher, and made sure in Catalyst Control Center that the display is set to 12-bit.

The manual doesn't mention whether only certain HDMI ports accept this. I could try moving it around and testing a bit more. I am using the same input on the TV that works when connected directly, and I made sure it is set to PC and UHD color.
Nachbar is offline   Reply With Quote
Old 13th July 2015, 06:23   #31715  |  Link
6ari8
Registered User
 
Join Date: Jun 2012
Posts: 43
Quote:
Originally Posted by Anime Viewer View Post
Yes, I have an HDMI port that I connect my TV to, but both the TV and the laptop screen are shown as going through the Intel GPU, like in the second image you posted. Here is the PhysX display diagram on my system:
Mine's like the second one. Yours seems to be like the third image.

Unfortunately, that means that all your output ports are connected to the Intel GPU.

I don't know what gaming laptop makers were thinking when they chose to connect the output ports to the iGPU. Anyone using their laptop to drive monitors through those ports would likely be running on a power supply anyway, so connecting them to the discrete GPU would have been a much better choice.

With your setup, I think you should use exclusive mode with either D3D9 or D3D11, because that brings the present stats down to around 1-3ms for me when I'm using my HDMI port. Windowed mode gives me 8-9ms presents.

Last edited by 6ari8; 13th July 2015 at 06:27.
6ari8 is offline   Reply With Quote
Old 13th July 2015, 07:37   #31716  |  Link
trip_let
Registered User
 
Join Date: Sep 2012
Posts: 47
I was checking through a variety of content to find bad samples for s-xbr and just ran across a case where s-xbr doubling at any sharpness gets confused. I'm sure there are others, but in case someone's never seen what weirdness can happen sometimes, here it is.

s-xbr 50 vs. nnedi3 16
http://screenshotcomparison.com/comparison/134926

Original image


It doesn't really matter which s-xbr setting; with all of them, it gets confused and erases some of the lines in the background grid in the box with the 32275GP. And there's nothing special about nnedi3, which was just used for comparison; it could have been anything else. I think SuperRes may be on for both, but that's beside the point and not consequential.
trip_let is offline   Reply With Quote
Old 13th July 2015, 08:01   #31717  |  Link
iSunrise
Registered User
 
Join Date: Dec 2008
Posts: 496
@trip_let:
Your example shows perfectly why super-xbr in its current state is not a contender to NNEDI3 at all. Personally, I found the loss of picture detail and the strange alterations in various samples disturbing. Your example (and other tests I did yesterday) only shows that:

1) super-xbr seems to completely erase/destroy very visible picture details as well as fine detail (look at the various 1s at the bottom: it almost turns the 1 into an I, which is unacceptable)
2) NNEDI3 is a lot sharper than super-xbr (a direct result of super-xbr's loss of fine detail)
3) super-xbr "rounds everything" and acts as a kind of anti-aliasing filter, adding picture information where there was none before

People who want an accurate representation of the original should stay away from it, even though it might be a lot faster.

Personally, I found the new bilateral chroma scaler extremely promising, at least on the samples I've watched closely. It's also extremely fast, looks great, and is a perfect combination with NNEDI3 doubling and Bicubic50/75 for luma upscaling. Very good results if you can take the NNEDI3 performance hit.
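For those wondering what "bilateral" means here: the chroma planes are upscaled using the already high-resolution luma plane as a guide, so chroma edges snap to luma edges instead of bleeding across them. A rough joint-bilateral C++ sketch of that idea follows; the weights, sigma and 4:2:0 sampling positions are illustrative assumptions, not madshi's implementation.

Code:
#include <algorithm>
#include <cmath>
#include <vector>

// Joint-bilateral chroma upscaling sketch (one 4:2:0 chroma plane -> luma size).
// Each output sample averages nearby low-res chroma taps, weighted by spatial
// distance AND by how similar the guide luma is, so chroma follows luma edges.
std::vector<float> bilateralChromaUp(const std::vector<float>& chroma, int cw, int ch,
                                     const std::vector<float>& luma, int lw, int lh,
                                     float sigmaL = 0.05f /* illustrative */)
{
    std::vector<float> out(static_cast<size_t>(lw) * lh);
    for (int y = 0; y < lh; ++y)
        for (int x = 0; x < lw; ++x) {
            float lc = luma[y * lw + x];   // guide value at the output pixel
            int cx = x / 2, cy = y / 2;    // nearest low-res chroma coordinate
            float sum = 0.0f, wsum = 0.0f;
            for (int dy = -1; dy <= 1; ++dy)
                for (int dx = -1; dx <= 1; ++dx) {
                    int sx = std::clamp(cx + dx, 0, cw - 1);
                    int sy = std::clamp(cy + dy, 0, ch - 1);
                    // guide luma at the tap position (co-sited approximation)
                    float lt = luma[std::min(sy * 2, lh - 1) * lw + std::min(sx * 2, lw - 1)];
                    float dl = (lc - lt) / sigmaL;
                    float wgt = std::exp(-0.5f * (dl * dl + dx * dx + dy * dy)) + 1e-6f;
                    sum  += wgt * chroma[sy * cw + sx];
                    wsum += wgt;
                }
            out[y * lw + x] = sum / wsum;
        }
    return out;
}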
iSunrise is offline   Reply With Quote
Old 13th July 2015, 08:25   #31718  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,336
Pixel art is a special kind of content, and resizers/doublers designed for generic content will just not always handle it properly. It's not a very convincing example of anything other than the fact that super-xbr doesn't work nicely for pixel art.
__________________
LAV Filters - open source ffmpeg based media splitter and decoders
nevcairiel is offline   Reply With Quote
Old 13th July 2015, 10:38   #31719  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,140
Quote:
Originally Posted by leeperry View Post
I would also be cool with creating a folder in mVR's folder such as "SuperRes HQ OFF" or something if that's not too much trouble please.
Ok, I could live with that. Will add it to a future build.

Quote:
Originally Posted by kasper93 View Post
@madshi: Something weird is going on. If I make the MPC-HC window size equal to the screen size, madVR goes into exclusive mode and positions the window in the center, even though it wasn't centered before. If I open the context menu and disable exclusive mode, the window goes back to its original place. The seek bar also doesn't work in this "exclusive mode"; it is probably not reported correctly.

To reproduce, open any file in MPC-HC and stretch the window to the display size (not the work area size). Basically, this means making MPC-HC maximum size; to do that you need to manually grab the window borders and stretch it.

I will look into the details tomorrow. But it looks like madVR has "if (windowSize == displaySize) GoToRetardedExclusiveMode();" logic.
That sounds quite weird! I'm having trouble reproducing it, though. I suppose the MPC-HC window is not centered on the screen in this situation, is it? Could you create a debug log that shows this situation?

Quote:
Originally Posted by Warner306 View Post
Today, I attempted to embrace image sharpening as a tool to improve an image rather than enhance its artifacts. I found the following settings beneficial in improving the quality of high-definition video:

1080p -> 1080p
Image Enhancements - FineSharp (strength 0.6)

720p -> 1080p
Upscaling Refinement - SuperRes (medium: strength=1.0; passes=1)

SuperRes is my favourite of the two. Its ringing is not obvious or bothersome compared to other sharpeners. I'm curious whether the luma and chroma are doubled together, or just the luma, as with the other shaders?

An AR filter for FineSharp would be great.
I had tried to add an AR filter for FineSharp, but my first try didn't work very well. Will have to try again later...
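For reference, the usual AR approach is to clamp each sharpened pixel back into the min/max range of its original 3x3 neighborhood, so halos cannot overshoot past any value that was already there. A minimal C++ sketch of that generic idea (not madVR's filter):

Code:
#include <algorithm>
#include <vector>

// Sketch of a basic anti-ringing clamp: after sharpening, limit each pixel
// to the min/max of its 3x3 neighborhood in the ORIGINAL image, so the
// sharpener cannot create values (halos) outside the local original range.
void antiRingClamp(std::vector<float>& sharpened, const std::vector<float>& orig,
                   int w, int h)
{
    auto at = [&](int x, int y) {
        return orig[std::clamp(y, 0, h - 1) * w + std::clamp(x, 0, w - 1)];
    };
    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x) {
            float lo = at(x, y), hi = lo;
            for (int dy = -1; dy <= 1; ++dy)
                for (int dx = -1; dx <= 1; ++dx) {
                    float v = at(x + dx, y + dy);
                    lo = std::min(lo, v);
                    hi = std::max(hi, v);
                }
            sharpened[y * w + x] = std::clamp(sharpened[y * w + x], lo, hi);
        }
}

A hard clamp like this also cuts legitimate overshoot, which may be part of why a first attempt can look worse than expected; practical AR filters usually blend toward the clamped value instead of snapping to it.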

Quote:
Originally Posted by Nachbar View Post
Does madVR support 36-bit deep color output, or does it cap out at 30-bit?
Direct3D11 supports either 8bit, 10bit or 16bit, but not 12bit. The GPU drivers make of that whatever they want: if I output 10bit, some GPU drivers output 10bit, others 12bit, and still others 8bit or 16bit. So in other words, I unfortunately don't have exact control over what the GPU outputs.
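To make that concrete, these are the only render target bit depths DXGI actually offers; there is simply no 12bit surface format to ask for, so anything beyond the chosen swap chain format is the driver's decision. A minimal illustrative helper:

Code:
#include <dxgi.h>

// The only per-channel bit depths a D3D11 swap chain can be created with.
// A 12bit format does not exist, so 12bit output is purely a driver choice.
DXGI_FORMAT PickSwapChainFormat(int bitsPerChannel)
{
    switch (bitsPerChannel) {
    case 8:  return DXGI_FORMAT_R8G8B8A8_UNORM;     //  8bit per channel
    case 10: return DXGI_FORMAT_R10G10B10A2_UNORM;  // 10bit color, 2bit alpha
    case 16: return DXGI_FORMAT_R16G16B16A16_FLOAT; // 16bit half-float
    default: return DXGI_FORMAT_UNKNOWN;            // e.g. 12bit: not available
    }
}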

Quote:
Originally Posted by trip_let View Post
I was checking through a variety of content to find bad samples for s-xbr and just ran across a case where s-xbr doubling at any sharpness gets confused. I'm sure there are others, but in case someone's never seen what weirdness can happen sometimes, here it is.

s-xbr 50 vs. nnedi3 16
http://screenshotcomparison.com/comparison/134926

Original image


It doesn't really matter which s-xbr setting; with all of them, it gets confused and erases some of the lines in the background grid in the box with the 32275GP. And there's nothing special about nnedi3, which was just used for comparison; it could have been anything else. I think SuperRes may be on for both, but that's beside the point and not consequential.
Interesting!

Quote:
Originally Posted by iSunrise View Post
@trip_let:
Your example shows perfectly why super-xbr in its current state is not a contender to NNEDI3 at all. Personally, I found the loss of picture detail and the strange alterations in various samples disturbing. Your example (and other tests I did yesterday) only shows that:

1) super-xbr seems to completely erase/destroy very visible picture details as well as fine detail (look at the various 1s at the bottom: it almost turns the 1 into an I, which is unacceptable)
2) NNEDI3 is a lot sharper than super-xbr (a direct result of super-xbr's loss of fine detail)
3) super-xbr "rounds everything" and acts as a kind of anti-aliasing filter, adding picture information where there was none before

People who want an accurate representation of the original should stay away from it, even though it might be a lot faster.
Quote:
Originally Posted by nevcairiel View Post
Pixel art is a special kind of content, and resizers/doublers designed for generic content will just not always handle it properly. It's not a very convincing example of anything other than the fact that super-xbr doesn't work nicely for pixel art.
Agree with nevcairiel. Although super-xbr was originally made for pixel art! Which means maybe Hyllian wants to look into this issue? Not sure if it's easily fixable, though.

In any case, yes, NNEDI3 is superior to super-xbr in quality - but at a multiple of the performance cost. It's your decision which algo to use, of course.

Quote:
Originally Posted by iSunrise View Post
Personally, I found the new bilateral chroma scaler extremely promising, at least on the samples I've watched closely. It's also extremely fast, looks great, and is a perfect combination with NNEDI3 doubling and Bicubic50/75 for luma upscaling. Very good results if you can take the NNEDI3 performance hit.
There are some samples that show significant problems with the bilateral chroma upscaler, though. So once we concentrate on that, we'll have to collect such samples and try to fix them. For now, I'm still concentrating on SuperRes.
madshi is offline   Reply With Quote
Old 13th July 2015, 10:58   #31720  |  Link
madshi
Registered Developer
 
Join Date: Sep 2006
Posts: 9,140
madVR v0.88.17 released

http://madshi.net/madVR.zip

Code:
* madVR now renders in paused and stopped mode, too
* added automatic OSD low latency logic
* added SuperRes anti-ringing filter
* fixed small SuperRes quality deterioration introduced in v0.88.16
* fixed: high GPU consumption in paused mode (PotPlayer, Kodi DSPlayer)
* all (useful) IVideoWindow APIs now work even when no pins are connected
For users, this build is probably not much different from Saturday's test build. Well, at least I hope so. However, for some media player developers there's a big (positive) change. I've been asked for a long time to render in paused and stopped modes, and now it's finally implemented. This required some deeper changes, though, so there's a certain danger of new bugs showing up.

Notes for media player developers:

1) Please set the owner/parent *before* you connect the pins.
2) All the various OSD interfaces in madVR now also work in paused and stopped mode. Maybe you can make use of it in some way?
3) If you're using IOsdRenderCallback, *PLEASE* make sure that your ClearBackground() and RenderOsd() callbacks return "ERROR_EMPTY" if there is no active OSD on screen. This is very important: if you don't return ERROR_EMPTY, madVR will switch into low latency mode to speed up your OSD reaction times, which is good for OSD latency but not good for video playback reliability. (A sketch follows after this list.)
4) If you're using IMadVROsdServices::OsdSetBitmap, there's a new flag (see header files) that tells madVR whether your OSD bitmap needs low latency or not. Low latency makes sense for OSD elements the user can use to control something, but probably not for purely informational OSD elements.
5) You can see whether madVR is in low latency mode by checking the size of the "present queue". In low latency mode this queue is limited to 2 frames.
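For point 3, here is a rough C++ skeleton of the intended pattern. The method parameter lists are deliberately elided, because the authoritative declarations are in the madVR header files; whether madVR expects the raw Win32 ERROR_EMPTY code or an HRESULT wrapping should also be verified there.

Code:
#include <windows.h>

// Sketch only: parameter lists and return-code wrapping are assumptions,
// check madVR's header files for the authoritative IOsdRenderCallback
// declarations. ERROR_EMPTY is the standard winerror.h constant.
class MyOsdCallback /* : public IOsdRenderCallback */
{
    bool m_osdVisible = false;   // track whether any OSD element is on screen

public:
    HRESULT ClearBackground(/* parameters per the madVR header */)
    {
        if (!m_osdVisible)
            return HRESULT_FROM_WIN32(ERROR_EMPTY); // no OSD: keeps madVR out of low latency mode
        // ... paint the OSD background here ...
        return S_OK;
    }

    HRESULT RenderOsd(/* parameters per the madVR header */)
    {
        if (!m_osdVisible)
            return HRESULT_FROM_WIN32(ERROR_EMPTY);
        // ... paint the OSD foreground here ...
        return S_OK;
    }
};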
madshi is offline   Reply With Quote