Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion. Before you start posting please read the forum rules. By posting to this forum you agree to abide by the rules. |
12th July 2015, 19:55 | #31701 | Link |
Registered User
Join Date: Dec 2011
Posts: 1,812
|
Mostly agreed.
I really like the effect of AS as a UR with low values (e.g. 0.2) for NNEDI3. Everything is still well reconstructed, but much of that softness disappears - especially if you sharpen after each 2x upscaling step. Thanks to its adaptive nature, it doesn't oversharpen already-sharpened areas, and very little aliasing is introduced. Maybe this works a bit better with naturalistic content than with artificial content, due to the thickening of black lines. Without sharpening, though, those lines are very soft with NNEDI3. |
12th July 2015, 23:33 | #31702 | Link |
Guest
Posts: n/a
|
I still get a ton of presentation glitches on my 120Hz monitor, even if I set it to 60Hz. The exact same settings work perfectly fine on my TV @ 23Hz. I don't get it. Average rendering times are identical on both TV and monitor - about 27-28ms (same as the max stats). I can reduce presentation glitches on the 120Hz monitor by enabling "present a frame for every VSync", but then they still occur (1 every 10 seconds or so). However, enabling this option during playback on my TV @ 23Hz creates dropped frames and presentation glitches. Setting "present several frames in advance" to 1 also helped to reduce presentation glitches (1 every 15-30 seconds). Using less GPU-taxing settings made no difference.
The second issue I get on my TV is that at times, when I pause playback and resume it, I get severe playback stuttering, which I can fix by exiting fullscreen exclusive mode to windowed mode and then going back to fullscreen exclusive mode. That does not happen on my 120Hz monitor though... I reset madVR settings to default, uninstalled it as admin, restarted, installed madVR as admin, reset the settings back to default, and applied the settings I use. That did not help. EDIT: Could these issues be caused by setting "Maximum Pre-Rendered Frames" to 1 in the NVidia Control Panel? I set it that way because it reduces stuttering associated with using Borderless Window mode in PC games. Last edited by XMonarchY; 12th July 2015 at 23:45. |
12th July 2015, 23:42 | #31703 | Link |
Registered Developer
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,346
|
High refresh rate screens can be a bit fiddly. You may need to increase the queue sizes quite a bit to accommodate the higher refresh rate, which tends to reduce the glitches for me, and "present a frame every vsync" is pretty much mandatory.
Personally, it's not that I see the glitches; they just pile up on the counter. PS: There will never be settings that work 100% perfectly on every system, so having to change a setting here or there, depending on whether you're playing 23.976 on a 120Hz screen or having it play 1:1 on a 23p TV, is not uncommon.
__________________
LAV Filters - open source ffmpeg based media splitter and decoders |
13th July 2015, 00:18 | #31704 | Link |
MPC-HC Developer
Join Date: May 2010
Location: Poland
Posts: 586
|
@madshi: Something weird is going on. If I make the MPC-HC window size equal to the screen size, madVR will go into exclusive mode and position the window in the center, even though it wasn't there before. If I open the context menu to disable exclusive mode, the window goes back to its original place. Also, the seekbar doesn't work in this "exclusive mode"; it is probably not reported correctly.
To reproduce, open any file in MPC-HC and stretch the window to display size (not work area size). Basically this means making the MPC-HC window its maximum size: you need to manually grab the window borders and stretch it. I will look into the details tomorrow, but it looks like madVR has "if (windowSize == displaySize) GoToRetardedExclusiveMode();" logic |
13th July 2015, 00:24 | #31705 | Link | |
Guest
Posts: n/a
|
Quote:
|
|
13th July 2015, 00:31 | #31706 | Link | |
Registered Developer
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,346
|
Quote:
Luckily the NVIDIA panel has per-app settings!
__________________
LAV Filters - open source ffmpeg based media splitter and decoders |
|
13th July 2015, 01:05 | #31709 | Link |
Registered User
Join Date: Dec 2011
Posts: 1,812
|
Vsync, pre-render limit, power management mode and Optimus settings are the only driver settings that should affect madVR. Everything should be kept at the driver default (app-controlled) in the profile of your media player.
If you change settings globally, you have to set them back in each profile. |
13th July 2015, 03:10 | #31710 | Link | |
Troubleshooter
Join Date: Feb 2014
Posts: 339
|
Quote:
Edit: Windowed Full Screen (D3D11) runs with the 11ms presents, but if I run in D3D11 Exclusive (10 bit) it's back down to 3ms again. Edit #2: Unchecking "present a frame for every VSync" also seems to be a way to bring it back down to 3ms while in (D3D11) Windowed Full Screen mode.
__________________
System specs: Sager NP9150 SE with i7-3630QM 2.40GHz, 16 GB RAM, 64-bit Windows 10 Pro, NVidia GTX 680M/Intel 4000 HD optimus dual GPU system. Video viewed on LG notebook screen and LG 3D passive TV. Last edited by Anime Viewer; 13th July 2015 at 03:30. Reason: added details about D3D11 mode effects |
|
13th July 2015, 03:46 | #31711 | Link | |
Registered User
Join Date: Jun 2012
Posts: 43
|
Quote:
|
|
13th July 2015, 03:48 | #31712 | Link |
Registered User
Join Date: Dec 2014
Posts: 1,127
|
Today, I attempted to embrace image sharpening as a tool to improve an image rather than enhance its artifacts. I found the following settings beneficial in improving the quality of high-definition video:
1080p -> 1080p: Image Enhancements - FineSharp (strength 0.6)
720p -> 1080p: Upscaling Refinement - SuperRes (medium: strength=1.0; passes=1)
SuperRes is my favourite of the two. Its ringing is not obvious or bothersome compared to other sharpeners. I'm curious whether the luma and chroma are doubled together, or just the luma like the other shaders? An AR filter for FineSharp would be great. Last edited by Warner306; 13th July 2015 at 07:34. |
13th July 2015, 05:10 | #31713 | Link | |
Troubleshooter
Join Date: Feb 2014
Posts: 339
|
Quote:
__________________
System specs: Sager NP9150 SE with i7-3630QM 2.40GHz, 16 GB RAM, 64-bit Windows 10 Pro, NVidia GTX 680M/Intel 4000 HD optimus dual GPU system. Video viewed on LG notebook screen and LG 3D passive TV. |
|
13th July 2015, 05:42 | #31714 | Link |
Registered User
Join Date: Jun 2012
Posts: 33
|
Does madvr support 36-bit deep color output or does it cap out at 30-bit?
I ask this because I believe my receiver is taking the 30-bit output and down-converting it to 24-bit. It is a Pioneer VSX 1021-K, whose specs say it supports 36-bit deep color, with no mention of anything else. If I connect my computer directly to my TV, it is fine and the test picture I am viewing shows no banding, but if I go through the receiver, it shows banding. I have made sure the display properties in madVR are set to 10-bit or higher, and made sure in Catalyst Control Center that the display is set to 12-bit. The manual doesn't mention whether only certain HDMI ports accept this or not; I could try moving it around and testing a bit more. I am using the same input on the TV that works, and made sure it is set to PC and UHD color. |
13th July 2015, 06:23 | #31715 | Link | |
Registered User
Join Date: Jun 2012
Posts: 43
|
Quote:
Unfortunately, that means that all your output ports are connected to the Intel GPU. I don't know what gaming laptop makers were thinking by choosing to connect the output ports to the iGPU. I think anyone using their laptop to drive monitors with those ports would be using a power supply, so connecting them to the discrete GPU would have been a much better choice. With your setup, I think you should use exclusive mode with either D3D9 or D3D11, because that brings the present stats down to around 1-3ms for me when I'm using my HDMI port. Windowed mode gives me 8-9ms presents. Last edited by 6ari8; 13th July 2015 at 06:27. |
|
13th July 2015, 07:37 | #31716 | Link |
Registered User
Join Date: Sep 2012
Posts: 47
|
I was checking through a variety of content to find bad samples for s-xbr and just ran across a case where s-xbr doubling at any sharpness gets confused. I'm sure there are others, but in case someone's never seen what weirdness can happen sometimes, here it is.
s-xbr 50 vs. nnedi3 16: http://screenshotcomparison.com/comparison/134926 (original image). It doesn't really matter which s-xbr setting is used; with all of them it gets confused and erases some of the lines of the background grid in the box with the 32275GP. And there's nothing special about nnedi3, which was just used for comparison; it could have been anything else. I think SuperRes may be on for both, but that's beside the point and not consequential. |
13th July 2015, 08:01 | #31717 | Link |
Registered User
Join Date: Dec 2008
Posts: 496
|
@trip_let:
Your example perfectly shows why super-xbr in its current state is not a contender against NNEDI3 at all. Personally, I found the loss of picture detail disturbing, as well as the strange alterations in various samples. Your example (and other tests I did yesterday) only shows that:
1) super-xbr seems to completely erase/destroy very visible picture details and also fine detail (look at the various 1s at the bottom; it almost modifies the 1 into an I, unacceptable)
2) NNEDI3 is a lot sharper than super-xbr (which is a result of the loss of fine detail)
3) super-xbr "rounds everything" and acts as some kind of anti-aliasing filter, adding picture information where there was none before
People who want an accurate representation of the original should stay away from it, even though it might be a lot faster. Personally, I found the new bilateral chroma scaler extremely promising, at least on the samples I've watched closely. It's also extremely fast, looks great, and is a perfect combination with NNEDI3 doubling and Bicubic50/75 for luma upscaling. Very good results if you can take the NNEDI3 performance hit. |
13th July 2015, 08:25 | #31718 | Link |
Registered Developer
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,346
|
Pixel art is a special kind of content, and resizers/doublers designed for generic content will just not always handle it properly. It's not a very convincing example of anything other than the fact that it doesn't work nicely for pixel art.
__________________
LAV Filters - open source ffmpeg based media splitter and decoders |
13th July 2015, 10:38 | #31719 | Link | ||||||||
Registered Developer
Join Date: Sep 2006
Posts: 9,140
|
Quote:
Quote:
Quote:
Quote:
Quote:
Quote:
Quote:
Agree with nevcairiel. Although super-xbr was originally made for pixel art! Which means Hyllian may want to look into this issue? Not sure if it's easily fixable, though. In any case, yes, NNEDI3 is superior to super-xbr in quality - but at a multiple of the performance cost. It's your decision which algo to use, of course. Quote:
|
13th July 2015, 10:58 | #31720 | Link |
Registered Developer
Join Date: Sep 2006
Posts: 9,140
|
madVR v0.88.17 released
http://madshi.net/madVR.zip Code:
* madVR now renders in paused and stopped mode, too
* added automatic OSD low latency logic
* added SuperRes anti-ringing filter
* fixed a small SuperRes quality deterioration introduced in v0.88.16
* fixed: high GPU consumption in paused mode (PotPlayer, Kodi DSPlayer)
* all (useful) IVideoWindow APIs now work even when no pins are connected

Notes for media player developers:
1) Please set the owner/parent *before* you connect the pins.
2) All the various OSD interfaces in madVR now also work in paused and stopped mode. Maybe you can make use of it in some way?
3) If you're using IOsdRenderCallback, *PLEASE* make sure that your ClearBackground() and RenderOsd() callbacks return "ERROR_EMPTY" if there is no active OSD on screen. This is very important because if you don't return ERROR_EMPTY, madVR will switch into low latency mode to speed up your OSD reaction times. This is good for OSD latency, but not good for video playback reliability.
4) If you're using IMadVROsdServices::OsdSetBitmap, there's a new flag (see header files) that tells madVR whether your OSD bitmap needs low latency or not. Low latency makes sense for OSD elements the user can use to control something, but probably not for purely informational OSD elements.
5) You can see whether madVR is in low latency mode by checking the size of the "present queue". In low latency mode this queue is limited to 2 frames. |
Tags |
direct compute, dithering, error diffusion, madvr, ngu, nnedi3, quality, renderer, scaling, uhd upscaling, upsampling |