Quote:
Originally Posted by cokeefe
You don't say what GPU you are running, but I've had similar behavior with my RX 480 (loss of signal/sync, corruption). Combined with the poor NGU performance and drivers, I am regretting buying my first non-NVidia card for my dedicated HTPC.
Two things did help in my case:
-force vsync always on in the Crimson driver - it does not seem to honor the "let application decide" flag with MadVR.
-decrease both CPU and GPU queues to the minimum 4
These have reduced but not completely eliminated the flickering (DX11 exclusive).
On another note, @Madshi, I'm one of the silent users who have enjoyed MadVR for several years but never post, as I am not as technical as most folks here. THANK YOU for this remarkable software (especially NGU!)
I also appreciate the move toward simplification and self-configuration - it makes MadVR so much more accessible to the "dull average" crowd I count myself among!
What driver are you using? The new 16.12.1 drivers are much better than the older ones. I got shot down for saying that a couple of pages back, but seeing as they are working 'perfectly' for me, I'm happy with them. D3D11 exclusive mode, with thin edges, enhance details, adaptive sharpen shaders, no trading quality for performance, using NGU, etc.
Now there are two schools of thought regarding GPU capability. The first is to max out all the settings, including NGU etc., so that in a side-by-side comparison with your eyes two inches from a 55-inch screen you may be able to tell a slight quality difference between those settings and lower ones, in the right scene. The second is to use more realistic settings and let the GPU peacefully tick over, but without compromising perceived quality. The last point is important!
Basically, there is no point (in my opinion) in using higher settings when there is no perceptible (as opposed to psychological) difference while actually watching the video as you normally would (not doing zoomed-in analysis or looking closely at still images), especially when you may be using three times the GPU power to do so.
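To put rough numbers on that trade-off, here's a quick back-of-the-envelope sketch. The wattages are purely illustrative placeholders, not measurements; read your own values off GPU-Z's sensors tab:

```python
# Rough cost of maxed-out settings vs. perceptually equivalent ones.
# The wattages are illustrative placeholders, not real measurements.
maxed_watts = 150       # hypothetical draw with everything cranked up
relaxed_watts = 50      # hypothetical draw with "good enough" settings
movie_hours = 2.0

extra_kwh = (maxed_watts - relaxed_watts) * movie_hours / 1000
print(f"Extra energy per movie: {extra_kwh:.2f} kWh")
print(f"Power ratio: {maxed_watts / relaxed_watts:.1f}x for the same perceived picture")
```

And the electricity is the least of it: all those extra watts come back out as heat and fan noise.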
Download the latest GPU-z from here:
https://www.techpowerup.com/downloads/SysInfo/GPU-Z/
Once it's running (you don't need to install it, just select 'No' or 'Not Now'), click on the Sensors tab. This will show you the GPU clocks, temperature, load, and GPU power draw (on the newer cards). Assuming you have dual screens, play back on the screen you wish to use and keep GPU-Z on the second screen. Watch the GPU load and adjust settings as necessary. I use NGU Medium (or High) for luma, Jinc AR for upscaling after doubling (only used if required), and 'use "image downscaling" settings' for downscaling. Of the downscalers, SSIM is the sharpest but introduces aliasing, and SSIM 2D at 100% is the sharpest variant but also the most GPU-unfriendly. I think a good compromise is Jinc with linear light and anti-ringing enabled. For chroma upscaling, Jinc with anti-ringing and SuperRes (1 or 2) is good.
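If you want more than eyeballing the sensors window, GPU-Z can also log to a file ('Log to file' on the Sensors tab), and the log is easy to summarise. A minimal sketch, assuming the default log file name and a column header like "GPU Load [%]" (both vary by GPU-Z version and card, so check yours):

```python
# Summarise a GPU-Z sensor log (comma-separated, written by 'Log to file').
# LOG_PATH and LOAD_COL are assumptions: check your own file name and
# header row, as they differ between GPU-Z versions and cards.
import csv

LOG_PATH = "GPU-Z Sensor Log.txt"
LOAD_COL = "GPU Load [%]"

loads = []
with open(LOG_PATH, encoding="utf-8", errors="replace", newline="") as f:
    for raw in csv.DictReader(f):
        # GPU-Z pads fields with spaces; normalise keys and values
        row = {k.strip(): (v or "").strip() for k, v in raw.items() if k}
        try:
            loads.append(float(row[LOAD_COL]))
        except (KeyError, ValueError):
            continue  # skip repeated headers and blank/malformed rows

if loads:
    print(f"samples:          {len(loads)}")
    print(f"average GPU load: {sum(loads) / len(loads):.1f}%")
    print(f"peak GPU load:    {max(loads):.1f}%")
```

Play a few representative clips with logging on, then run this to see whether your settings leave the GPU ticking over or pinned at 100%.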
With upscaling refinement I have 'add grain' enabled, and for dithering I use 'Random dithering'. Error Diffusion may be 'better', but as others have pointed out, it's unnecessary when outputting to an 8- or 10-bit display.
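For the curious, the intuition behind dithering is that it trades banding for fine noise. A toy numpy sketch of the general idea of random dithering (a simplification for illustration, not madVR's actual code):

```python
# Quantise a very gentle 16-bit gradient down to 8 bits, with and
# without random dither, and measure the longest flat "band".
import numpy as np

rng = np.random.default_rng(0)
# A gradient spanning just four 8-bit steps over 4096 pixels.
ramp16 = np.linspace(1000.0, 1000.0 + 4 * 257, 4096)

plain = np.round(ramp16 / 257).astype(int)                      # visible bands
dither = np.floor(ramp16 / 257 + rng.random(ramp16.size)).astype(int)

def longest_run(a):
    # Length of the longest run of identical neighbouring values.
    edges = np.flatnonzero(np.diff(a) != 0)
    bounds = np.concatenate(([-1], edges, [a.size - 1]))
    return int(np.diff(bounds).max())

print("longest flat band, plain rounding:", longest_run(plain))   # ~1024 px
print("longest flat band, random dither: ", longest_run(dither))  # far shorter
```

Plain rounding turns the gradient into wide flat bands; random dither flips each pixel between the two nearest levels in proportion to the error, so the eye averages it back to a smooth ramp.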
So, with the latest drivers, try the following (condensed into a sketch after the list):
Devices: set the highest display bit depth the display can handle
Artifact removal: debanding low/high (or medium/high), reduce ringing artifacts
Image enhancements: thin edges (1.0), enhance detail (1.0), adaptive sharpen (0.3, for example), anti-ringing
Chroma upscaling: Jinc, anti-ringing, SuperRes (1 or 2)
Image downscaling: Jinc, linear light, anti-ringing relaxed
Image upscaling: NGU medium (or high, if you can actually see the difference when watching a video); after doubling, Jinc AR, with 'use "image downscaling" settings' for downscaling
Upscaling refinement: add grain (maybe 1 or 2, depending on your taste). A tiny bit of added grain can perceptibly bring out details whilst hiding artifacts.
Rendering > general settings: delay playback until the render queue is full, overlay mode (not sure if that does anything in Win 10), and fullscreen exclusive mode and separate device both enabled. Large CPU and GPU queue sizes seem to work fine (say, 64 and 8, or whatever).
The windowed and exclusive mode settings are fine at their defaults (present 8 frames in advance, etc.), along with random dithering and no 'trade quality for performance' options enabled.
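To keep all of the above in one place, here is the same recipe condensed into a printable checklist. To be clear, madVR has no scripting API; the key names below are just my informal labels for GUI options, not real configuration identifiers:

```python
# Suggested madVR starting point as a plain checklist. The keys are
# informal labels for settings in the madVR GUI, not a real config API.
suggested_settings = {
    "devices": {"bit depth": "highest the display can handle"},
    "artifact removal": {"debanding": "low/high (or medium/high)",
                         "reduce ringing artifacts": True},
    "image enhancements": {"thin edges": 1.0, "enhance detail": 1.0,
                           "adaptive sharpen": 0.3, "anti-ringing": True},
    "chroma upscaling": {"algorithm": "Jinc", "anti-ringing": True,
                         "SuperRes": "1 or 2"},
    "image downscaling": {"algorithm": "Jinc", "linear light": True,
                          "anti-ringing": "relaxed"},
    "image upscaling": {"luma": "NGU medium (or high)",
                        "after doubling": "Jinc AR",
                        "downscaling": "use image downscaling settings"},
    "upscaling refinement": {"add grain": "1 or 2, to taste"},
    "rendering": {"delay playback until queue full": True,
                  "overlay mode": True,
                  "fullscreen exclusive mode": True,
                  "separate device": True,
                  "CPU/GPU queue sizes": (64, 8),
                  "dithering": "random",
                  "trade quality for performance": "none"},
}

for section, options in suggested_settings.items():
    print(section)
    for name, value in options.items():
        print(f"  {name}: {value}")
```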
Now, on drivers before 16.12.1 these settings wouldn't work, mainly because of DX11 exclusive mode, and playback wasn't smooth either. On 16.12.1 with these settings the GPU just ticks over nicely. You will see the GPU-only power draw stay low because the GPU clock remains low, which means the card runs cool and stays under 60C (for me). The fans may not even have to run!
If, to see whether something is better, you need to take a ridiculously low-resolution source (320x240, for example), blow it up to 4K on a 60-inch screen, and look closely at a still image (or use any other variation on not actually watching the video normally) to be able to tell the difference, then the difference is not worth quibbling over when those settings require much more processing power.
Back to the new driver: I can even have the computer turn off the display while paused in windowed mode, and it will now just wake up and play perfectly again! Actually, all the faults I reported earlier seem to be fixed, so it probably was a driver issue. It's just strange that Nvidia users were affected as well; it's probably related to the updated API support in the drivers (such as SM 6.0, DirectX 11.4, WDDM 2.1, and other changes) not being perfected yet, particularly since for both Nvidia and AMD you need the new cards (GTX 1000 series / RX 400 series) to make use of it.
When starting playback after the monitor turns off and resumes, you do need to go back to fullscreen mode (or to windowed mode and then fullscreen, if you paused in fullscreen) to reset the output. CLSID mentioned this earlier; it's not a driver or madVR thing, it's just a Windows thing. There haven't been any associated issues with this (such as MPC-HC freezing) on the new drivers.
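If toggling fullscreen by hand after every monitor sleep gets tedious, it can be scripted. A minimal sketch for Windows, assuming the player currently has keyboard focus and that its fullscreen toggle is the usual Alt+Enter (MPC-HC's default binding; adjust for your player):

```python
# Send Alt+Enter twice to the focused player: drop out of fullscreen,
# then re-enter it, which resets madVR's output after a monitor resume.
# Windows only; assumes the player (e.g. MPC-HC with default bindings)
# has keyboard focus when this runs.
import ctypes
import time

user32 = ctypes.windll.user32
VK_MENU, VK_RETURN = 0x12, 0x0D   # Alt, Enter virtual-key codes
KEYEVENTF_KEYUP = 0x0002

def alt_enter():
    user32.keybd_event(VK_MENU, 0, 0, 0)                  # Alt down
    user32.keybd_event(VK_RETURN, 0, 0, 0)                # Enter down
    user32.keybd_event(VK_RETURN, 0, KEYEVENTF_KEYUP, 0)  # Enter up
    user32.keybd_event(VK_MENU, 0, KEYEVENTF_KEYUP, 0)    # Alt up

alt_enter()        # leave fullscreen
time.sleep(0.5)    # give the renderer a moment to re-create its output
alt_enter()        # and go back in
```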