Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion.

Old 25th May 2019, 21:36   #56321  |  Link
ashlar42
Registered User
 
Join Date: Jun 2007
Posts: 652
Quote:
Originally Posted by bigboyman View Post
Is there a reason for having "Reduce compression artifacts" at 9? Such a high value could make sense just for really poor sources. I'd never use it on high resolution material, personally.
ashlar42 is offline   Reply With Quote
Old 25th May 2019, 21:38   #56322  |  Link
bigboyman
Registered User
 
Join Date: May 2019
Posts: 19
Tried using 2.0 instead of supersampling and it actually made it slightly worse (~70ms to ~80ms).

Here's the OSD with ShowRenderSteps (as many as I can see)
bigboyman is offline   Reply With Quote
Old 25th May 2019, 21:40   #56323  |  Link
bigboyman
Registered User
 
Join Date: May 2019
Posts: 19
Quote:
Originally Posted by ashlar42 View Post
Is there a reason for having "Reduce compression artifacts" at 9? Such a high value could make sense just for really poor sources. I'd never use it on high resolution material, personally.
I use this build mostly for files with resolutions below 720p (often 480p), which is why it's so high.
bigboyman is offline   Reply With Quote
Old 25th May 2019, 21:44   #56324  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
The settings you posted don't match the OSD, and "always supersampling" or 2.0 has no effect in this case.

Error diffusion is used, which will melt the GPU. Jinc scaling is used, which will melt the GPU. Most of the post-processing filters are used, which will, again, melt the GPU.
huhn is offline   Reply With Quote
Old 25th May 2019, 21:56   #56325  |  Link
bigboyman
Registered User
 
Join Date: May 2019
Posts: 19
Quote:
Originally Posted by huhn View Post
The settings you posted don't match the OSD, and "always supersampling" or 2.0 has no effect in this case.

Error diffusion is used, which will melt the GPU. Jinc scaling is used, which will melt the GPU. Most of the post-processing filters are used, which will, again, melt the GPU.
What settings are different in the OSD?
And what alternatives do you suggest I use?
bigboyman is offline   Reply With Quote
Old 25th May 2019, 22:01   #56326  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
I don't even know where to start. Deblocking should be used with NGU sharp for chroma to spare a bit of processing power, but it alone is already a lot of work for a 960.
Start from scratch.
huhn is offline   Reply With Quote
Old 25th May 2019, 22:07   #56327  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,406
Yes, for a 960 you will need to turn down or disable some settings; mostly, you can only use a few pre/post processing filters. Ordered dithering is also great quality and much faster than error diffusion.

If you really think something odd is happening after three minutes, do not look for issues in madVR; nothing changes in madVR after three minutes. That would be due to overheating, power saving options, or similar. Is Optimus behaving badly again? That tech does not work very well in my experience.

Monitor GPU clocks while watching with madVR; HWiNFO is a good tool.

Edit: try running without any pre/post filters, with show render steps on, so you can see all of them. Is anything obviously very slow?
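If you want a log instead of eyeballing an overlay, a minimal sketch along these lines works (the nvidia-smi query flags are standard options; the parsing helper is my own):

```python
# Sketch: query the SM clock and temperature via nvidia-smi's CSV output.
# The query/format flags are standard nvidia-smi options; parse_smi_line
# is just a helper for the "clock, temp" lines it prints.
import subprocess

QUERY = [
    "nvidia-smi",
    "--query-gpu=clocks.sm,temperature.gpu",
    "--format=csv,noheader,nounits",
]

def parse_smi_line(line):
    """Parse one "clocks.sm, temperature.gpu" CSV line, e.g. "405, 78"."""
    clock_str, temp_str = (field.strip() for field in line.split(","))
    return int(clock_str), int(temp_str)

def poll_once():
    """Run nvidia-smi once; return (sm_clock_mhz, temperature_c) for GPU 0."""
    out = subprocess.check_output(QUERY, text=True)
    return parse_smi_line(out.strip().splitlines()[0])
```

Call poll_once() in a loop while a video plays and you'll see exactly when the clock drops.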
__________________
madVR options explained

Last edited by Asmodian; 25th May 2019 at 22:10.
Asmodian is offline   Reply With Quote
Old 26th May 2019, 01:46   #56328  |  Link
bigboyman
Registered User
 
Join Date: May 2019
Posts: 19
Quote:
Originally Posted by Asmodian View Post
Yes, for a 960 you will need to turn down or disable some settings; mostly, you can only use a few pre/post processing filters. Ordered dithering is also great quality and much faster than error diffusion.

If you really think something odd is happening after three minutes, do not look for issues in madVR; nothing changes in madVR after three minutes. That would be due to overheating, power saving options, or similar. Is Optimus behaving badly again? That tech does not work very well in my experience.

Monitor GPU clocks while watching with madVR; HWiNFO is a good tool.

Edit: try running without any pre/post filters, with show render steps on, so you can see all of them. Is anything obviously very slow?
Using fewer pre/post processing filters (image corrections, not upscaling/downscaling) does improve performance by about 10ms, but they're also very important for good video quality at low resolutions, so disabling them kind of defeats the purpose of this whole setup.

Tried ordered dithering and it DOES improve performance without any visible quality loss, but only by about 5ms, so not a huge gain.

The clock is stable, so that's not an issue, but the GPU is hot (~85°C - hot, but not overheating hot) when madVR is working (with 70ms render times). Then again, that's probably due to hardware limits and not so much the Optimus drivers. I should also mention that I don't have energy saving mode enabled.

Now that I can see the rest of ShowRenderSteps, the render time (~65ms thanks to ordered dithering) breaks down approximately as follows:
  • Image corrections - 10ms
  • Deblock - 8ms
  • Chroma Upscaling - <1ms
  • Debanding - 2ms
  • Deringing - 7ms
  • NGU Doubling - 9ms
  • Jinc Upscaling - 10ms
  • Image scaling X+Y - 5.5ms
  • Subtitles - 7.5ms
  • Final Step - 3ms
What do you think I can afford to remove / compromise?
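To put that list in perspective, here's a quick back-of-the-envelope check against the 24 fps frame budget (numbers copied from the breakdown above; "<1ms" counted as 1ms):

```python
# Rough sanity check of the render-step breakdown against the 24 fps
# frame budget. Step timings are copied from the post; "<1ms" taken as 1ms.
steps_ms = {
    "image corrections": 10.0,
    "deblock": 8.0,
    "chroma upscaling": 1.0,   # reported as <1ms
    "debanding": 2.0,
    "deringing": 7.0,
    "NGU doubling": 9.0,
    "jinc upscaling": 10.0,
    "image scaling X+Y": 5.5,
    "subtitles": 7.5,
    "final step": 3.0,
}

frame_budget_ms = 1000 / 24          # ~41.7ms per frame at 24 fps
total_ms = sum(steps_ms.values())    # sums to 63.0ms, well over budget

print(f"total {total_ms:.1f}ms vs budget {frame_budget_ms:.1f}ms")
```

Anything over the budget means dropped frames, so roughly 20ms of steps have to go (or the GPU has to stop throttling).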

Last edited by bigboyman; 26th May 2019 at 01:53.
bigboyman is offline   Reply With Quote
Old 26th May 2019, 01:51   #56329  |  Link
bigboyman
Registered User
 
Join Date: May 2019
Posts: 19
Quote:
Originally Posted by huhn View Post
i don't even know where to start. deblocking should be used with NGU sharp for chroma to spare a bit of processing power but it alone is already alot of work for 960.
start from scratch.
Thanks for the advice. I don't notice any graphical difference (is there supposed to be one?) and can shave off 5ms thanks to this.

So far, this thread has already cut about 10ms off my render time with next to no graphical compromise. Keep it up, guys.
bigboyman is offline   Reply With Quote
Old 26th May 2019, 02:15   #56330  |  Link
brazen1
Registered User
 
Join Date: Oct 2017
Posts: 331
bigboyman,

I've been using a GTX 960 4GB for years, and I still do because I have no reason to upgrade. Every one of your settings is different from mine, and I've been seeking the most performance I can obtain out of the 960. I'll go as far as telling you they are all completely wrong, tbh. Even your LAV Filter settings. I use settings and profiles for every format there is, because I have a very diverse collection: 3D, 2D, SDR, HDR, 2160p through 580p, etc. Imo, your best starting point, and perhaps ending point, is provided in the guide in my signature. You can d/l the madVR .bin file and replace yours in your madVR folder to make things very simple; it's a copy of what I use. There are also links to pictures with settings if you'd rather do it manually. Granted, I am using a 4K display and I think you're using 1080p max, but the settings still apply.
__________________
HOW TO-Kodi 2D-3D-UHD (4k) HDR Guide Internal & External Players
W11 Pro 24H2 GTX960-4GB RGB 4:4:4 @Matched Refresh Rates 8,10,12bit
KODI 22 MPC-HC/BE 82" Q90R Denon S720W
brazen1 is offline   Reply With Quote
Old 26th May 2019, 02:42   #56331  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
Just showing advanced render times gives you higher render times; I never measured it, but it's more than zero.

The next problem is that you are currently scaling 480p; the render times for deblocking with a 720p source will skyrocket.

Is this a 960m? What is the clock? 85°C is pretty high for a Maxwell GPU.

Subtitles should only take 1-3 ms on a 960, even at UHD.

When deblocking is used with a strong setting, deringing doesn't do much anymore. Try removing deringing; if you still get ringing issues, try a stronger deblocking/RRN setting.

Upscale refinement is recommended over the use of image enhancement. Try replacing the top four "image enhancers" with the much cheaper AdaptiveSharpen. Beware: replacing image enhancement with upscale refinement costs more processing power and usually needs higher settings, but adaptive sharpening is so much cheaper and you are not creating as many upscaling artifacts. I can't remember a single use case where enhance details is useful for sources that are deblocked, because deblocking creates huge areas with no details.
The top four "enhancers" work best with clean/good sources anyway.

Using Jinc after doubling has highly diminishing returns; Lanczos 3 AR should look similar.
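For anyone curious what Lanczos 3 actually computes, this is the textbook windowed-sinc kernel (an illustration only, not madVR's actual shader; the AR clamp is a separate step on top):

```python
# Standard Lanczos kernel with a = 3: sinc(x) * sinc(x/3) for |x| < 3,
# 0 outside. Scalers weight nearby source pixels with this function.
import math

def lanczos3(x):
    """Lanczos-3 kernel value at offset x (in source pixels)."""
    if x == 0.0:
        return 1.0          # center tap has full weight
    if abs(x) >= 3.0:
        return 0.0          # outside the 3-lobe support
    px = math.pi * x
    return (math.sin(px) / px) * (math.sin(px / 3.0) / (px / 3.0))
```

The kernel is 1 at the center, 0 at every other integer offset, and slightly negative in between, which is where the ringing that the AR option suppresses comes from.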

If you can't framerate match, using smooth motion is mandatory in my opinion.

NGU sharp should be massively better for chroma quality. In general it is not a chroma scaler I would use, but it's free with deblocking, unlike super XBR or NGU AA.

super XBR 100 is worth a shot; it should be faster than NGU AA medium, but it looks clearly different.
You could also try "always super sample" for quadrupling (not for doubling), but beware: this can backfire spectacularly.

Quote:
Using fewer pre/post processing filters (image corrections, not upscaling/downscaling) does improve performance by about 10ms, but they're also very important for good video quality at low resolutions, so disabling them kind of defeats the purpose of this whole setup.
This is misguided; using more filters doesn't mean the result will be "more".
See deringing as an example.
huhn is offline   Reply With Quote
Old 26th May 2019, 02:42   #56332  |  Link
tp4tissue
Registered User
 
tp4tissue's Avatar
 
Join Date: May 2013
Posts: 706
Where do I enable show render steps?
__________________
Ghetto | 2500k 5Ghz
tp4tissue is offline   Reply With Quote
Old 26th May 2019, 03:01   #56333  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
https://forum.doom9.org/showthread.p...84#post1709584
It's under the hidden options, and there is a reason it is a hidden option.
huhn is offline   Reply With Quote
Old 26th May 2019, 18:54   #56334  |  Link
bigboyman
Registered User
 
Join Date: May 2019
Posts: 19
Quote:
Originally Posted by brazen1 View Post
bigboyman,

I've been using a GTX 960 4GB for years, and I still do because I have no reason to upgrade. Every one of your settings is different from mine, and I've been seeking the most performance I can obtain out of the 960. I'll go as far as telling you they are all completely wrong, tbh. Even your LAV Filter settings. I use settings and profiles for every format there is, because I have a very diverse collection: 3D, 2D, SDR, HDR, 2160p through 580p, etc. Imo, your best starting point, and perhaps ending point, is provided in the guide in my signature. You can d/l the madVR .bin file and replace yours in your madVR folder to make things very simple; it's a copy of what I use. There are also links to pictures with settings if you'd rather do it manually. Granted, I am using a 4K display and I think you're using 1080p max, but the settings still apply.
That seems pretty interesting and I'd definitely like to try it, but I can't find any mention of the madVR .bin in the post in your signature.
Maybe that's because it's so long. If you could provide a direct link, that would really help me out.
bigboyman is offline   Reply With Quote
Old 26th May 2019, 20:00   #56335  |  Link
bigboyman
Registered User
 
Join Date: May 2019
Posts: 19
Quote:
Originally Posted by huhn View Post
...

Is this a 960m? What is the clock? 85°C is pretty high for a Maxwell GPU.
As far as I've seen, 80°C is the point at which the transition from 20ms to 60ms occurs. It doesn't ever really go above 85°C.
Not sure what the significance of that is, but it could be useful info.

Quote:
Originally Posted by huhn View Post
Subtitles should only take 1-3 ms on a 960, even at UHD.
Using the internal decoder instead of XYsubs or ASSmod doesn't make a difference.
Not sure what I could use here. What settings in madVR would reduce subtitle render time?

Quote:
Originally Posted by huhn View Post
When deblocking is used with a strong setting, deringing doesn't do much anymore. Try removing deringing; if you still get ringing issues, try a stronger deblocking/RRN setting.
Do you mean disabling "Reduce ringing artifacts" or the anti-ringing filter?
Anyway, I disabled both. The difference is not insignificant - the ringing becomes a lot more obvious.
I wouldn't call it distracting, but it's definitely more noticeable.
Disabling "Reduce ringing artifacts" shaves off about 8ms and the AR filter only about 1ms.
I chose to disable both, since it's a huge performance gain (9-10ms) for only slightly annoying ringing, and the AR filter barely improves ringing anyway.

Quote:
Originally Posted by huhn View Post
Upscale refinement is recommended over the use of image enhancement. Try replacing the top four "image enhancers" with the much cheaper AdaptiveSharpen. Beware: replacing image enhancement with upscale refinement costs more processing power and usually needs higher settings, but adaptive sharpening is so much cheaper and you are not creating as many upscaling artifacts. I can't remember a single use case where enhance details is useful for sources that are deblocked, because deblocking creates huge areas with no details.
The top four "enhancers" work best with clean/good sources anyway.
It only saves about 2ms and looks significantly worse, so I'm not doing this one.

Quote:
Originally Posted by huhn View Post
Using Jinc after doubling has highly diminishing returns; Lanczos 3 AR should look similar.
Actually god-tier advice. Render time goes down by 9-10ms and it looks the same.

Quote:
Originally Posted by huhn View Post
If you can't framerate match, using smooth motion is mandatory in my opinion.
I don't think this is the problem, not to mention my render time shoots up like crazy (up by 30ms).

Quote:
Originally Posted by huhn View Post
NGU sharp should be massively better for chroma quality. In general it is not a chroma scaler I would use, but it's free with deblocking, unlike super XBR or NGU AA.
...
Already using it, thanks

------------------------------------------------------------------------------------------

So, just to recap: right now, despite some slightly annoying ringing, I've managed to get render times down to ~42ms. This is pretty good, but I'd like it to be consistently lower, since it's still too close to the 1/24s frame time, which means there are still frequent (but massively less noticeable) frame drops.

EDIT: Also disabled "Disable scaling if image size ..." and bumped compression artifact reduction to 14 (still medium quality), since it improves image quality at essentially zero added cost.

Last edited by bigboyman; 26th May 2019 at 20:22.
bigboyman is offline   Reply With Quote
Old 26th May 2019, 20:53   #56336  |  Link
bigboyman
Registered User
 
Join Date: May 2019
Posts: 19
One more thing I thought I should mention: for some reason, using a 720p source roughly doubles the frame render time compared to 480p, and the relationship seems to be linear as resolution increases. Should there really be such a pronounced increase in frame render time?
bigboyman is offline   Reply With Quote
Old 26th May 2019, 21:41   #56337  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,406
Yes, 960x720 has exactly twice the number of pixels that 720x480 does. This indicates that most of your performance cost comes from chroma upscaling, artifact removal, and/or image enhancements (the steps done before upscaling). Upscaling is also slower with larger input sizes, but it scales with output resolution as well, so it wouldn't be 2x slower.
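The arithmetic, spelled out (pixel counts only, no madVR internals):

```python
# Pixel-count arithmetic behind the "exactly twice" claim: the cost of
# every step done at the source resolution scales with input pixels.
pixels_480p = 720 * 480   # 345,600 input pixels
pixels_720p = 960 * 720   # 691,200 input pixels
ratio = pixels_720p / pixels_480p
print(ratio)  # 2.0
```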

Quote:
Originally Posted by bigboyman View Post
As far as I've seen, 80°C is the point at which the transition from 20ms to 60ms occurs. It doesn't ever really go above 85°C.
Not sure what the significance of that is, but it could be useful info.
This is the source of your performance issues.

Your laptop's cooling is not good enough, so your GPU downclocks a lot to keep temperatures down. Is it too dusty, or is a fan broken, or something? Have you checked clock speeds when this happens? You will need to keep temperatures at 80°C or below.
__________________
madVR options explained
Asmodian is offline   Reply With Quote
Old 27th May 2019, 00:41   #56338  |  Link
bigboyman
Registered User
 
Join Date: May 2019
Posts: 19
Quote:
Originally Posted by Asmodian View Post
...
This is the source of your performance issues.

Your laptop's cooling is not good enough, so your GPU downclocks a lot to keep temperatures down. Is it too dusty, or is a fan broken, or something? Have you checked clock speeds when this happens? You will need to keep temperatures at 80°C or below.
I'm trying a laptop cooler, but it doesn't do much.
It could be dusty, but I'd have to open it up and check.

Thing is, the NVIDIA website says the max operating temperature is around 100°C.
Not to mention that even when relatively idle, my GPU still hovers around 80°C, so it's not like madVR makes the temperature shoot up like crazy. Is the throttling threshold lower than advertised?
bigboyman is offline   Reply With Quote
Old 27th May 2019, 01:34   #56339  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
Just give us the GPU clock speed and we will know if it is throttling.

If you get 89°C at idle, your system isn't cooled properly.

Laptops usually do not use normal Maxwell cards but tweaked, far slower versions with a similar name, and it's unlikely they will allow 100°C; the GPU may survive that, but the rest of the hardware around it won't.

As far as I know, the throttling temperature for Maxwell was around 80°C, at which point it will at least no longer boost.
huhn is offline   Reply With Quote
Old 27th May 2019, 01:56   #56340  |  Link
bigboyman
Registered User
 
Join Date: May 2019
Posts: 19
Quote:
Originally Posted by huhn View Post
Just give us the GPU clock speed and we will know if it is throttling.

If you get 89°C at idle, your system isn't cooled properly.

Laptops usually do not use normal Maxwell cards but tweaked, far slower versions with a similar name, and it's unlikely they will allow 100°C; the GPU may survive that, but the rest of the hardware around it won't.

As far as I know, the throttling temperature for Maxwell was around 80°C, at which point it will at least no longer boost.
It goes from about 1200-900MHz down to 405MHz the instant I open a file in the player (unless it's the first time it has reached high temps since boot). Temperature doesn't even seem related to the clock change; it could be 83°C before and 78°C after, and it still does this. It will sit at that clock speed for as long as it feels like (even after closing the player) before going back up.
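For what it's worth, a trivial way to flag those samples in a clock log (the threshold is my own guess based on the clocks quoted above, not anything from NVIDIA):

```python
# Illustrative helper: flag samples whose SM clock sits far below the
# observed boost range. boost_min_mhz and factor are assumed values
# chosen from the 1200-900MHz vs 405MHz numbers described above.
def is_throttled(clock_mhz, boost_min_mhz=900, factor=0.5):
    """True when the clock is below half the low end of the boost range."""
    return clock_mhz < boost_min_mhz * factor

samples = [1190, 1050, 900, 405, 405]   # MHz, as reported in the post
flags = [is_throttled(c) for c in samples]
print(flags)  # [False, False, False, True, True]
```

A clock pinned at 405MHz regardless of temperature points at a power/driver limit state rather than plain thermal throttling, which matches what you're describing.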

Last edited by bigboyman; 27th May 2019 at 15:42.
bigboyman is offline   Reply With Quote
Tags
direct compute, dithering, error diffusion, madvr, ngu, nnedi3, quality, renderer, scaling, uhd upscaling, upsampling
