Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion.

Old 18th November 2018, 05:45   #53681  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,406
I overclock a lot, and I'm on a 2080 Ti now, but for madVR I actually undervolt. I don't need the full power for my preferred settings, so running lower voltages/clocks saves power and heat. I have never damaged a video card by overclocking, but madVR is one of the most intensive loads you can run if you're close to 100% GPU usage, so make sure your cooling is sufficient.

100% usage can mean very different power draws depending on the load; madVR is not FurMark, but it pulls more power than most games.
__________________
madVR options explained
Old 18th November 2018, 06:07   #53682  |  Link
ryrynz
Registered User
 
 
Join Date: Mar 2009
Posts: 3,646
Kinda silly not to overclock the memory at least. If you don't need the extra performance to unlock a particular option, then just leave things as they are, unless you're gaming as well.
Old 18th November 2018, 09:38   #53683  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,406
True, madVR really likes a memory overclock too.
__________________
madVR options explained
Old 18th November 2018, 12:58   #53684  |  Link
Siso
Soul Seeker
 
 
Join Date: Sep 2013
Posts: 711
Quote:
Originally Posted by Asmodian
True, madVR really likes a memory overclock too.
Why is that?
Old 18th November 2018, 20:07   #53685  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,406
madVR gains a lot of rendering performance from a memory overclock. Some workloads scale with memory speed and some don't, but madVR gets a lot faster, at least on Maxwell, Pascal, and Turing.

Edit: I think the benefit comes from improved latency rather than bandwidth, because the effect is similar on GPUs with vastly different amounts of memory bandwidth.
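For a rough feel of what a memory overclock changes on paper (whether the real win is bandwidth or latency): peak bandwidth scales linearly with the effective memory data rate. A small illustration using the 2080 Ti's published 14 Gbps / 352-bit figures; the 15 Gbps line is just an example overclock, not a recommendation:

```python
def peak_bandwidth_gbs(effective_gbps_per_pin: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: per-pin data rate (Gbit/s) times bus width in bytes."""
    return effective_gbps_per_pin * bus_width_bits / 8

# RTX 2080 Ti published figures: 14 Gbps GDDR6 on a 352-bit bus
print(peak_bandwidth_gbs(14.0, 352))  # 616.0 GB/s stock
print(peak_bandwidth_gbs(15.0, 352))  # 660.0 GB/s with a ~7% memory overclock
```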
__________________
madVR options explained

Last edited by Asmodian; 18th November 2018 at 21:35.
Old 18th November 2018, 21:26   #53686  |  Link
brazen1
Registered User
 
Join Date: Oct 2017
Posts: 331
I noticed I could use higher madVR settings once I overclocked my NVIDIA card. I was using EVGA Precision, and my card is an EVGA, fwiw. Problem is, the overclock settings do not survive a reboot, and I got sick of re-applying the overclock constantly. It also took a while to figure out why high madVR settings worked one day but not the next. What were they thinking when designing that software? What do you guys use to overclock NVIDIA cards that survives a reboot? TIA.
__________________
HOW TO-Kodi 2D-3D-UHD (4k) HDR Guide Internal & External Players
W11 Pro 24H2 GTX960-4GB RGB 4:4:4 @Matched Refresh Rates 8,10,12bit
KODI 22 MPC-HC/BE 82" Q90R Denon S720W
Old 18th November 2018, 22:18   #53687  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,342
MSI Afterburner has an option to re-apply the overclock at startup; works fine for me. I would be surprised if the EVGA tool didn't have that option somewhere too.
__________________
LAV Filters - open source ffmpeg based media splitter and decoders
Old 18th November 2018, 23:37   #53688  |  Link
Asmodian
Registered User
 
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,406
MSI Afterburner works on EVGA cards too, and it has a little button that applies the overclock after a reboot. I am almost certain EVGA Precision has this option as well, but I like Afterburner a lot more.

Edit: I had forgotten to hit post and agree with nevcairiel.
__________________
madVR options explained

Last edited by Asmodian; 19th November 2018 at 01:14.
Old 19th November 2018, 02:08   #53689  |  Link
brazen1
Registered User
 
Join Date: Oct 2017
Posts: 331
Installed MSI AB and overclocked the GPU and memory. It survives a reboot. Thanks, guys. Upped some madVR settings; so far, so good. I'll be perfectly honest: for me it always takes time to make sure raised settings work across a wide range of titles. Right now I've only upped 2160p and checked one of my larger rips, The Hunger Games. I went from chroma scaling AA med to NGU Sharp high and played for 10 minutes. It didn't miss a beat, and rendering time is within spec. I know the jump is drastic, but I had it at AA med to quiet down the CPU fan, not because the CPU struggled to decode. Now the CPU fan is quiet with the higher setting. I guess clocking the GPU up takes some pressure off the CPU? I'll know in time whether I've exceeded my settings with other rips. Hope not. Anyway, pretty happy. Thanks for the recommendation.
__________________
HOW TO-Kodi 2D-3D-UHD (4k) HDR Guide Internal & External Players
W11 Pro 24H2 GTX960-4GB RGB 4:4:4 @Matched Refresh Rates 8,10,12bit
KODI 22 MPC-HC/BE 82" Q90R Denon S720W
Old 19th November 2018, 03:33   #53690  |  Link
ryrynz
Registered User
 
 
Join Date: Mar 2009
Posts: 3,646
The CPU difference between those two should be negligible.
Also, NGU Sharp isn't exactly an upgrade for chroma; compare them first.

Last edited by ryrynz; 19th November 2018 at 03:35.
Old 19th November 2018, 05:21   #53691  |  Link
braddock
Registered User
 
Join Date: May 2014
Posts: 8
I created a custom 2160p 23.976 Hz refresh rate for my GTX 1080 using the madVR optimizer. It works great in MPC-HC. However, when I use this refresh rate with Kodi 18 or PowerDVD (I use those programs for some of my BD ISOs, for full menu access), playback stutters noticeably every few seconds. Any idea why that would be happening?
Old 19th November 2018, 18:02   #53692  |  Link
Warner306
Registered User
 
Join Date: Dec 2014
Posts: 1,127
Post an image of the display modes menu in the control panel. Something could be amiss with your rules.
Old 19th November 2018, 18:16   #53693  |  Link
COOLak
Registered User
 
Join Date: Sep 2012
Posts: 14
Quote:
Originally Posted by Warner306
Post an image of the display modes menu in the control panel. Something could be amiss with your rules.
Sorry, I figured it out myself, so I deleted that post. The wrong device was selected as my TV. Previously it wouldn't show the right device in the device list, but now it showed up for the first time. The refresh rate change feature works for me now.

--------

However, I have another question. I have thousands of both dropped and repeated frames, plus presentation glitches (the amount varies by video), but I don't see any actual problems at all. Render times are OK, and the refresh rate matches the video. Should I worry about that?
There are no presentation glitches in exclusive mode, but I don't like it, so I guess it's fine to have thousands of presentation glitches if it doesn't affect my perception of the material in any way?
As for dropped/repeated frames, there is an equal amount of them when I watch 1920x1080 29.970 fps interlaced video. That also doesn't affect performance in any way.
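One way to sanity-check those counters: when the display and video clocks are closely matched but not identical, one frame has to be repeated or dropped roughly every 1/|display Hz − video fps| seconds, so the counts grow steadily over a long session even when playback looks fine. A small illustration (the 23.971 Hz figure is an example measurement, not taken from this post):

```python
def seconds_between_glitches(video_fps: float, display_hz: float) -> float:
    """With closely matched but not identical clocks, one frame must be
    dropped or repeated roughly every 1/|display - video| seconds."""
    return 1.0 / abs(display_hz - video_fps)

# Example: 23.976 fps content on a display actually measured at 23.971 Hz
# -> one repeated frame roughly every 199 seconds
print(round(seconds_between_glitches(24000 / 1001, 23.971)))
```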

Last edited by COOLak; 19th November 2018 at 18:33.
Old 20th November 2018, 05:59   #53694  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
What do you mean by "render times are OK"?
Old 20th November 2018, 21:32   #53695  |  Link
jmonier
Registered User
 
Join Date: Oct 2008
Posts: 187
I have a problem activating HDR via DisplayPort. This is with an LG 34BK95U (5120x2160), which requires DP 1.4 to get the full width. HDMI (which limits it to 3840 wide) works fine.

Basically, it seems that madVR does not recognize that HDR is supported via DP, and thus always does the HDR-to-SDR conversion. (I have checked that HDR can otherwise be activated via DP.)

This is a heads-up to anyone who has, or is thinking of getting, this monitor (note that the LG 34WK95U is exactly the same).

I'll be generating a bug report after I gather some more data.

EDIT: I should mention that I have an NVIDIA card; AMD could be totally different.

Last edited by jmonier; 20th November 2018 at 22:54.
Old 21st November 2018, 04:27   #53696  |  Link
Olivier C.
Registered User
 
Join Date: Jan 2014
Location: France
Posts: 76
IMadVRSettings SettingsGetBinary / SettingsSetBinary

Hi madshi,

We can call IMadVRSettings::SettingsGetBinary to read the field "displaymodesdata" (type: binary), but there isn't any SettingsSetBinary method to write it.

Could you add one in the near future?

Code:
Set methods:
  STDMETHOD_(BOOL, SettingsSetString )(LPCWSTR path, LPCWSTR value) = 0;
  STDMETHOD_(BOOL, SettingsSetInteger)(LPCWSTR path, int     value) = 0;
  STDMETHOD_(BOOL, SettingsSetBoolean)(LPCWSTR path, BOOL    value) = 0;

Get methods:
  STDMETHOD_(BOOL, SettingsGetString )(LPCWSTR path, LPCWSTR value, int* bufLenInChars) = 0;
  STDMETHOD_(BOOL, SettingsGetInteger)(LPCWSTR path, int*    value) = 0;
  STDMETHOD_(BOOL, SettingsGetBoolean)(LPCWSTR path, BOOL*   value) = 0;
  STDMETHOD_(BOOL, SettingsGetBinary )(LPCWSTR path, LPVOID* value, int* bufLenInBytes) = 0;
Thanks a lot
Old 21st November 2018, 19:21   #53697  |  Link
yiandev
Registered User
 
Join Date: Dec 2017
Posts: 1
random questions are stupid mods remove them

Hi, how can I set the renderer device for madVR manually? Why does it automatically use the GPU that the monitor showing the player is connected to?
I have a FHD TV connected via HDMI to a GT 430, because my main GPU (an RX 580) has issues driving both my monitor and my TV, so I use the GT 430 for the TV. But when I use PotPlayer and move it to the TV, madVR renders on the GT 430, and it drops lots of frames since the GT 430 is far too weak a GPU.
How can I make my main RX 580 render even when the player is located on a TV/monitor connected to a different GPU (the GT 430)?
I tried setting LAV Video Decoder to the RX 580 instead of automatic, but madVR still renders on the GT 430.

Please, madVR developer, create an option to let us select the GPU instead of always rendering on the GPU the screen is connected to.
Old 21st November 2018, 19:25   #53698  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,342
Rendering on any GPU other than the one connected to the screen would be extremely bad for performance, and it's an extremely niche feature. Don't hold your breath.
__________________
LAV Filters - open source ffmpeg based media splitter and decoders
Old 22nd November 2018, 09:25   #53699  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,903
You can already use a different GPU for rendering than the one your screen is connected to; it's a built-in feature of Windows 10.

Go to Settings -> Display -> Graphics settings -> Browse, and add your player.

Then open its Options and select "High performance", which should be your AMD card.

In my very limited experience this works great with Intel as the renderer and NVIDIA as the presentation GPU, and totally terribly with NVIDIA as the renderer and Intel as the presentation GPU.

You may run into vsync issues like tearing, PCIe bandwidth limitations, A/V sync issues, and "stuff"...

And you may need to edit the registry and completely disable the iGPU if one is present.
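For reference, the Graphics settings page described above stores its per-app choice in the registry under HKCU\Software\Microsoft\DirectX\UserGpuPreferences, so the same preference can be expressed as a .reg fragment. The PotPlayer path below is only an example; adjust it to your player's actual executable. "GpuPreference=2;" means high performance, "GpuPreference=1;" power saving:

```reg
Windows Registry Editor Version 5.00

; Per-app GPU preference, the same setting the Graphics settings page writes.
; The value name is the full path to the player executable (example path).
[HKEY_CURRENT_USER\Software\Microsoft\DirectX\UserGpuPreferences]
"C:\\Program Files\\DAUM\\PotPlayer\\PotPlayerMini64.exe"="GpuPreference=2;"
```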
Old 22nd November 2018, 09:37   #53700  |  Link
svengun
Registered User
 
Join Date: Jan 2018
Location: Barcelona
Posts: 50
Quote:
Originally Posted by huhn
Driver and Windows version have nothing to do with that.

What is your end device, and is an AVR in between?
Why would it matter if there were an AVR in between? My Marantz SR6010 has a 4:4:4 UHD bypass; would there still be a signal or quality loss to be expected?
Tags
direct compute, dithering, error diffusion, madvr, ngu, nnedi3, quality, renderer, scaling, uhd upscaling, upsampling
