Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion. Before you start posting please read the forum rules. By posting to this forum you agree to abide by the rules. |
18th November 2018, 05:45 | #53681 | Link |
Registered User
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,407
|
I overclock a lot, on a 2080 Ti now, but ideally I undervolt for madVR. I don't need the full power for my ideal settings, so running lower volts/clocks saves power and heat. I have never damaged a video card by overclocking, but madVR is one of the most intensive things you can run if you're close to 100% GPU usage, so make sure cooling is sufficient.
100% usage can mean very different power draw with different loads; madVR is not FurMark, but it takes more power than most games.
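A low-effort way to cap power and heat without building a full undervolt curve is the driver's board power limit. A sketch, assuming an Nvidia card with the standard `nvidia-smi` tool on the PATH; the 200 W figure is a placeholder, not a recommendation:

```shell
# Query the current, default, and min/max supported power limits.
nvidia-smi -q -d POWER

# Cap the board power limit (needs admin rights; the value must fall
# within the min/max limits reported by the query above).
nvidia-smi -pl 200
```

This only clamps power, so clocks drop under heavy madVR loads; a proper undervolt via a voltage/frequency curve editor gives better performance at the same wattage.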
__________________
madVR options explained |
18th November 2018, 09:38 | #53683 | Link |
Registered User
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,407
|
True, madVR really likes a memory overclock too.
__________________
madVR options explained |
18th November 2018, 20:07 | #53685 | Link |
Registered User
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,407
|
madVR gains a lot of rendering performance from a memory overclock. Some workloads get faster with a memory overclock and some don't, but madVR gets a lot faster with memory speed, at least on Maxwell, Pascal, and Turing.
Edit: I think the benefit is the improved latency, not bandwidth, because the effect is similar on GPUs with vastly different amounts of memory bandwidth.
__________________
madVR options explained Last edited by Asmodian; 18th November 2018 at 21:35. |
18th November 2018, 21:26 | #53686 | Link |
Registered User
Join Date: Oct 2017
Posts: 331
|
I noticed I could use higher madVR settings when I overclocked my Nvidia card. I was using EVGA Precision, and my card is EVGA, FWIW. The problem is that the overclock settings do not survive a reboot, and I got sick of re-overclocking constantly. It also took a while to figure out why high madVR settings worked one day but not the next. What were they thinking when designing that software? What do you guys use to overclock Nvidia cards that survives a reboot? TIA.
__________________
HOW TO-Kodi 2D-3D-UHD (4k) HDR Guide Internal & External Players W11 Pro 24H2 GTX960-4GB RGB 4:4:4 @Matched Refresh Rates 8,10,12bit KODI 22 MPC-HC/BE 82" Q90R Denon S720W |
18th November 2018, 22:18 | #53687 | Link |
Registered Developer
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,346
|
MSI Afterburner has an option to load the overclock at startup; it works fine for me. I would be surprised if the EVGA tool didn't also have that option somewhere.
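If a vendor tool really has no startup option, a generic workaround is a logon task that applies a saved profile. A sketch, assuming MSI Afterburner at its default install path with the overclock saved as profile 1; the `-Profile1` switch and the path are assumptions, so verify them against your install before relying on this:

```shell
schtasks /Create /TN "ApplyGpuOC" /SC ONLOGON /RL HIGHEST ^
  /TR "\"C:\Program Files (x86)\MSI Afterburner\MSIAfterburner.exe\" -Profile1"
```

Running the task elevated (`/RL HIGHEST`) matters because applying clocks usually needs admin rights.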
__________________
LAV Filters - open source ffmpeg based media splitter and decoders |
18th November 2018, 23:37 | #53688 | Link |
Registered User
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,407
|
MSI Afterburner works on EVGA cards too, and I know it has a little button that applies the overclock after a reboot. I am almost certain EVGA Precision also has this option, but I like Afterburner a lot more.
Edit: I had forgotten to hit post; I agree with nevcairiel.
__________________
madVR options explained Last edited by Asmodian; 19th November 2018 at 01:14. |
19th November 2018, 02:08 | #53689 | Link |
Registered User
Join Date: Oct 2017
Posts: 331
|
Installed MSI AB and overclocked GPU and memory. Survives a reboot. Thanks, guys. Upped some madVR settings; so far, so good. I'll be perfectly honest: for me it always takes time to make sure upped settings work for a wide range of titles. Right now I've only upped 2160p and checked one of my larger rips, The Hunger Games. I went from chroma scaling AA med to NGU Sharp high and played for 10 minutes. It didn't miss a beat, and rendering time is within spec. I know the jump is drastic, but I had it at AA med to quiet down the CPU fan, not because it struggled to decode. Now the CPU fan is quiet with the higher setting. I guess clocking the GPU up takes some pressure off the CPU? I'll know in time whether I exceeded my settings with other rips. Hope not. Anyway, pretty happy. Thanks for the reco.
__________________
HOW TO-Kodi 2D-3D-UHD (4k) HDR Guide Internal & External Players W11 Pro 24H2 GTX960-4GB RGB 4:4:4 @Matched Refresh Rates 8,10,12bit KODI 22 MPC-HC/BE 82" Q90R Denon S720W |
19th November 2018, 05:21 | #53691 | Link |
Registered User
Join Date: May 2014
Posts: 8
|
I created a custom 2160p 23.976 refresh rate for my GTX 1080 using the madVR optimizer. It works great in MPC-HC. However, when I use this refresh rate with Kodi 18 or PowerDVD (I use these programs with some of my BD ISOs for full menu access), it stutters noticeably every few seconds. Any idea why that would be happening?
|
19th November 2018, 18:02 | #53692 | Link |
Registered User
Join Date: Dec 2014
Posts: 1,127
|
Post an image of the display modes menu in the control panel. Something could be amiss with your rules.
__________________
HOW TO - Set up madVR for Kodi DSPlayer & External Media Players |
19th November 2018, 18:16 | #53693 | Link | |
Registered User
Join Date: Sep 2012
Posts: 14
|
However, I have another question. I have thousands of both dropped and repeated frames and presentation glitches (the amount varies with the video), but I don't see any actual problems at all. Render times are OK. The refresh rate matches the video. Should I worry about that? There are no presentation glitches in exclusive mode, but I don't like it, so I guess it's fine to have thousands of presentation glitches if it doesn't affect my perception of the material in any way? As for dropped/repeated frames, there is an equal amount of them when I watch a 1920x1080 29.970 fps interlaced video. Also not affecting performance in any way. Last edited by COOLak; 19th November 2018 at 18:33. |
|
20th November 2018, 21:32 | #53695 | Link |
Registered User
Join Date: Oct 2008
Posts: 187
|
I have a problem with activating HDR via DisplayPort. This is with an LG 34BK95U (5120x2160), which requires DP 1.4 to get the full width. HDMI (which limits it to 3840 width) works fine.
Basically, it seems that madVR does not recognize that HDR is supported via DP, and thus always does the HDR-to-SDR conversion. (And I have checked that HDR can otherwise be activated via DP.) This is a heads-up to anyone who has this monitor or is thinking of getting it (note that the LG 34WK95U is exactly the same). I'll be filing a bug report after I gather some more data. EDIT: I should mention that I have an Nvidia card; AMD could be totally different. Last edited by jmonier; 20th November 2018 at 22:54. |
21st November 2018, 04:27 | #53696 | Link |
Registered User
Join Date: Jan 2014
Location: France
Posts: 76
|
IMadVRSettings SettingsGetBinary / SettingsSetBinary
Hi madshi,
We can call IMadVRSettings::SettingsGetBinary to read the "displaymodesdata" field (type: binary), but there is no SettingsSetBinary method to write it. Could you add one in the near future? Code:
Set methods:
STDMETHOD_(BOOL, SettingsSetString )(LPCWSTR path, LPCWSTR value) = 0;
STDMETHOD_(BOOL, SettingsSetInteger)(LPCWSTR path, int value) = 0;
STDMETHOD_(BOOL, SettingsSetBoolean)(LPCWSTR path, BOOL value) = 0;
Get methods:
STDMETHOD_(BOOL, SettingsGetString )(LPCWSTR path, LPCWSTR value, int* bufLenInChars) = 0;
STDMETHOD_(BOOL, SettingsGetInteger)(LPCWSTR path, int* value) = 0;
STDMETHOD_(BOOL, SettingsGetBoolean)(LPCWSTR path, BOOL* value) = 0;
STDMETHOD_(BOOL, SettingsGetBinary )(LPCWSTR path, LPVOID* value, int* bufLenInBytes) = 0; |
21st November 2018, 19:21 | #53697 | Link |
Registered User
Join Date: Dec 2017
Posts: 1
|
random questions are stupid mods remove them
Hi, how can I set a manual render device for madVR? Why does it automatically use the GPU that the player's monitor is connected to?
I have an FHD TV connected via HDMI to a GT 430, because my main GPU, an RX 580, has some issues running both my monitor and my TV, so I use the GT 430 for the TV. But when I use PotPlayer and move it to the TV, madVR renders on the GT 430 and drops lots of frames, since the GT 430 is far too weak a GPU. How can I make my main RX 580 render even when the player is on a TV/monitor connected to a different GPU (the GT 430)? I tried setting LAV Video Decoder to the RX 580 instead of automatic, but madVR still renders on the GT 430. Please, madVR developer, add an option that lets us select the GPU instead of always rendering on the GPU the screen is connected to. |
21st November 2018, 19:25 | #53698 | Link |
Registered Developer
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,346
|
Rendering on any GPU other than the one connected to the screen would be extremely bad for performance, and it's an extremely niche feature. Don't hold your breath.
__________________
LAV Filters - open source ffmpeg based media splitter and decoders |
22nd November 2018, 09:25 | #53699 | Link |
Registered User
Join Date: Oct 2012
Posts: 7,925
|
You can already use a different GPU than the one your screen is connected to for rendering; it's a built-in feature of Windows 10.
Go to Settings -> Display -> Graphics settings -> Browse and add your player, then open its options and select "High performance", which should be your AMD card. In my very limited experience this works great with Intel as the renderer and Nvidia as the presentation GPU, and totally terribly with Nvidia as the renderer and Intel as the presentation GPU. You may run into vsync issues like tearing, PCIe bandwidth limitations, A/V sync issues, and "stuff"... and you may need to edit the registry and completely disable the iGPU if present. |
Tags |
direct compute, dithering, error diffusion, madvr, ngu, nnedi3, quality, renderer, scaling, uhd upscaling, upsampling |