Old 1st July 2015, 07:48   #31441  |  Link
RyuzakiL
Registered User
 
Join Date: May 2015
Posts: 36
MadVR on Nvidia GTX 980ti vs AMD Fury X

I hope we get some madVR benchmarks (using max NNEDI3 settings)
from the guys who just bought said monsters, hehe.

Hmm... I guess the Fury X will be the better buy?
Old 1st July 2015, 08:44   #31442  |  Link
Arm3nian
Registered User
 
Join Date: Jul 2014
Location: Las Vegas
Posts: 177
Madshi, you should write a madVR benchmark tool like SVP has. Then all the users could upload their results to a public spreadsheet. It would help troubleshoot performance problems by allowing comparisons, and it would give a general idea of what to expect from a certain GPU/machine with different settings.
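The aggregation side could be as simple as something like this (purely a sketch - the CSV columns and file name here are made up for illustration, not from any real tool):

[CODE]import csv
from collections import defaultdict

# hypothetical user submissions: GPU model, madVR settings preset,
# and average render time in milliseconds
with open("madvr_results.csv", newline="") as f:
    rows = list(csv.DictReader(f))

# group all reports by (gpu, preset) so results are comparable
by_config = defaultdict(list)
for r in rows:
    by_config[(r["gpu"], r["preset"])].append(float(r["render_ms"]))

for (gpu, preset), times in sorted(by_config.items()):
    avg = sum(times) / len(times)
    print(f"{gpu:<22} {preset:<14} avg {avg:5.1f} ms over {len(times)} reports")[/CODE]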
Old 1st July 2015, 09:15   #31443  |  Link
j5627429
Registered User
 
Join Date: May 2010
Posts: 27
Quote:
Originally Posted by RyuzakiL View Post
I hope we get some madVR benchmarks (using max NNEDI3 settings)
from the guys who just bought said monsters, hehe.
I think many of us at this point are interested in getting the most sharpness and quality we can from 1080p>4K upscaling, since the alternative is simply to output 1080p to a 4K display and use the display's built-in upscaling (or, in my case, the pseudo-4K contrast/detail enhancements on my 1080p JVC RS57 projector).

Most people using madVR for this purpose are looking for a result that is substantially better than the native upscaling of the 4K display, and are willing to pay a bit more than the average HTPC user to get it, especially during this 2K-Bluray>4K-Bluray interim.

When you think about it, someone spending multiple thousands of dollars on a high-quality 4K TV or projector can easily fold the $650 of a 980 Ti into their purchase price, if using it with high-quality madVR settings gives them something perceptibly "around 80 percent of true 4K source material" on good-quality 1080p Blurays.

I watched the Oblivion Bluray (Lanczos max + AR for chroma and image upscaling, with SuperRes refinement) at 1080p with my projector's pseudo-4K processing set fairly aggressively, and thought it looked good. After watching the same movie via direct 4K input to my RS57 with modest madVR settings, I won't go back to letting the display upscale. The observable difference in detail resolution was jaw-dropping.

With the latest release (thanks, Madshi!) my 1080p>4K performance has magically increased and I am able to do image doubling/quadrupling on my AMD 7970, but I am hungry for more!

Regarding Fury X vs. 980 Ti, it seems like a no-brainer. When spending $650 on a video card today, you probably want it to be HDMI 2.0/HDCP 2.2 compatible to ensure it is future-proof. The GTX 960 and 980 Ti are; the Fury X isn't. From what I've read recently, AMD's Fury is no different in 4K output format than my 3-year-old 7970.

Personally, I'm trying to decide between the 960 and the 980 Ti. Although the 980 Ti does not have hardware H.265 decoding for when we get 4K content, if it allows significantly higher-quality settings for 1080p>4K without any stuttering/glitches in playback, that is a sacrifice I'd be willing to make.

So yeah, this was an excessively long post to say "I'm also interested in others' madVR results with a 980 Ti on 4K screens!"

PS:
I have not posted here in years, and wanted to say kudos to Madshi and all the regulars here for all the hard work and perseverance that has continued to make this awesome project a reality!
PPS:
If there is another thread where 1080p>4K upscaling using HQ madVR settings with the latest high-end video cards is being discussed, could someone please point the way? Thanks.

Last edited by j5627429; 1st July 2015 at 09:59.
Old 1st July 2015, 09:51   #31444  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 7,920
Only HDMI 2.0 can do 4:2:0.

The older HDMI 1.4 spec can do UHD 4:4:4 and RGB up to 30 Hz, and, if I'm not mistaken, at more than 8 bit.
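rough numbers if you want to sanity-check that (a quick sketch; 4400x2250 are the standard CTA-861 UHD frame totals, and ~340/600 MHz are the HDMI 1.4/2.0 TMDS clock limits):

[CODE]def tmds_clock_mhz(h_total, v_total, hz, bits=8, chroma="4:4:4"):
    clk = h_total * v_total * hz / 1e6  # base pixel clock in MHz
    clk *= bits / 8.0                   # deep color raises the clock
    if chroma == "4:2:0":
        clk *= 0.5                      # 4:2:0 halves the data on the link
    return clk

for hz, chroma in [(30, "4:4:4"), (60, "4:4:4"), (60, "4:2:0")]:
    clk = tmds_clock_mhz(4400, 2250, hz, chroma=chroma)
    limit = "fits HDMI 1.4 (<=340 MHz)" if clk <= 340 else "needs HDMI 2.0 (<=600 MHz)"
    print(f"UHD {hz} Hz {chroma}: ~{clk:.0f} MHz -> {limit}")[/CODE]

that last line is also why a 1.4-speed link can physically carry 4K60 4:2:0 - the data rate is no higher than 4K30 4:4:4.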
Old 1st July 2015, 10:38   #31445  |  Link
j5627429
Registered User
 
Join Date: May 2010
Posts: 27
Quote:
Originally Posted by huhn View Post
Only HDMI 2.0 can do 4:2:0.

The older HDMI 1.4 spec can do UHD 4:4:4 and RGB up to 30 Hz, and, if I'm not mistaken, at more than 8 bit.
I think you're right. Maybe it is only the 30 Hz limitation and the lack of HDCP 2.2 that are the AMD deal-killers for future content, unless a DisplayPort>HDMI 2.0 converter cable can fix everything and allow full output of the new 4K Bluray specs in refresh rate, resolution, and color.

With my 7970 and projector setup, the AMD driver only lets me use RGB 4:4:4 and 8-bit color over HDMI at UHD resolution. At 1080p, the options for 10-bit and 12-bit are available; at UHD, the higher bit depths are greyed out. The 8-bit limitation could just be my own display's 4K input limitation, though.
Old 1st July 2015, 11:36   #31446  |  Link
RyuzakiL
Registered User
 
Join Date: May 2015
Posts: 36
@j5627429

Hmm... if I'm not mistaken, with a powerful processor you can count on software H.265 decoding, so I think the 980 Ti is the best choice.

I'm leaning toward the Fury X because of its sheer number of shader processors (good for compute performance), but unfortunately we all know that AMD's drivers are crappy for madVR (they can't run NNEDI3 properly even with the interop hack).

But I'm not going to lose hope for AMD. 2016 is coming, and with it the 14/16 nm goodness for AMD and also NVIDIA. So I'm going to hold onto my 2x HD 7850s; super-xbr seems to be evolving with every madVR release, so I bet @Madshi can squeeze further performance and quality out of it.

P.S. 980 Ti vs. Fury X on madVR still drives my curiosity.

Yeah, I vote for a dedicated madVR benchmark so everyone can gauge how much quality/performance they can get from a particular GPU; it might even help others with purchase decisions.
Old 1st July 2015, 12:47   #31447  |  Link
XMonarchY
Guest
 
Posts: n/a
Quote:
Originally Posted by oddball View Post
Firstly, turn off any overclocking to rule that out as an issue (I noticed you have your 980 OC'd). DVI or DisplayPort? What material are you trying to play? FPS and resolution? Did you reset madVR to defaults and retry? If none of that works, try forcing your monitor to 60 Hz and retry. Also check your drivers and driver settings.
OC or no OC makes no difference. I use DisplayPort, as it is the only way to get 120 Hz on this monitor AFAIK. Content resolution: anything from 720p to 1080p, and it happens regardless of whether doubling/quadrupling is enabled. The refresh rate stays at 120 Hz. I very much did reset madVR to defaults, then uninstalled it, then reset defaults again, then installed it, then reset defaults again, and restarted in between those steps.

I do not want 60Hz, I want 120Hz...

Last edited by XMonarchY; 1st July 2015 at 12:50.
Old 1st July 2015, 12:49   #31448  |  Link
XMonarchY
Guest
 
Posts: n/a
Quote:
Originally Posted by QBhd View Post
Is it the same GPU powering both screens? I can't see any reason why this would happen; there must be something else going on in your system. Are you sure you are using a cable (HDMI, DVI, DisplayPort) that supports 1080p120 speeds?

QB
Yes, but I turn off one of the screens when I use the other; my GPU is always in single-display mode. I use DisplayPort, as it is the only way to get 120 Hz on this monitor AFAIK. Nothing else is going on - Task Manager is clean of any 3rd-party services/processes during video playback. I do not use the Steam overlay, and I shut down Steam and ALL other apps before playback. I use the latest MPC-HC x86, the latest LAV decoders/filters, the latest ReClock, and the latest madVR.

Last edited by XMonarchY; 1st July 2015 at 12:52.
Old 1st July 2015, 12:53   #31449  |  Link
yukinok25
Registered User
 
Join Date: May 2015
Posts: 17
Just wanted to say that the latest version is absolutely astonishing!
The image quality has improved visibly and I am using the same settings as always, with no issues whatsoever.

Madshi, is there a way I can donate something to this project? Do you need or accept donations?
Old 1st July 2015, 12:53   #31450  |  Link
XMonarchY
Guest
 
Posts: n/a
Quote:
Originally Posted by huhn View Post
Only HDMI 2.0 can do 4:2:0.

The older HDMI 1.4 spec can do UHD 4:4:4 and RGB up to 30 Hz, and, if I'm not mistaken, at more than 8 bit.
Does 4:2:0 require more bandwidth? Since 4:4:4 is the best option AFAIK, 4:2:0 requiring HDMI 2.0 is kind of backwards, isn't it?
Old 1st July 2015, 13:10   #31451  |  Link
James Freeman
Registered User
 
Join Date: Sep 2013
Posts: 919
Quote:
Originally Posted by XMonarchY View Post
Does 4:2:0 require more bandwidth? Since 4:4:4 is the best option AFAIK, 4:2:0 requiring HDMI 2.0 is kind of backwards, isn't it?
No, it does not require more bandwidth.
It is kind of backwards...

HDMI 2.0 is required for 4:2:0.
HDMI 1.4 does not support 4:2:0 directly; the source (DVD/BR/PC etc.) will send 4:2:2 or 4:4:4 with the 4:2:0 data in it (it just copies the missing data from the adjacent pixel).

https://en.wikipedia.org/wiki/HDMI#Version_comparison

But if you are using madVR from a PC you needn't care anyway; you should be using RGB.
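The duplication is roughly this (a quick NumPy sketch of nearest-neighbor chroma duplication, not any player's actual code; plane shapes are assumed):

[CODE]import numpy as np

def chroma_420_to_444_nearest(y, u, v):
    """Fake 4:4:4 from 4:2:0 by copying each chroma sample to a 2x2 pixel block.

    y: (H, W) luma plane; u, v: (H/2, W/2) subsampled chroma planes.
    """
    u_full = np.repeat(np.repeat(u, 2, axis=0), 2, axis=1)
    v_full = np.repeat(np.repeat(v, 2, axis=0), 2, axis=1)
    return y, u_full, v_full

# example: a 4x4 luma plane with 2x2 chroma planes
y = np.zeros((4, 4), dtype=np.uint8)
u = np.array([[100, 110], [120, 130]], dtype=np.uint8)
v = np.array([[ 90,  95], [ 85,  80]], dtype=np.uint8)
_, u444, v444 = chroma_420_to_444_nearest(y, u, v)
print(u444)  # each chroma value now covers a 2x2 block of pixels[/CODE]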
__________________
System: i7 3770K, GTX660, Win7 64bit, Panasonic ST60, Dell U2410.

Last edited by James Freeman; 1st July 2015 at 13:17.
Old 1st July 2015, 13:24   #31452  |  Link
Anime Viewer
Troubleshooter
 
Join Date: Feb 2014
Posts: 339
Quote:
Originally Posted by Arm3nian View Post
Madshi, you should write a madVR benchmark tool like SVP has. Then all the users could upload their results to a public spreadsheet. It would help troubleshoot performance problems by allowing comparisons, and it would give a general idea of what to expect from a certain GPU/machine with different settings.
I think a benchmark program would be a waste of time. It would tell you speed under certain conditions, but with so many options/variables available in madVR, no two people are likely to be comparing under the exact same settings. Additionally, some systems would be able to support some settings but not others, and a lot of systems would likely either crash themselves or crash the benchmark program trying to run settings that are incompatible or too high.

madVR is mainly about getting the best image quality while not straining the system from the power/performance/longevity-of-components side. A benchmark program isn't going to tell us what looks best on our systems - which is what we've been testing and reporting under controlled conditions all this time.
__________________
System specs: Sager NP9150 SE with i7-3630QM 2.40GHz, 16 GB RAM, 64-bit Windows 10 Pro, NVidia GTX 680M/Intel 4000 HD optimus dual GPU system. Video viewed on LG notebook screen and LG 3D passive TV.
Old 1st July 2015, 13:35   #31453  |  Link
James Freeman
Registered User
 
Join Date: Sep 2013
Posts: 919
Why do you need a benchmark program? As others have said, settings are an individual thing per user.

Just keep the rendering time under the frame time:
1 / refresh rate = frame time = the maximum rendering time before frames get dropped.
__________________
System: i7 3770K, GTX660, Win7 64bit, Panasonic ST60, Dell U2410.

Last edited by James Freeman; 1st July 2015 at 13:38.
Old 1st July 2015, 13:39   #31454  |  Link
QBhd
QB the Slayer
 
Join Date: Feb 2011
Location: Toronto
Posts: 697
Quote:
Originally Posted by Anime Viewer View Post
madVR is mainly about getting the best image quality while not straining the system from the power/performance/longevity-of-components side.
I push my system to the max with everything I watch. It's there to be used, I don't pay for electricity, and I get the best PQ possible from my system. So the above statement is not very accurate, since I'm sure I'm not the only one squeezing out every drop of performance.

And yes a benchmark is pretty much useless.

QB
Old 1st July 2015, 13:51   #31455  |  Link
James Freeman
Registered User
 
Join Date: Sep 2013
Posts: 919
Quote:
Originally Posted by QBhd View Post
I don't pay for electricity
Care to elaborate?...

The difference between a 20 W system and a 1000 W system is VERY significant.
__________________
System: i7 3770K, GTX660, Win7 64bit, Panasonic ST60, Dell U2410.
Old 1st July 2015, 14:07   #31456  |  Link
noee
Registered User
 
Join Date: Jan 2007
Posts: 530
Quote:
Originally Posted by QBhd View Post
I don't pay for electricity...
QB
Yeah, I'd like to know this too. Even my solar/battery setup costs me in terms of battery/panel maintenance and replacement, never mind the losses in the DC-AC inverter. Proper batteries and controller(s) are not cheap.

What I've found with madVR is that I can use the CPU to decode all sources and max out my GPU(s) for the PQ; this seems to give reasonable power draw with fantastic PQ.
__________________
Win7Ult || RX560/4G || Ryzen 5
Old 1st July 2015, 14:14   #31457  |  Link
Anime Viewer
Troubleshooter
 
Join Date: Feb 2014
Posts: 339
Quote:
Originally Posted by QBhd View Post
I push my system to the max with everything I watch. It's there to be used, I don't pay for electricity, and I get the best PQ possible from my system.

QB
If you're running your GPU at full load with the fan(s) at full bore and render times on the edge of going over your vsync interval, then you're likely not getting the best picture quality (at least not all the time). Just because you're using a performance-draining setting doesn't mean you'll get the best quality. Different options give different image output: while a video of real-life people/environments may look good with Jinc, other content (with drawn lines) may look better with Mitchell-Netravali, and some of the more taxing settings can produce artifacts, ringing, or aliasing. The second a scene gets more taxing than what your system can normally handle (be it lighting effects, panning scenes, fancy subtitles, etc.), you'll get a dropped frame or a presentation glitch, and your video will no longer be smooth and fluid.
Running your system at full throttle is also going to wear your hardware components faster than running them under control, so your components'/system's lifespan will be shorter.
__________________
System specs: Sager NP9150 SE with i7-3630QM 2.40GHz, 16 GB RAM, 64-bit Windows 10 Pro, NVidia GTX 680M/Intel 4000 HD optimus dual GPU system. Video viewed on LG notebook screen and LG 3D passive TV.
Old 1st July 2015, 14:15   #31458  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,346
Quote:
Originally Posted by XMonarchY View Post
Does 4:2:0 require more bandwidth? Since 4:4:4 is the best option AFAIK, 4:2:0 requiring HDMI 2.0 is kind of backwards, isn't it?
It's not backwards. It doesn't require more bandwidth; previous versions of HDMI just didn't specify 4:2:0 support. HDMI 2.0 does specify it, so HDMI 2.0 is required.

However, because it doesn't need more bandwidth, NVIDIA GPUs can do this on HDMI 1.4 transmitters - assuming the display supports HDMI 2.0 and understands the signal.

You have to make a distinction between the HDMI transmitter speed and the protocol version. An HDMI 1.4 transmitter can be upgraded via software to get HDMI 2.0 features like 4:2:0, but software cannot add more bandwidth.
__________________
LAV Filters - open source ffmpeg based media splitter and decoders
Old 1st July 2015, 14:33   #31459  |  Link
j5627429
Registered User
 
Join Date: May 2010
Posts: 27
Quote:
Originally Posted by noee View Post
max out my GPU(s) for the PQ
Does madVR actually take advantage of CrossFire/SLI?
Old 1st July 2015, 14:35   #31460  |  Link
noee
Registered User
 
Join Date: Jan 2007
Posts: 530
Quote:
Originally Posted by j5627429 View Post
Does madVR actually take advantage of CrossFire/SLI?
Sorry, I meant different GPUs in different machines... I have no experience with madVR in either CrossFire or SLI.
__________________
Win7Ult || RX560/4G || Ryzen 5