Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion. Before you start posting please read the forum rules. By posting to this forum you agree to abide by the rules. |
1st July 2015, 08:44 | #31442 | Link |
Registered User
Join Date: Jul 2014
Location: Las Vegas
Posts: 177
|
Madshi, you should write a madVR benchmark tool like SVP has. Then all users could upload their results to a public spreadsheet. It would help troubleshoot performance problems by allowing comparisons, and it would give a general idea of what to expect from a certain GPU/machine with different settings.
|
1st July 2015, 09:15 | #31443 | Link | |
Registered User
Join Date: May 2010
Posts: 27
|
Quote:
Most people using madVR for this purpose are looking for a result that is substantially better than the native upscaling of the 4K display, and are willing to pay a bit more than the average HTPC user to get it, especially during this 2K Blu-ray > 4K Blu-ray interim. When you think about it, someone spending multiple thousands of dollars on a high-quality 4K TV or projector can easily include the $650 of a 980 Ti in their purchase price, if using it with high-quality madVR settings gives them perceptibly "around 80 percent of true 4K source material" on good 1080p Blu-rays.

I watched the Oblivion Blu-ray (Lanczos max + AR for chroma and image upscaling, with SuperRes refinement) at 1080p with my projector's pseudo-4K processing set fairly aggressively, and thought it looked good. After watching the same movie via direct 4K input to my RS57 using modest madVR settings, I won't go back to letting the display upscale. The observable difference in detail resolution was jaw-dropping. With the latest release (thanks, Madshi!) my 1080p > 4K performance has magically increased and I am able to do image doubling/quadrupling on my AMD 7970, but I am hungry for more!

Regarding Fury X vs. 980 Ti, it seems like a no-brainer. When spending $650 on a video card today, you probably want it to be HDMI 2.0/HDCP 2.2 compatible to ensure it is future-proof. The GTX 960 and 980 Ti are; the Fury X isn't. From what I've read recently, AMD's Fury is no different in 4K output format than my three-year-old 7970.

Personally, I'm trying to decide between the 960 and the 980 Ti. Although the 980 Ti does not have hardware H.265 decode for when we get 4K content, if it allows significantly higher quality settings for 1080p > 4K without any stuttering/glitches in playback, that is a sacrifice I'd be willing to make. So yeah, this was an excessively long post to say "I'm also interested in others' madVR results with the 980 Ti on 4K screens!"
PS: I have not posted here in years, and wanted to say kudos to Madshi and all the regulars here for the hard work and perseverance that has continued to make this awesome project a reality! PPS: If there is another thread where 1080p > 4K upscaling using high-quality madVR settings with the latest high-end video cards is being discussed, could someone please point the way? Thanks. Last edited by j5627429; 1st July 2015 at 09:59. |
|
1st July 2015, 10:38 | #31445 | Link | |
Registered User
Join Date: May 2010
Posts: 27
|
Quote:
With my 7970 and projector setup, the AMD driver only lets me use RGB 4:4:4 and 8-bit color over HDMI at UHD resolution. At 1080p, the options for 10-bit and 12-bit are available, but at UHD the higher bit depths are greyed out. The 8-bit limitation could just be my own display's 4K input limitation, though. |
|
1st July 2015, 11:36 | #31446 | Link |
Registered User
Join Date: May 2015
Posts: 36
|
@j5627429
Hm... If I'm not mistaken, with a powerful processor you can count on software H.265 decode, so I think the 980 Ti is the best choice. I'm leaning toward the Fury X because of the sheer number of shader processors (good for compute performance), but unfortunately we all know AMD's drivers are crappy for madVR (they can't run NNEDI3 properly even with the interop hack). But I'm not going to lose hope for AMD: 2016 is coming, and with it the 14/16nm goodness for AMD and also NVIDIA. So I'm going to hold onto my 2x HD 7850. super-xbr seems to be evolving with every madVR release, so I bet @Madshi can squeeze further performance and quality out of it.

P.S. 980 Ti or Fury X on madVR still drives my curiosity. Yeah, I vote for an exclusive madVR benchmark so everyone can gauge how much quality/performance they can get from a particular GPU; it might even help others with purchase decisions. |
1st July 2015, 12:47 | #31447 | Link | |
Guest
Posts: n/a
|
Quote:
I do not want 60Hz, I want 120Hz... Last edited by XMonarchY; 1st July 2015 at 12:50. |
|
1st July 2015, 12:49 | #31448 | Link |
Guest
Posts: n/a
|
Yes, but I turn off one of the screens when I use another. My GPU is always in single-display mode. I use DisplayPort as it is the only way to get 120Hz on this monitor AFAIK. Nothing else going on - Task Manager is clean of any 3rd party services/processes during video playback. I do not use overlay in Steam and I shut down Steam and ALL other apps before playback. I use the latest MPC-HC x86, the latest LAV decoders/filters, the latest ReClock, and the latest madVR.
Last edited by XMonarchY; 1st July 2015 at 12:52. |
1st July 2015, 12:53 | #31449 | Link |
Registered User
Join Date: May 2015
Posts: 17
|
Just wanted to say that the latest version is absolutely astonishing!
The image quality has improved visibly and I am using the same settings as always, no issues whatsoever. Madshi, is there a way I can donate something to this project? Do you need or accept donations? |
1st July 2015, 13:10 | #31451 | Link | |
Registered User
Join Date: Sep 2013
Posts: 919
|
Quote:
It is kind of backwards... HDMI 2.0 is required for native 4:2:0. HDMI 1.4 does not support 4:2:0 directly; the source (DVD/BD player, PC, etc.) will send 4:2:2 or 4:4:4 with the 4:2:0 data in it (it just copies the missing data from the adjacent pixel). https://en.wikipedia.org/wiki/HDMI#Version_comparison But if you are using madVR from a PC you couldn't care less; you should be using RGB anyway.
__________________
System: i7 3770K, GTX660, Win7 64bit, Panasonic ST60, Dell U2410. Last edited by James Freeman; 1st July 2015 at 13:17. |
|
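To illustrate the "copies the missing data from the adjacent pixel" part of the post above, here is a minimal Python sketch (my own illustration, not anything madVR or HDMI hardware actually runs) of how subsampled 4:2:0 chroma can be expanded into a 4:4:4 container by duplicating each chroma sample across its 2x2 block of luma pixels. The function name and nested-list representation are hypothetical.

```python
# Illustrative sketch: 4:2:0 chroma riding in a 4:4:4 container by
# duplicating each chroma sample to the neighbouring pixels.

def expand_420_to_444(y, u, v):
    """y: H x W luma plane; u, v: (H/2) x (W/2) chroma planes (nested lists).
    Returns an H x W grid of (Y, U, V) tuples, with each 2x2 pixel block
    sharing the single chroma sample that 4:2:0 stores for it."""
    h, w = len(y), len(y[0])
    out = []
    for r in range(h):
        row = []
        for c in range(w):
            # integer division maps each luma pixel to its 4:2:0 chroma sample
            row.append((y[r][c], u[r // 2][c // 2], v[r // 2][c // 2]))
        out.append(row)
    return out

# A 2x2 image carries only one chroma pair in 4:2:0; after expansion,
# all four pixels share it.
pixels = expand_420_to_444([[1, 2], [3, 4]], [[10]], [[20]])
print(pixels)
```

No new information is transmitted this way, which is the poster's point: the 4:4:4 signal merely carries the 4:2:0 data redundantly.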
1st July 2015, 13:24 | #31452 | Link | |
Troubleshooter
Join Date: Feb 2014
Posts: 339
|
Quote:
madVR is mainly about getting the best image quality while not straining the system on the power/performance/longevity side. A benchmark program isn't going to tell us what looks best on our systems, which is what we've been testing and reporting under controlled conditions all this time.
__________________
System specs: Sager NP9150 SE with i7-3630QM 2.40GHz, 16 GB RAM, 64-bit Windows 10 Pro, NVidia GTX 680M/Intel 4000 HD optimus dual GPU system. Video viewed on LG notebook screen and LG 3D passive TV. |
|
1st July 2015, 13:35 | #31453 | Link |
Registered User
Join Date: Sep 2013
Posts: 919
|
Why do you need a benchmark program? As stated by others, settings are an individual thing per user.
Just keep the rendering time under the frame time: 1 / frame rate = maximum rendering time before frames start dropping.
__________________
System: i7 3770K, GTX660, Win7 64bit, Panasonic ST60, Dell U2410. Last edited by James Freeman; 1st July 2015 at 13:38. |
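The rule of thumb above is simple arithmetic; here is a quick sketch of it (the function name is my own, the numbers follow directly from 1 / frame rate):

```python
# Rendering budget per frame: madVR starts dropping frames once the average
# rendering time exceeds one frame interval, i.e. 1 / frame rate.

def max_render_ms(fps):
    """Maximum average rendering time per frame, in milliseconds."""
    return 1000.0 / fps

for fps in (23.976, 24.0, 25.0, 59.94):
    print(f"{fps:g} fps -> {max_render_ms(fps):.2f} ms budget")
```

So for a typical 23.976 fps Blu-ray the rendering-time readout in madVR's OSD needs to stay under roughly 41.7 ms, while 59.94 fps material leaves only about 16.7 ms.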
1st July 2015, 13:39 | #31454 | Link | |
QB the Slayer
Join Date: Feb 2011
Location: Toronto
Posts: 697
|
Quote:
And yes a benchmark is pretty much useless. QB
__________________
|
|
1st July 2015, 14:07 | #31456 | Link |
Registered User
Join Date: Jan 2007
Posts: 530
|
Yeah, I'd like to know this too. Even my solar/battery setup costs me in terms of battery/panel maintenance and replacement, never mind losses in the DC-AC inverter. Proper batteries and controller(s) are not cheap.
What I've found with madVR is that I can use the CPU for decoding all sources and max out my GPU(s) for the PQ; this seems to give reasonable power draw with fantastic PQ.
__________________
Win7Ult || RX560/4G || Ryzen 5 |
1st July 2015, 14:14 | #31457 | Link | |
Troubleshooter
Join Date: Feb 2014
Posts: 339
|
Quote:
Running your system at full throttle is also going to wear your hardware components faster than running things under control, so your components'/system's lifespan will be shorter.
__________________
System specs: Sager NP9150 SE with i7-3630QM 2.40GHz, 16 GB RAM, 64-bit Windows 10 Pro, NVidia GTX 680M/Intel 4000 HD optimus dual GPU system. Video viewed on LG notebook screen and LG 3D passive TV. |
|
1st July 2015, 14:15 | #31458 | Link | |
Registered Developer
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,347
|
Quote:
However, because it doesn't need more bandwidth, NVIDIA GPUs can do this on HDMI 1.4 transmitters - assuming the display supports HDMI 2.0 and understands the signal. You have to make a distinction between the HDMI transmitter speed and the protocol version. An HDMI 1.4 transmitter can be upgraded via software to get HDMI 2.0 features like 4:2:0... but software cannot add more bandwidth.
__________________
LAV Filters - open source ffmpeg based media splitter and decoders |
|
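The "doesn't need more bandwidth" claim in the post above can be checked with back-of-the-envelope arithmetic. This is my own rough sketch counting active pixels only (blanking intervals ignored), and the 8.16 Gbit/s figure is the approximate usable video data rate of an HDMI 1.4 link (340 MHz TMDS clock, 3 channels, 8 data bits per 10-bit character), so treat the exact margins as assumptions:

```python
# Rough check: 4K60 at 4:2:0 fits under an HDMI 1.4-class data rate,
# which is why only a protocol upgrade (not more bandwidth) is needed.
# Active pixels only; blanking is ignored, so real figures run higher.

def gbit_per_s(width, height, fps, bits_per_pixel):
    return width * height * fps * bits_per_pixel / 1e9

uhd_420 = gbit_per_s(3840, 2160, 60, 12)  # 4:2:0 averages 12 bits/pixel at 8-bit
uhd_444 = gbit_per_s(3840, 2160, 60, 24)  # 4:4:4 / RGB needs 24 bits/pixel

HDMI14_DATA_RATE = 8.16  # Gbit/s, approximate HDMI 1.4 video data rate

print(f"4K60 4:2:0: {uhd_420:.2f} Gbit/s (under {HDMI14_DATA_RATE} Gbit/s)")
print(f"4K60 4:4:4: {uhd_444:.2f} Gbit/s (over  {HDMI14_DATA_RATE} Gbit/s)")
```

The 4:4:4 case needing roughly twice the data rate is what HDMI 2.0's faster transmitters exist for; the 4:2:0 case fits in the old transmitter's bandwidth, matching madshi's point that only the protocol side needs updating.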
|
|