Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion. Before you start posting please read the forum rules. By posting to this forum you agree to abide by the rules.
1st July 2015, 14:52 | #31461 | Link
Registered User
Join Date: Dec 2008
Posts: 496
Quote:
So it depends on what you want. If you are fine with 4K at a maximum of 30fps, the Fury X should be just fine for you; it really depends on your use cases. We would need a comparison between the Fury X and a 980 Ti when we're talking madVR, since game benchmarks can be misleading. I am not entirely sure about the 980 Ti's HDCP 2.2 support, since several sites claimed that only the GTX 960 supports both HDMI 2.0 and HDCP 2.2. Last edited by iSunrise; 1st July 2015 at 14:56.
1st July 2015, 16:24 | #31462 | Link
Registered User
Join Date: Dec 2011
Posts: 1,812
In benchmarks, the Fury X oddly seems to be severely bandwidth limited: raising the core clock hardly scales at all, but raising the HBM clock gives a significant boost.
There's currently also the beeping noise that annoys many people. With NNEDI3 the card is probably also limited by the interop copyback in the Catalyst driver. Well, its VPU is superior to GM200, but there are a whole lot of drawbacks otherwise.
1st July 2015, 16:42 | #31463 | Link
QB the Slayer
Join Date: Feb 2011
Location: Toronto
Posts: 697
Quote:
Quote:
It's simple: electricity is included in my rent, so I can use as much as I want.
Quote:
As to the longevity of components... Computers are made to be used, and periodic bursts of 90%+ usage are not going to shorten their lifespan by any significant amount. Turning your computer off after using it will shorten its life far quicker than pushing it to its limits. In 15+ years of running my PCs 24/7 (while gaming and encoding, and now using madVR) I have had just one component fail, and that was a very old motherboard whose old-style capacitors finally blew up due to extreme old age. QB
__________________
1st July 2015, 16:49 | #31464 | Link
Registered User
Join Date: Aug 2008
Posts: 343
Quote:
That fits my idea too. There are so many different videos that can break our settings and play with stuttering because our GPU is too slow. What about a benchmark, and then adaptive scaling settings based on that benchmark? Let the graphics card use the best scaling algorithm it can, depending on its power. Let's avoid situations where a 4K or 60fps video becomes unwatchable. If madVR automatically changed to faster scaling, the user would be happy anyway. This could make madVR more user friendly than ever.
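The adaptive idea could work roughly like the sketch below (hypothetical Python; the scaler names and timing figures are only illustrative, and madVR exposes no such API):

```python
# Hypothetical sketch of "adaptive scaling": given benchmarked per-frame
# rendering times for each scaler (best quality first), pick the
# highest-quality scaler that still fits the frame budget.

SCALERS = ["NNEDI3 64 neurons", "Jinc", "Lanczos", "Bilinear"]

def pick_scaler(render_times_ms, fps, headroom=0.8):
    """Return the highest-quality scaler whose measured rendering time
    fits into the frame interval, with some headroom for spikes."""
    frame_budget_ms = 1000.0 / fps * headroom
    for scaler, t in zip(SCALERS, render_times_ms):
        if t <= frame_budget_ms:
            return scaler
    return SCALERS[-1]  # nothing fits: fall back to the cheapest scaler

# Illustrative benchmark results: NNEDI3 30 ms/frame, Jinc 12 ms, ...
print(pick_scaler([30.0, 12.0, 6.0, 1.5], fps=60))      # Jinc
print(pick_scaler([30.0, 12.0, 6.0, 1.5], fps=23.976))  # NNEDI3 64 neurons
```

The same timings pick different scalers at different frame rates, which is exactly the "4K or 60fps suddenly stutters" situation described above.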
1st July 2015, 17:32 | #31467 | Link
Registered User
Join Date: Jun 2015
Posts: 3
Quote:
1st July 2015, 17:44 | #31468 | Link
Registered User
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,407
No, you are correct.
__________________
madVR options explained
1st July 2015, 17:51 | #31469 | Link
Registered User
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,407
madVR hates CrossFire/SLI; it runs slower with it enabled.
__________________
madVR options explained
1st July 2015, 18:56 | #31470 | Link
Guest
Posts: n/a
Quote:
Last edited by XMonarchY; 1st July 2015 at 19:06.
1st July 2015, 19:06 | #31471 | Link
Registered User
Join Date: Oct 2012
Posts: 7,923
Quote:
Cb and Cr just have an even lower resolution with 4:2:0, but you were talking about bandwidth, weren't you? EDIT:
Quote:
HDMI 2.0 can't even do 10-bit UHD 4:4:4 at 60 Hz. It's just terrible. Of course DP 1.2 can do that, and it is way, way older. Last edited by huhn; 1st July 2015 at 19:08.
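The HDMI 2.0 claim is easy to check with back-of-the-envelope numbers. Assumptions: the standard CTA-861 4K60 timing (594 MHz pixel clock, i.e. blanking included) and HDMI 2.0's 8b/10b TMDS coding (18 Gbit/s raw, 14.4 Gbit/s payload):

```python
# Back-of-the-envelope check: does N-bit 4:4:4 UHD at 60 Hz fit into
# HDMI 2.0? Assumes the CTA-861 4K60 timing (594 MHz pixel clock,
# total 4400x2250 incl. blanking) and 8b/10b TMDS coding.

PIXEL_CLOCK_HZ = 594e6
HDMI20_PAYLOAD_GBPS = 18.0 * 8 / 10   # 18 Gbit/s raw -> 14.4 Gbit/s payload

def needed_gbps(bits_per_component, components=3):
    """Required link payload in Gbit/s for full 4:4:4 at this timing."""
    return PIXEL_CLOCK_HZ * bits_per_component * components / 1e9

for bpc in (8, 10, 12):
    need = needed_gbps(bpc)
    verdict = "fits" if need <= HDMI20_PAYLOAD_GBPS else "does NOT fit"
    print(f"{bpc}-bit 4:4:4 @ 4K60: {need:.2f} Gbit/s -> {verdict}")
```

8-bit 4:4:4 just squeezes in (14.26 Gbit/s vs 14.4), while 10-bit needs 17.82 Gbit/s and does not fit, consistent with the complaint above.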
1st July 2015, 19:16 | #31472 | Link
/人 ◕ ‿‿ ◕ 人\
Join Date: May 2011
Location: Russia
Posts: 643
To deliver the video as it is, without wasting bandwidth (of course this is only true when you're using a standalone player, not a PC with madVR, which can upsample chroma and do other things better than the TV).
Last edited by vivan; 1st July 2015 at 19:19.
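The bandwidth saving being discussed is just the chroma sample count. A small sketch using the standard J:a:b arithmetic over the J-wide, two-row reference block:

```python
# Samples per pixel for J:a:b chroma subsampling, computed over the
# standard J-wide, 2-row reference block: 2*J luma samples plus (a+b)
# samples for each of the two chroma planes (Cb and Cr).

def samples_per_pixel(j, a, b):
    pixels = 2 * j
    luma = 2 * j
    chroma = 2 * (a + b)   # Cb and Cr together
    return (luma + chroma) / pixels

for mode in ((4, 4, 4), (4, 2, 2), (4, 2, 0)):
    spp = samples_per_pixel(*mode)
    saving = 1 - spp / 3
    print(f"{mode[0]}:{mode[1]}:{mode[2]} -> {spp:.1f} samples/px "
          f"({saving:.0%} less data than 4:4:4)")
```

So native 4:2:0 carries half the samples of upsampled 4:4:4, which is why sending it as-is and upsampling at the sink (or with madVR) saves link bandwidth.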
1st July 2015, 20:20 | #31473 | Link
Registered User
Join Date: Jul 2014
Location: Las Vegas
Posts: 177
Quote:
Quote:
Quote:
1st July 2015, 21:41 | #31474 | Link
Registered User
Join Date: Feb 2002
Location: San Jose, California
Posts: 4,407
How would a benchmark differ from simply trying it? In your examples you don't need a benchmark tool (what would it do?): set the setting and see what the rendering time is. The complexity of the possible settings is the only issue, and once you pin down what "128 neuron doubling" means for all the other settings, source resolution, destination resolution, etc., it is easy to "benchmark" any GPU you own at "128 neuron doubling".
Like any GPU benchmark, the results are highly dependent on the settings used, and different users/sites use different settings, so it is hard to compare results between users. Maybe a tool that played a stock video using a collection of preset options with a particular player and at a specific resolution? It could report the average/min/max rendering times at each setting. This would be easy to do now, no special tool needed, but maybe review sites would include madVR performance if such a tool existed. However, it sounds like a lot of work, certainly more work than the benefit justifies before version 1.0 of madVR. madVR is changing fast, so it isn't time for standard benchmark tools yet.
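The core of such a tool could be as simple as timing a fixed per-frame workload at each preset and reporting average/min/max. A minimal sketch (the rendering callable here is a stand-in, not actual madVR):

```python
# Minimal benchmark harness: time a per-frame callable and report
# avg/min/max per-frame cost in milliseconds.
import statistics
import time

def benchmark(render_frame, n_frames=100):
    """Time a per-frame callable and return avg/min/max in ms."""
    times_ms = []
    for i in range(n_frames):
        t0 = time.perf_counter()
        render_frame(i)
        times_ms.append((time.perf_counter() - t0) * 1000.0)
    return {"avg": statistics.mean(times_ms),
            "min": min(times_ms),
            "max": max(times_ms)}

# Stand-in workload; a real tool would render a stock clip with a
# particular player at each preset/resolution combination.
def fake_render(frame_index):
    sum(x * x for x in range(10_000))

stats = benchmark(fake_render)
print(f"avg {stats['avg']:.2f} ms, "
      f"min {stats['min']:.2f} ms, max {stats['max']:.2f} ms")
```

Running this once per preset against the same stock clip would give the comparable avg/min/max numbers described above, without any special tooling.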
__________________
madVR options explained
1st July 2015, 22:04 | #31476 | Link
Registered User
Join Date: Jul 2014
Location: Las Vegas
Posts: 177
Quote:
Quote:
I agree, there are lots of new things happening at the moment. When it's a bit more finalized, a benchmarking tool could be of use.
1st July 2015, 23:05 | #31480 | Link
Registered User
Join Date: Jan 2008
Posts: 589
Quote:
There are tons of reasons why having easily accessible performance data for various option combinations is a very good idea. I suspect it would also help madshi know what is worth optimizing. Last edited by e-t172; 1st July 2015 at 23:09.
Tags: direct compute, dithering, error diffusion, madvr, ngu, nnedi3, quality, renderer, scaling, uhd upscaling, upsampling