Old 27th November 2016, 08:07   #40761  |  Link
Ver Greeneyes
Registered User
 
Join Date: May 2012
Posts: 445
Quote:
Originally Posted by Uoppi View Post
Even if you couldn't see those differences at all under normal viewing conditions (like I don't think I can for chroma, for example), just knowing you're using the "worst" settings may degrade the experience. If your GPU can handle it, why not use a decent algorithm "just in case"?

Because placebo is a very real and powerful phenomenon*, I see no reason why using "recommended best" settings wouldn't be a valid and acceptable way to proceed.
In addition, what if you test each algorithm individually and can't see a difference from a distance, but it turns out that adding them all together is just enough to make the difference visible? I'd rather be on the safe side and assume that it might make a difference even if I have to get up close to make out differences between individual algorithms. Besides, even if you can't consciously put your finger on what the difference is, a clearer image might reduce eyestrain or something.

Last edited by Ver Greeneyes; 27th November 2016 at 08:09.
Ver Greeneyes is offline   Reply With Quote
Old 27th November 2016, 10:03   #40762  |  Link
Uoppi
Registered User
 
Join Date: Oct 2015
Posts: 99
Quote:
Originally Posted by Ver Greeneyes View Post
In addition, what if you test each algorithm individually and can't see a difference from a distance, but it turns out that adding them all together is just enough to make the difference visible? I'd rather be on the safe side and assume that it might make a difference even if I have to get up close to make out differences between individual algorithms. Besides, even if you can't consciously put your finger on what the difference is, a clearer image might reduce eyestrain or something.
Very good point. I can immediately see that madVR improves the overall PQ (less aliasing and ringing being the most obvious), so it's certainly not all placebo, and I wouldn't pay for a dedicated GPU just for placebo anyway.

My "problem" atm is it's difficult to find sufficient time for testing when you're required to be changing diapers at the same time, lol. So any starting point recommendations are always welcome and this forum has been invaluable.
Uoppi is offline   Reply With Quote
Old 27th November 2016, 10:26   #40763  |  Link
ryrynz
Registered User
 
Join Date: Mar 2009
Posts: 3,229
Quote:
Originally Posted by Growdelan View Post
SSIM 1D 100% + AR, LL, AB 100% did not help; lips and eyes are sharper with super-xbr + Lanczos3.
I'd try disabling AB for starters and then as above, disable SE if it's enabled.
ryrynz is offline   Reply With Quote
Old 27th November 2016, 11:06   #40764  |  Link
Growdelan
Registered User
 
Join Date: Nov 2016
Posts: 5
Quote:
Originally Posted by Backflash View Post
Disable soften edges, add grain 2.
Quote:
Originally Posted by ryrynz View Post
I'd try disabling AB for starters and then as above, disable SE if it's enabled.
OK, I turned off SE and turned on AG 2. I think it works:

http://screenshotcomparison.com/comparison/191961

Growdelan is offline   Reply With Quote
Old 27th November 2016, 11:47   #40765  |  Link
flossy_cake
Registered User
 
Join Date: Aug 2016
Posts: 9
Any chance of getting G-Sync working in windowed mode? G-Sync officially supports windowed-mode applications, so it should work with madVR too.



It is really disappointing to spend $1k on a monitor and not be able to take advantage of this feature.

Note: On my system, G-Sync does work in fullscreen exclusive mode, but only with DX11 disabled under rendering -> general settings, and it doesn't work for all video files. Strangely, it does work for all YouTube videos I've tested so far (piped through SVPTube).
flossy_cake is offline   Reply With Quote
Old 27th November 2016, 12:14   #40766  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 5,971
G-Sync is a banned topic here.

madVR aims at HTPCs, not gaming devices.
huhn is offline   Reply With Quote
Old 27th November 2016, 12:35   #40767  |  Link
flossy_cake
Registered User
 
Join Date: Aug 2016
Posts: 9
madVR aims at the best possible rendering performance. If you look at the lengths madshi has gone to in order to increase picture quality, and the countless hours that must have been spent pursuing image fidelity through all the features that currently exist, it's a no-brainer that G-Sync compatibility would be a desirable feature to support. It's odd that you'd consider refresh-rate matching some kind of "gaming" gimmick. It's no different from using a 24 Hz mode on an HDTV, or the refresh-rate matching list in MPC-HC. The goal is the same; the implementation is just more intuitive with G-Sync because it doesn't require anything more than rendering in a particular D3D mode.

Last edited by flossy_cake; 27th November 2016 at 12:37.
flossy_cake is offline   Reply With Quote
Old 27th November 2016, 12:40   #40768  |  Link
huhn
Registered User
 
Join Date: Oct 2012
Posts: 5,971
http://forum.doom9.org/showpost.php?...ostcount=29243
huhn is offline   Reply With Quote
Old 27th November 2016, 12:44   #40769  |  Link
CruNcher
Registered User
 
Join Date: Apr 2002
Location: Germany
Posts: 4,949
Quote:
Originally Posted by flossy_cake View Post
Any chance of getting G-Sync working in windowed mode? G-Sync officially supports windowed-mode applications, so it should work with madVR too.

It is really disappointing to spend $1k on a monitor and not be able to take advantage of this feature.

Note: On my system, G-Sync does work in fullscreen exclusive mode, but only with DX11 disabled under rendering -> general settings, and it doesn't work for all video files. Strangely, it does work for all YouTube videos I've tested so far (piped through SVPTube).
So just because your ego spent $1K on it, you think markets will shift for you? That's not how they work.

Also, this whole area is still so much in a research state and so much in flux that it's painful to base decisions on it; they could become invalid tomorrow.

I can well understand madshi not playing this "game" *g*; a rather sane decision from a developer-resources standpoint.
__________________
all my compares are riddles so please try to decipher them yourselves :)

It is about Time

Join the Revolution NOW before it is too late!

http://forum.doom9.org/showthread.php?t=168004

Last edited by CruNcher; 27th November 2016 at 13:00.
CruNcher is offline   Reply With Quote
Old 27th November 2016, 12:52   #40770  |  Link
flossy_cake
Registered User
 
Join Date: Aug 2016
Posts: 9
Wow, you guys really have a passionate hatred for G-Sync! I'm quite surprised to find this in a place supposedly dedicated to video fidelity. I'm guessing you've never actually owned one of these monitors and don't really have any experience with it, and/or don't realise the benefits. Strange, oh well.

Managing frame pacing should not be a problem: multiple frames can be pre-rendered in advance in D3D mode, and the monitor follows the render queue rather than Windows' 16.7 ms time slices. Remember, we are in D3D mode; otherwise all 3D graphics would snap to the nearest 16.7 ms and variable sync would not work. With variable sync you can fluctuate rapidly between multiple frame rates and it stays seamless, so it doesn't actually matter if the frame pacing is a bit off; that's what makes it so good.

Besides, the frame pacing is already perfect in fullscreen mode, so it's probably two lines of code to enable the same D3D mode for windowed. But I see madshi hates it, so it'll never happen. Oh well.

Last edited by flossy_cake; 27th November 2016 at 12:55.
flossy_cake is offline   Reply With Quote
Old 27th November 2016, 12:56   #40771  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 9,835
This was discussed several times before, and the core rendering logic of madVR is not compatible with G-SYNC (or FreeSync for that matter), since it relies on constant V-SYNC intervals for timing the video frames. With G-SYNC it would have to do its own timing, as V-SYNC doesn't exist anymore. It's not like a game where it can just render more or fewer frames depending on how fast everything runs.
Feel free to search the topic; it has been discussed before, and madshi has commented several times that such modes are not planned to be supported anytime soon, as they would require drastic changes to the rendering logic.

V-SYNC has key advantages for video rendering because it's a hardware-driven interrupt system; there's no need to rely on inaccurate timers and hope they're correct.
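
To make that concrete, here is a rough D3D11-style sketch (purely an illustration, not madVR's actual code) of why a V-SYNC-paced present loop needs no timer of its own, and why that falls apart once there is no fixed vblank:

Code:
// Rough D3D11-style illustration (not madVR's actual code). With V-SYNC,
// Present(1, 0) blocks until the next hardware vblank, so the display's
// refresh clock paces the loop and every queued frame is shown for exactly
// one refresh interval without any software timer.
#include <d3d11.h>
#include <dxgi.h>

void presentLoop(IDXGISwapChain* swapChain, volatile bool& keepRunning)
{
    while (keepRunning) {
        // drawNextVideoFrame();  // hypothetical: render the next frame here

        // Blocks until vblank. Under G-SYNC/FreeSync there is no fixed
        // vblank to wait for, so the renderer would instead have to schedule
        // every Present() call from its own software timer.
        swapChain->Present(1, 0);
    }
}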

huhn already linked a reply from madshi in an earlier post, so refer to that for the word from madVR's author:
http://forum.doom9.org/showpost.php?...ostcount=29243

(Yes, we're aware that by now G-SYNC works over HDMI, but there still aren't any TVs that support it.)
__________________
LAV Filters - open source ffmpeg based media splitter and decoders

Last edited by nevcairiel; 27th November 2016 at 13:01.
nevcairiel is offline   Reply With Quote
Old 27th November 2016, 12:58   #40772  |  Link
XTrojan
Registered User
 
Join Date: Oct 2015
Posts: 88
G-Sync has no benefit in madVR, buddy.
Movies are delivered at a fixed frame rate; V-Sync handles this exactly like G-Sync would in the same situation.
The only difference is if you drop frames; that's when G-Sync would improve things a little, but who watches movies with dropped frames?
XTrojan is offline   Reply With Quote
Old 27th November 2016, 13:08   #40773  |  Link
CruNcher
Registered User
 
Join Date: Apr 2002
Location: Germany
Posts: 4,949
XTrojan, that's exactly the point: NVIDIA improves Windows' rendering stability by shifting the whole issue from the OS into their driver, which takes control.

Last edited by CruNcher; 27th November 2016 at 13:11.
CruNcher is offline   Reply With Quote
Old 27th November 2016, 13:13   #40774  |  Link
e-t172
Registered User
 
Join Date: Jan 2008
Posts: 568
Quote:
Originally Posted by XTrojan View Post
G-Sync has no benefit in madVR, buddy.
Movies are delivered at a fixed frame rate; V-Sync handles this exactly like G-Sync would in the same situation.
The only difference is if you drop frames; that's when G-Sync would improve things a little, but who watches movies with dropped frames?
I disagree. G-Sync, at least in theory, makes it possible to completely slave the video clock to the audio clock, thereby getting rid of the frame drops/repeats that are caused by a slight mismatch between the two clocks. That solves an actual problem that affects everyone in this thread, because it's a fundamental problem of video playback on a PC in general. The only ways to "solve" that problem today are (1) use ReClock, (2) use Smooth Motion, or (3) spend hours fiddling with custom refresh rate timings to try to get the refresh closer to the audio clock. All three "solutions" are brittle hacks that come with significant tradeoffs. G-Sync could be a proper, clean solution to that problem.
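
To make the idea concrete, here is a hypothetical sketch (the names and interfaces are invented for illustration; this is not madVR's code) of how a variable-refresh display would let a renderer derive every presentation time directly from the audio clock, so the two clocks can never drift apart:

Code:
// Hypothetical sketch (not madVR code). With variable refresh, presentation
// times can be taken straight from the audio clock instead of being rounded
// to the display's fixed V-SYNC grid, so no drops/repeats are ever needed.
#include <cstdint>

struct AudioClock {
    // Current audio position in 100-nanosecond units (assumed interface).
    virtual int64_t position() const = 0;
    virtual ~AudioClock() = default;
};

class VideoScheduler {
public:
    VideoScheduler(const AudioClock& audio, double fps)
        : audio_(audio),
          frameDuration_(static_cast<int64_t>(10000000.0 / fps)) {}

    // Ideal display time of frame N, expressed on the audio clock.
    int64_t presentationTime(int64_t frameIndex) const {
        return frameIndex * frameDuration_;
    }

    // Time left before frame N should be shown. With a fixed refresh rate
    // this has to be rounded to the V-SYNC grid, which is exactly where the
    // dropped/repeated frames come from; with VRR it can be honoured as-is.
    int64_t waitBeforePresent(int64_t frameIndex) const {
        return presentationTime(frameIndex) - audio_.position();
    }

private:
    const AudioClock& audio_;
    const int64_t frameDuration_;  // one frame, in 100 ns units
};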

However, as madshi already pointed out in the past, one of the main obstacles to using G-Sync to improve video playback is that the set of video displays that support G-Sync (typically, gaming monitors) and the set of video displays that are typically used for video playback (HDTVs, projectors) do not overlap. In other words G-Sync support would only benefit people who critically watch videos on a gaming monitor, which is a very small niche, even among madVR users (which is already a small niche).

There was some excitement in the past about FreeSync being part of new HDMI specs, and one could speculate that it would make sense to support variable sync in HDTVs because game consoles could use it, but despite that, I have never seen an HDTV with variable sync support, nor have I seen any manufacturer interest in it. Maybe after the dust settles around HDR, one could hope…
e-t172 is offline   Reply With Quote
Old 27th November 2016, 13:16   #40775  |  Link
CruNcher
Registered User
 
Join Date: Apr 2002
Location: Germany
Posts: 4,949
Quote:
There was some excitement in the past about FreeSync being part of new HDMI specs, and one could speculate that it would make sense to support variable sync in HDTVs because game consoles could use it, but despite that, I have never seen an HDTV with variable sync support, nor have I seen any manufacturer interest in it. Maybe after the dust settles around HDR, one could hope…
Right, you have all the advanced FRC stuff there, and HDR and 10-bit have top priority. When VFR gets more mature in the future and a definitive standard that everyone benefits from is settled on, we will surely see massive adoption, but that's one for the next TV releases.

Also, Microsoft has other priorities right now, getting HDR and 10-bit into Windows by default, and VFR is left entirely to the IHVs for now to battle out a mature approach.

Last edited by CruNcher; 27th November 2016 at 13:22.
CruNcher is offline   Reply With Quote
Old 27th November 2016, 13:23   #40776  |  Link
flossy_cake
Registered User
 
Join Date: Aug 2016
Posts: 9
I fully accept that G-Sync will not be getting support and I won't be asking for it any further, but I want to correct some misinformation, as it really annoys me:

Quote:
Originally Posted by nevcairiel View Post
the core rendering logic of madVR is not compatible with G-SYNC (or FreeSync for that matter), since it relies on constant V-SYNC intervals for timing the video frames.
It currently works in full screen exclusive mode.

Quote:
Originally Posted by nevcairiel View Post
It's not like a game where it can just render more or fewer frames depending on how fast everything runs.
It doesn't matter if frames arrive a few milliseconds late or early, because they no longer have to snap to the nearest 16.7 ms (for a 60 Hz monitor). Your frames can arrive like this: 40 ms, 35 ms, 45 ms, 42 ms, 38 ms, etc., and it looks seamless because the monitor matches this pattern exactly (see the TestUFO demo on the previous page). That's what's so great about it: it removes the need to match any particular refresh rate, so you no longer have to worry about frame pacing.


Quote:
Originally Posted by nevcairiel View Post
V-SYNC has key advantages for video rendering because it's a hardware-driven interrupt system; there's no need to rely on inaccurate timers and hope they're correct.
Look up high-resolution timer libraries. I'm pretty sure the RivaTuner D3D limiter uses one, as it is very accurate (sort of the "gold standard" of limiters) and certainly isn't limited to 16.7 ms slices.

https://msdn.microsoft.com/en-us/lib...(v=vs.85).aspx
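
As an illustration of the usual technique (a sketch only, not RivaTuner's or madVR's actual implementation), a software frame pacer can combine a coarse Sleep() with a short busy-wait on QueryPerformanceCounter to hit a deadline with well under a millisecond of error:

Code:
// Sketch of a high-resolution wait (illustration only, not RivaTuner's or
// madVR's implementation): coarse Sleep() for most of the interval, then a
// short busy-wait on QueryPerformanceCounter for sub-millisecond accuracy.
#include <windows.h>
#include <cstdint>

static int64_t qpcFrequency()
{
    LARGE_INTEGER f;
    QueryPerformanceFrequency(&f);
    return f.QuadPart;
}

static int64_t qpcNow()
{
    LARGE_INTEGER c;
    QueryPerformanceCounter(&c);
    return c.QuadPart;
}

// Block until the performance counter reaches 'deadline' (in QPC ticks).
void waitUntil(int64_t deadline)
{
    const int64_t freq = qpcFrequency();
    for (;;) {
        const int64_t remaining = deadline - qpcNow();
        if (remaining <= 0)
            return;
        const double remainingMs = 1000.0 * remaining / freq;
        if (remainingMs > 2.0)
            Sleep(static_cast<DWORD>(remainingMs - 2.0)); // coarse sleep
        else
            YieldProcessor();                             // busy-wait tail
    }
}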

Quote:
Originally Posted by XTrojan View Post
G-Sync has no benefit in madVR, buddy. Movies are delivered at a fixed frame rate; V-Sync handles this exactly like G-Sync would in the same situation.
I suppose I could set it up for V-Sync fixed-refresh mode and have MPC-HC automatically switch display modes per file, but that's a lot of mode switching with YouTube videos.
flossy_cake is offline   Reply With Quote
Old 27th November 2016, 13:24   #40777  |  Link
x7007
Registered User
 
Join Date: Apr 2013
Posts: 251
Issue when using exclusive mode DX11 at 60 Hz

With many players, when I use 59 Hz on my Philips 7007 TV, the exclusive-mode black flash is instant when entering and leaving the video.

When using 60 Hz, the black flash takes almost 2 seconds for no apparent reason. It still shows 60 Hz for both the display and composition rate, so it doesn't seem to change refresh rate while entering exclusive mode.

Does anyone have any idea how to fix it? It's annoying to enter and leave the video when it takes 2 seconds instead of being instant.
x7007 is offline   Reply With Quote
Old 27th November 2016, 13:30   #40778  |  Link
nevcairiel
Registered Developer
 
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 9,835
Quote:
Originally Posted by flossy_cake View Post
It doesn't matter if frames arrive a few milliseconds late or early, because they no longer have to snap to the nearest 16.7 ms (for a 60 Hz monitor). Your frames can arrive like this: 40 ms, 35 ms, 45 ms, 42 ms, 38 ms, etc., and it looks seamless because the monitor matches this pattern exactly (see the TestUFO demo on the previous page). That's what's so great about it: it removes the need to match any particular refresh rate, so you no longer have to worry about frame pacing.
Aren't you the one who wants smooth motion by using G-SYNC? And then you advocate inaccurate timers? That gives you micro-judder everywhere. How is that any good?
Either it works perfectly, or it's not worth using.

Quote:
Originally Posted by flossy_cake View Post
Look up hi res timer libraries.
Windows is not a real-time OS; even with an infinitely fine timer there is no guarantee that you can actually act at that exact moment. V-SYNC, on the other hand, is a hardware interrupt, so it comes with those guarantees.

Considering 99.9% of all content has a fixed frame rate, just switching your screen's refresh rate to match is a much easier solution.
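
For reference, the switch itself is only a small amount of Win32 code; here is a rough sketch assuming the primary display (players that offer automatic refresh-rate changing do something along these lines):

Code:
// Rough Win32 sketch (primary display assumed): change only the refresh
// rate of the current mode, e.g. to 24 Hz for film content, and leave
// resolution and colour depth untouched.
#include <windows.h>

bool setRefreshRate(DWORD hz)
{
    DEVMODE dm = {};
    dm.dmSize = sizeof(dm);

    // Start from the current mode so width/height/bpp stay as they are.
    if (!EnumDisplaySettings(nullptr, ENUM_CURRENT_SETTINGS, &dm))
        return false;

    dm.dmDisplayFrequency = hz;
    dm.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT | DM_DISPLAYFREQUENCY;

    // CDS_FULLSCREEN keeps the change temporary (restored when the
    // application exits).
    return ChangeDisplaySettingsEx(nullptr, &dm, nullptr,
                                   CDS_FULLSCREEN, nullptr)
           == DISP_CHANGE_SUCCESSFUL;
}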
__________________
LAV Filters - open source ffmpeg based media splitter and decoders

Last edited by nevcairiel; 27th November 2016 at 13:33.
nevcairiel is offline   Reply With Quote
Old 27th November 2016, 13:33   #40779  |  Link
CruNcher
Registered User
 
Join Date: Apr 2002
Location: Germany
Posts: 4,949
Quote:
Originally Posted by flossy_cake View Post
I fully accept that G-Sync will not be getting support and I won't be asking for it any further, but I want to correct some misinformation, as it really annoys me: [...]
Yes, you seem to have understood it. Then I guess you also understand that it hasn't anything to do with performance, only with the delivery of the result, and that you will still perceive the higher latency as a slowdown; a smoother slowdown, but still a slowdown.

A developer's goal should always be to deliver the performance that makes this unnecessary: clean code and perfect timing.

You as a customer currently pay too high a premium price for that, but you can't really expect every dev to change his mind about it, unless maybe they get a share of the money you spend on it from NVIDIA to implement and support it.

And you can be sure not everyone will agree that this is currently the right long-term path: improving the situation not for the masses but only for a small, paying, brainwashed portion of them.

Last edited by CruNcher; 27th November 2016 at 13:55.
CruNcher is offline   Reply With Quote
Old 27th November 2016, 13:41   #40780  |  Link
Backflash
Registered User
 
Join Date: Jan 2016
Posts: 52
Quote:
Originally Posted by e-t172 View Post
In other words G-Sync support would only benefit people who critically watch videos on a gaming monitor, which is a very small niche, even among madVR users (which is already a small niche).
I wouldn't be so sure. How do I put this: if someone bothered to learn about madVR, went through the settings learning curve (which is overwhelming at first), and has a semi-decent GPU to run it all, I would say such a person is watching critically enough.
Also, people who go through all that usually have several hobbies and get good monitors/TVs (fun fact: recent ASUS gaming IPS monitors have better colour quality than most 60 Hz IPS models, according to TFT Central).
Then there is the social factor: anime, TV and video games together attract a particular kind of crowd. I, for example, do everything on my MG279Q and have no wish for an HTPC.
Just a thought; I may be 180° wrong on this.
Backflash is offline   Reply With Quote