12th June 2015, 09:51 | #30961 | Link |
Registered Developer
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,342
|
CPUs are extremely bad at image processing, so it wouldn't be much of a help. If you wanted to, you could pre-scale the image on the CPU using ffdshow/AviSynth or something, instead of letting madVR do it, but personally I don't think it's really worth it.
__________________
LAV Filters - open source ffmpeg based media splitter and decoders |
12th June 2015, 16:09 | #30963 | Link |
Soul Architect
Join Date: Apr 2014
Posts: 2,559
|
The CPU won't do much for this kind of work, but you can combine madVR with SVP, which works mostly on the CPU, to increase the frame rate to 60fps. SVP then runs on the CPU while madVR runs on the GPU. SVP can also downscale the image to your monitor's resolution.
__________________
FrameRateConverter | AvisynthShader | AvsFilterNet | Natural Grounding Player with Yin Media Encoder, 432hz Player, Powerliminals Player and Audio Video Muxer Last edited by MysteryX; 24th June 2015 at 06:01. |
13th June 2015, 00:56 | #30968 | Link | |
Registered User
Join Date: Jun 2010
Posts: 15
|
Quote:
I still don't understand why any under-the-hood timing is relevant when all F/G-Sync care about is what's coming out of the cable port currently. |
|
13th June 2015, 01:18 | #30969 | Link | |
Registered User
Join Date: Jun 2007
Posts: 68
|
Quote:
Let's say it's hovering around 10ms per frame, and madVR just sent frames "down the cable" with no timing mechanism. Your video would play back at 100 frames per second. This won't be good for a movie shot at 24fps - it would play about 4x too fast. G-Sync would work well in the sense that the monitor would show a clean frame for each of those 100 frames in that second, regardless of tiny madVR variations - e.g. one frame might go out at 11ms, another at 9ms. But the movie would be playing back like a Benny Hill intro, and on top of that you'd get random judder from the timing jitter. Something has to match the presentation to exactly 24fps, and a precise mechanism needs to be in place for that. G-Sync does not provide, or help with, that mechanism. Watching video is not a game - you don't want to just push out frames as soon as they are ready. |
|
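The arithmetic in the post above can be sketched in a few lines. This is purely illustrative - the 10 ms render time is an assumed number, not a measurement of madVR:

```python
# Illustrative sketch of the timing argument above (assumed numbers,
# not madVR internals).
RENDER_TIME_MS = 10.0   # assumed time to render one frame
SOURCE_FPS = 24.0       # the movie's native frame rate

# If frames go out "down the cable" the instant they're ready:
unpaced_fps = 1000.0 / RENDER_TIME_MS      # 100 frames per second
speedup = unpaced_fps / SOURCE_FPS         # ~4.17x too fast

# What a presentation clock must enforce instead:
frame_duration_ms = 1000.0 / SOURCE_FPS    # ~41.67 ms per frame

print(f"unpaced playback: {unpaced_fps:.0f} fps ({speedup:.1f}x too fast)")
print(f"correct hold time per frame: {frame_duration_ms:.2f} ms")
```

The point is that variable refresh removes tearing, but nothing in it supplies the 41.67 ms hold time - that pacing has to come from the renderer.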
13th June 2015, 10:52 | #30972 | Link | |
Registered User
Join Date: Jun 2010
Posts: 15
|
Quote:
From what I understood of how G/F-Sync works, this is how it goes in chronological order, if madVR were to implement G/F-Sync without making any changes:

1) madVR checks the display's refresh rate. If it's not a nice multiple of the video fps, it shifts the frametimes of frames around (aka the pulldown effect).
2) madVR sends a frame from its buffer. madVR has a precision timer, so it sends this frame at 60 fps with very small variation.
3) G/F-Sync detects the time taken for the frame to finally reach the end of the GPU pipeline and puts it in its buffer with a timestamp.
4) G/F-Sync introduces a tiny delay and tells the monitor to change its refresh rate ASAP (there's perhaps a tiny overdriving circuit that forces the display to refresh immediately... oh hey, it's called the VBLANK). G/F-Sync does this by looking at the frame history, i.e. at how fast frames were put into the frame buffer before being sent to the display. Anything that requires overcompensation, VBLANK it.
5) G/F-Sync finishes its job.

So all I'm asking here is that we remove that display refresh rate check (which effectively disables pulldown), send the video down the GPU pipeline at 48 fps by simply doubling each frame, and let G/F-Sync handle the rest - which it does by changing the refresh rate of the display on the fly.

Now let's see how it goes for games:

1) The game sends out as many frames as it wants, no matter what.
2) G/F-Sync detects the time taken for the frame to finally reach the end of the GPU pipeline and puts it in its buffer with a timestamp.
3) G/F-Sync says "okay, this frame can go, that frame can go into my buffer... or maybe I'll drop it".
4) G/F-Sync detects wild variations in frame rate and goes "oh crap, change the refresh rate QUICK QUICK QUICK"... again by looking at the frame history, i.e. at how fast frames were put into the frame buffer before being sent to the display. If it requires overcompensation, VBLANK it.
5) G/F-Sync finishes its job.

Did I get this right?
I don't think madVR is able to bypass the driver layer like DX12/Mantle can. Even then, DX12/Mantle is still an abstraction, and AMD/Nvidia can still put G/F-Sync at the end of the chain. Frames still have to go into the driver stack and into the driver's own queue. Since madVR is unable to bypass this, G/F-Sync can still work. If it works for games, where the frametime variation is far more extreme, why wouldn't it work for videos? FPS too low?

Right now madVR works by v-syncing to the display. Hence you see some frames taking 12 ms and some taking 9 ms, to avoid tearing. Disable that v-sync, send the video at 48 fps by frame duplication, don't worry about tearing at all, and let F/G-Sync handle the rest.

I think the main takeaway is that G/F-Sync allows variable refresh rates. What does that mean? You make sure the GPU outputs 48 fps, and G/F-Sync will then set the display to 48 Hz. Why do I keep mentioning 48 fps? It's double the magic video rate of 24 fps, meaning you can just double the frames. Now tell me: why wouldn't simply shoving out a doubled 24 fps video at 48 fps, and letting G/F-Sync auto-set the display to 48 Hz, work?

Pardon my tone. Also, this is why I wanted a more serious discussion on G/F-Sync in this thread. None of the posts before this have been as detailed as what I posted, and even this one is perhaps not detailed enough. I didn't go into the finer details of why there's no tearing, but the keywords are VBLANK and variable refresh rate. Variable refresh rate allows the monitor's refresh rate to be set to anything between roughly 40 and 60 Hz (IPS monitors are supporting FreeSync now!). VBLANK forces a refresh to compensate for any sudden changes. I realize there's a ton in this text of me simply screaming "variable refresh rate" and "VBLANK" in a hundred different ways. That's on purpose.
I'm trying to get the idea across that all you have to do with G/F-Sync is assume the monitor is a 48 Hz monitor, convert the video from 24 fps to 48 fps by duplicating frames, and then G/F-Sync will change the monitor's refresh rate to 48 Hz to suit your video. I may also have been contradictory in some parts. If so, please point that out; I'd be glad. I'm kinda bad at presenting my ideas. Last edited by blindbox; 13th June 2015 at 11:35. |
|
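The frame-doubling idea in the post above is simple enough to sketch. This is a hypothetical illustration of the proposal, not anything madVR actually does; the frame objects are just labels:

```python
# Hypothetical sketch of the frame-doubling proposal above: turn a
# 24 fps sequence into a 48 fps one by repeating each frame, so a
# variable-refresh display could (in theory) lock to 48 Hz.
def double_frames(frames):
    """Repeat every frame once: 24 fps in -> 48 fps out."""
    return [f for frame in frames for f in (frame, frame)]

source = ["f0", "f1", "f2"]        # three frames of a 24 fps video
doubled = double_frames(source)    # six frames, nominally 48 fps
print(doubled)  # ['f0', 'f0', 'f1', 'f1', 'f2', 'f2']
```

The duplication itself is trivial; the contested question in the thread is whether the doubled frames can be *presented* at a steady 48 fps without a hardware refresh clock to pace them.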
13th June 2015, 10:57 | #30973 | Link |
Registered Developer
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,342
|
madshi has explicitly asked for discussions around G-Sync/FreeSync not to happen in this thread anymore at this time, so I suggest you respect his wishes; if you want to continue discussing it, open your own thread.
All the technical points of why it's more complicated than you seem to believe have already been discussed to death, so I suggest you actually read them, too.
__________________
LAV Filters - open source ffmpeg based media splitter and decoders |
13th June 2015, 11:23 | #30974 | Link | ||||||
Registered Developer
Join Date: Sep 2006
Posts: 9,140
|
Quote:
What happens if you play a GPU-heavy game? Does that also raise the DPC latencies in a similar way? Quote:
Quote:
One question: With debanding there were often thresholds where increasing one parameter by 0.1 didn't do much, but increasing it by 0.2 suddenly brought a big change. I think this is not the case with FineSharp, correct? A small change in the FineSharp parameters causes a small change in image quality, right? I suppose if I decided to use 1.4/2.0/2.4 that wouldn't bother you much? Or is there something "magic" about your choices of 1.5/2.0/2.3? Not that I'm planning to use 1.4/2.0/2.4, just asking how "fixed" your suggestions are. Quote:
Quote:
Quote:
------- Based on FineSharp feedback so far I've decided to always enable "linear light", use "mode 3" and set "repair" to the max value of 1.0 (unless several users come with objections). I'd still like to get more feedback from more users on the "strength" and "thinning" options. Which combination of those settings would you guys suggest for "low", "medium" and "high" presets? Thanks! |
||||||
13th June 2015, 11:39 | #30975 | Link | ||
Registered User
Join Date: Jun 2010
Posts: 15
|
Quote:
I understand. If the reason he's not doing it is that it's DisplayPort-only (FreeSync is coming to HDMI, btw), then I won't pursue this further. I first thought the reason F/G-Sync wasn't being considered was a lack of understanding of how both technologies work (that is, by changing the refresh rate of the display on the fly and making sure it matches the frametimes by some wizardry). It's just that I thought having the ability to set my refresh rate to match whatever fps the video is playing at would be a match made in heaven. Again, I have read every post on F/G-Sync in this thread, and none of them answers my query. I removed an edit of mine where I said Asmodian tested G-Sync and it didn't work as expected because madVR was detecting the wrong Hz (at first I understood 48 Hz under G-Sync as different from a 48 Hz monitor). He also said something similar to what I said - the fact that madVR tries to adjust the frametime of each frame based on the current refresh rate is screwing G-Sync up. What if madVR simply assumed that the monitor's Hz is 48? Asmodian asked the same thing but never got a reply. Quote:
That's all from me. Not supporting it because it only supports DisplayPort and a small subset of monitors is a valid reason. I hope that's the reason, rather than a misunderstanding of the underlying tech. Last edited by blindbox; 13th June 2015 at 11:56. |
||
13th June 2015, 13:10 | #30976 | Link | |
Registered User
Join Date: Jan 2008
Posts: 589
|
Quote:
Also, you seem to be assuming that G-Sync/FreeSync works by the GPU gauging the frame rate over some period of time and switching the screen to a matching refresh rate. That's not how it works: when using these technologies there is no such thing as a "refresh rate" anymore; frames are simply sent to the screen at unspecified times, whenever the system feels like it.

To reiterate my previous explanation: when using a fixed refresh rate, the refresh rate clock is a hardware clock. madVR simply makes sure the GPU has a pipeline of frames to display (through "present frames in advance" mechanisms), and the hardware clock on the GPU makes sure these frames are sent to the display with a precise, regular cadence according to the fixed refresh rate.

When using a variable refresh rate, there is no hardware clock anymore; the frames can't be presented in advance and have to be timed in software. This is very difficult to do accurately because Windows is not a real-time OS and therefore doesn't guarantee the madVR presentation thread will run at the exact time required to present the next frame. Therefore, such a solution would probably have too much presentation (= refresh rate) jitter to be viable for high-quality video playback. Actually, that's not true anymore. Last edited by e-t172; 13th June 2015 at 13:16. |
|
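The software-timing problem described above can be demonstrated with a toy pacing loop. This is an illustrative sketch, not madVR code: it paces frame "presents" purely with `time.sleep`, and the measured error per frame is exactly the scheduler wake-up jitter the post is talking about:

```python
# Illustrative sketch (not madVR code) of why software-timed presentation
# jitters: a sleep-based pacing loop wakes up late by however much the OS
# scheduler adds, and that error shows up directly as refresh jitter.
import time

def present_paced(n_frames, interval_s):
    """'Present' n_frames at interval_s apart using only software timing;
    return the per-frame error between intended and actual present times."""
    errors = []
    next_deadline = time.perf_counter()
    for _ in range(n_frames):
        next_deadline += interval_s
        delay = next_deadline - time.perf_counter()
        if delay > 0:
            time.sleep(delay)  # wake-up precision is OS/scheduler-dependent
        errors.append(time.perf_counter() - next_deadline)
    return errors

# At a 48 Hz target each frame should land ~20.8 ms apart; the errors
# below are pure scheduler jitter, which on Windows can reach the
# millisecond range - visible as presentation jitter.
jitter = present_paced(10, 1 / 48)
print(max(abs(e) for e in jitter))
```

With a hardware refresh clock, queued frames are scanned out on a fixed cadence regardless of when the presenting thread runs; in this software-paced model, every scheduling hiccup lands directly in `errors`.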
13th June 2015, 13:58 | #30977 | Link | |
Soul Seeker
Join Date: Sep 2013
Posts: 711
|
Quote:
|
|
13th June 2015, 14:19 | #30978 | Link |
Registered User
Join Date: Jun 2015
Posts: 1
|
Let me first thank you for the utterly awesome tool that is madVR. I've been considering a Lumagen Radiance VP, but as far as I can tell, the "only" things that a (very expensive, and currently 2K-only) Radiance would bring that madVR does not - for my use cases - are HDMI switching (obviously) and aspect ratio/crop/zoom/vertical shift for CIH/CIW setups.
Are there any plans to add aspect ratio/crop/zoom/vertical shift to madVR? I realize that some players already offer some combination of these, but rarely all. I would be all over developing a companion application which would store various profiles (combinations of aspect ratio/crop/zoom/shift) for madVR and expose these over a REST API for IP based remote control (Roomie Remote, iRule, etc.). |
13th June 2015, 14:52 | #30979 | Link | |
Registered Developer
Join Date: Sep 2006
Posts: 9,140
|
Quote:
|
|
13th June 2015, 15:04 | #30980 | Link |
Registered Developer
Join Date: Mar 2010
Location: Hamburg/Germany
Posts: 10,342
|
Sounds like you're doing it wrong.
__________________
LAV Filters - open source ffmpeg based media splitter and decoders |