12th February 2022, 11:31 | #1
Registered User
Join Date: Jan 2022
Location: France
Posts: 21
Looking to make no-compromise motion blur
For years now I've seen people record easy-to-run games at high frame rates (Minecraft, the Quake/Doom games, or even CS:GO/Overwatch on higher-end PCs) and use frame blending (aka Smart Resample in Vegas Pro, or ffmpeg's tmix) to bring the frame rate back down to something you can upload to YouTube (60/30 fps); here's an example
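To illustrate the idea (this is a minimal plain-Python sketch of my own, not blur's or tmix's actual code): frames are modeled as single brightness values, and each group of input frames is averaged into one output frame.

```python
def blend_down(frames, in_fps, out_fps):
    """Average consecutive groups of in_fps // out_fps frames into one frame each."""
    group = in_fps // out_fps  # e.g. 240 // 60 = 4 input frames per output frame
    out = []
    for i in range(0, len(frames) - group + 1, group):
        chunk = frames[i:i + group]
        out.append(sum(chunk) / len(chunk))
    return out

# 8 input "frames" at 240 fps -> 2 blended output frames at 60 fps
print(blend_down([0, 10, 20, 30, 40, 50, 60, 70], 240, 60))  # [15.0, 55.0]
```

Real frame blenders do this per pixel across whole planes, but the ratio logic is the same: the higher the input fps, the more frames get averaged into each output frame.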
The more frames you blend together (input fps / output fps, e.g. 240->60 gives 4), the less noticeable the stepping is and the closer it looks to "real-life motion blur".

A year ago I discovered RIFE (machine-learning-based frame interpolation) and used it through Flowframes to interpolate past a thousand fps to see how it would look. It produced some decent results, but it was extremely time-consuming (render times grew steeply the further I pushed it) and resource-heavy.

Much later I came across blur, a utility that makes smooth videos with VapourSynth by building a .vpy script from a parsed config file. This was much faster and more convenient, since high-fps content doesn't need the accuracy of RIFE (SVPFlow is much faster), and it also frame blends far quicker than Vegas' Smart Resample, Adobe's frame blending, or FFmpeg's single-threaded tmix. It ends up being a user-friendly way to interpolate using InterFrame and frame blend using vs-frameblender, f0e (blur's dev)'s modified version of AverageFrames.

Here's an example of the kind of script it builds and runs:
Code:
from vapoursynth import core
import vapoursynth as vs
import havsfunc as haf
import adjust
import weighting
import filldrops

video = core.ffms2.Source(source="D:/Videos/example.mp4", cache=False)
# Finds and replaces duplicated frames with interpolation (e.g. from encoding lag)
video = filldrops.FillDrops(video, thresh=0.001)
video = haf.InterFrame(video, GPU=True, NewNum=360, Preset="medium", Tuning="weak", OverrideAlgo=23)

frame_gap = int(video.fps / 60)
blended_frames = int(frame_gap * 1.0)
if blended_frames > 0:
    if blended_frames % 2 == 0:
        blended_frames += 1  # Gets the right amount of weights to give to the frameblender
    # Makes an array of each blended frame's importance, each of them being equal here
    weights = weighting.equal(blended_frames)
    video = core.frameblender.FrameBlend(video, weights, True)

video = haf.ChangeFPS(video, 60)
video.set_output()

Since then I've been looking for better ways to interpolate/frame blend using VapourSynth. Selur's Hybrid has been really helpful, since it can build VapourSynth scripts with several different interpolation methods (FrameRateConverter/InterFrame/RIFE/MvToolsFPS) for me to study and test how each compares.

A friend recently told me about combining Pixel Motion and CC Force Motion Blur in After Effects, which gives life-like blur at the cost of RIFE-like slow render times and lots of artifacts (e.g. when flicking in an FPS shooter, the whole GUI blurs along with the view). It creates blur similar to RSMB while letting you fine-tune how many blended frames to create.

All of the above is just to explain what I've tried and what I'm trying to achieve: do you experienced folks know of other VS plugins/scripts (or AVS+ ones, I could make the switch if that ends up being what I'm looking for) to create blurred frames from gameplay footage?
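Back to the weighting step in the generated script above: here is a hedged plain-Python approximation of what I understand it to do (the names `equal_weights` and `weighted_blend` are mine for illustration, not blur's API). The frame count is forced odd so the blend is centered on the current frame, and the weights are equal and sum to 1.

```python
def equal_weights(n):
    """n equal weights summing to ~1.0 (assumed behaviour of weighting.equal)."""
    return [1.0 / n] * n

def weighted_blend(values, weights):
    """Weighted average of per-frame values (plain numbers here, not pixel planes)."""
    return sum(v * w for v, w in zip(values, weights))

blended_frames = 4
if blended_frames % 2 == 0:
    blended_frames += 1  # force an odd count, as in the generated script

w = equal_weights(blended_frames)
print(len(w))                                             # 5
print(round(sum(w), 6))                                   # 1.0
print(round(weighted_blend([10, 20, 30, 40, 50], w), 6))  # 30.0
```

With equal weights this is a plain average; non-uniform weight arrays (e.g. heavier center weights) would bias the blend toward the current frame.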
For now I'm staying on VapourSynth R54, because vs-frameblender seems to be much slower on newer versions for some reason. There are probably other ways to blend a single video's frames (I haven't tried reproducing similar results with vs-average).

ps: also sad to see that HolyWu (who ported InterFrame to VS) deleted their account
Tags
averageframes, blending, frame interpolation, interpolation, motion blur |