Welcome to Doom9's Forum, THE in-place to be for everyone interested in DVD conversion.

Old 12th February 2022, 11:31   #1  |  Link
Couleur
Registered User
 
Join Date: Jan 2022
Location: France
Posts: 21
Looking to make no-compromise motion blur

For years now I've seen people record at high framerates in easy-to-run games (Minecraft, Quake/Doom games, or even CS:GO/Overwatch on higher-end PCs) and use frame blending (aka "Smart Resample" in Vegas Pro, or FFmpeg's tmix) to bring the framerate back down to something you can upload to YouTube (60/30 FPS); here's an example

The more frames you blend per output frame (input fps / output fps, e.g. 240->60 blends 4), the less noticeable it is and the closer it looks to "real-life motion blur":


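The arithmetic behind that ratio is simple enough to sketch; here's a minimal helper (the names are my own for illustration, not from any of the tools mentioned) that computes how many source frames go into each output frame and the equal weights to give them:

```python
from fractions import Fraction

def blend_factor(input_fps, output_fps):
    # how many source frames are averaged into each output frame,
    # e.g. 240 -> 60 blends 4 frames per output frame
    gap = Fraction(input_fps) / Fraction(output_fps)
    if gap.denominator != 1:
        raise ValueError("output fps must divide input fps evenly")
    return int(gap)

def equal_weights(n):
    # each blended frame contributes the same share, summing to 1
    return [1.0 / n] * n
```

blend_factor(240, 60) gives 4 and equal_weights(4) gives [0.25, 0.25, 0.25, 0.25]; the higher that factor, the smoother the blend looks.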
A year ago I discovered RIFE (machine-learning-based frame interpolation) and used it with Flowframes to interpolate past a thousand FPS to see how it would look; even though it produced some decent results, it was extremely time-consuming (it gets exponentially slower) and resource-heavy.

Much later I came across blur, a utility for making smooth videos with VapourSynth that builds up a .vpy script from a parsed config file. This was much faster and more convenient, since high-FPS content doesn't need RIFE's accuracy (SVPflow is much faster), and it can also frame blend much faster than Vegas' Smart Resample, Adobe's frame blending, or FFmpeg's single-threaded tmix.

It ends up being a user-friendly way to interpolate with InterFrame and frame blend with vs-frameblender, f0e (blur's developer)'s modified version of AverageFrames.
Here's an example of the kind of script it builds and runs:

Code:
from vapoursynth import core
import vapoursynth as vs
import havsfunc as haf
import adjust
import weighting
import filldrops
video = core.ffms2.Source(source="D:/Videos/example.mp4", cache=False)
video = filldrops.FillDrops(video, thresh=0.001) # Finds and replaces duplicated frames with interpolation (e.g encoding lag)
video = haf.InterFrame(video, GPU=True, NewNum=360, Preset="medium", Tuning="weak", OverrideAlgo=23)
frame_gap = int(video.fps / 60)
blended_frames = int(frame_gap * 1.0)
if blended_frames > 0:
	if blended_frames % 2 == 0:
		blended_frames += 1 # The frameblender expects an odd number of weights: a centre frame plus symmetric neighbours
	weights = weighting.equal(blended_frames) # Makes an array of each blended frame's importance, all equal here
	video = core.frameblender.FrameBlend(video, weights, True)
video = haf.ChangeFPS(video, 60)
video.set_output()
Since there's been no activity since last October, I decided to learn Python/VapourSynth scripting and ended up making a fork with a much simpler approach: a single .vpy script that uses VSPipe's --arg to specify input videos and parses a simple config file in a similar manner (directly inside the .vpy script, instead of writing a hardcoded temporary file each time).
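For reference on the --arg mechanism: VSPipe injects each `--arg name=value` into the script's global namespace as a string, so a .vpy can pick them up with a small lookup helper. A sketch of that pattern (the names and defaults below are illustrative, not taken from the actual fork):

```python
def script_arg(name, default, namespace=None):
    # VSPipe's --arg name=value injects `name` as a plain string into the
    # .vpy module globals; fall back to a default so the script still runs
    # when no --arg was given
    ns = namespace if namespace is not None else globals()
    return ns.get(name, default)

# what `vspipe --arg input_video=... --arg target_fps=120 script.vpy -` would inject:
injected = {"input_video": "D:/Videos/example.mp4", "target_fps": "120"}
video_path = script_arg("input_video", "fallback.mp4", injected)
target_fps = int(script_arg("target_fps", "60", injected))
```

Note that everything arrives as a string, so numeric arguments need an explicit int()/float() conversion.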

Since then I've been looking for better ways to interpolate/frame blend using VapourSynth:

Selur's Hybrid has been really helpful, since it can build VapourSynth scripts with several different interpolation methods (FrameRateConverter/InterFrame/RIFE/MvToolsFPS) for me to study and test how they compare
A friend recently told me about combining Pixel Motion and CC Force Motion Blur in After Effects, which gives life-like blur at the cost of (RIFE-like) slow render times and lots of artifacts (e.g. when flicking in an FPS shooter, the whole GUI blurs along). It creates blur similar to RSMB while letting you fine-tune how many blended frames to create

All of the above was to explain what I've tried and what I'm trying to achieve: do you experienced folks know of other VS plugins/scripts (or AVS+; I could make the switch if it turns out to be what I'm looking for) for creating blurred frames from gameplay footage?

For now I'm staying on VapourSynth R54, because vs-frameblender seems to be much slower on newer versions for some reason; there are probably other ways to blend a single video's frames (I haven't tried reproducing similar results with vs-average).

PS: also sad to see that HolyWu (who ported InterFrame to VS) deleted their account
Old 14th February 2022, 22:43   #2  |  Link
Selur
Registered User
 
Join Date: Oct 2001
Location: Germany
Posts: 7,424
Side note: not sure whether it helps, but the filldrops.py from https://github.com/Selur/VapoursynthScriptsInHybrid allows using SVP and RIFE as alternatives to MVTools for interpolating duplicate frames.
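The detection side of FillDrops boils down to a frame-difference threshold (as I understand it, the real script reads per-frame PlaneStats difference props); a pure-Python sketch of just that decision, with the diff values precomputed and the function name my own:

```python
def find_drops(frame_diffs, thresh=0.001):
    # indices of frames whose difference from the previous frame falls
    # below the threshold, i.e. duplicates that should be replaced with
    # an interpolated frame (via MVTools, SVP, or RIFE)
    return [i for i, diff in enumerate(frame_diffs) if diff < thresh]
```

For example, find_drops([0.5, 0.0, 0.3, 0.0005]) flags frames 1 and 3 as dropped/duplicated frames to re-interpolate.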
__________________
Hybrid here in the forum, homepage
Old 15th February 2022, 12:38   #3  |  Link
Couleur
Registered User
 
Join Date: Jan 2022
Location: France
Posts: 21
Quote:
Originally Posted by Selur View Post
side note: not sure whether it helps, but the filldrops.py from https://github.com/Selur/VapoursynthScriptsInHybrid allows to use SVP and RIFE as alternative to MVTools for the interpolation of duplicate frames.
Yup, looks really useful for clips that have a little encoding lag or in-game freezes
Old 15th February 2022, 13:01   #4  |  Link
Selur
Registered User
 
Join Date: Oct 2001
Location: Germany
Posts: 7,424
btw, can you share a small/short clip of such game content (without the motion blur)?
Old 2nd March 2022, 14:21   #5  |  Link
Couleur
Registered User
 
Join Date: Jan 2022
Location: France
Posts: 21
Here's a 1080p clip recorded at 280 FPS you can mess with

Here's an example of what I'd do with this clip, the process it went through:
Interpolated to 1920 FPS using SVP (I know it's overkill, but I don't like seeing resampling artifacts)
Used vs-frameblender (very similar to AverageFrames) with 50% more weights to add ghosting on every frame, then chopped the FPS down to 60
Piped it to FFmpeg to encode with libx264 so it can embed on fileditch

^ That's inefficient, since you do a heck of a lot of blending that just gets trashed; that's why I'm looking for something that can average multiple frames together while only producing a single output frame, e.g. only blending one out of every 4 frames when resampling 240 FPS to 60. Looking into this rn
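The grouped averaging I mean can be sketched in plain Python (numbers standing in for frames); in VapourSynth terms it's the kind of thing std.AverageFrames followed by std.SelectEvery could express, though I haven't benchmarked that:

```python
def blend_groups(frames, divider):
    # average each non-overlapping group of `divider` source frames into a
    # single output frame, so nothing is blended only to be thrown away;
    # e.g. 240 frames with divider 4 -> 60 output frames
    out = []
    for i in range(0, len(frames) - divider + 1, divider):
        group = frames[i:i + divider]
        out.append(sum(group) / divider)
    return out
```

With 240 input "frames" and divider 4 this produces exactly 60 outputs, each touched only once.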


PS: You don't have to do such stupidly high interpolation; this one has none and will look fine to most people

Old 2nd March 2022, 18:22   #6  |  Link
StainlessS
HeartlessS Usurer
 
Join Date: Dec 2009
Location: Over the rainbow
Posts: 10,992
Quote:
Originally Posted by Couleur View Post
Looking into this rn
That link refers to ClipBlend and RgbAmplifier [amongst other filters].
Thought I'd point out that Wonkey_Monkey made an RgbAmplifier alternative that also works in YUV [RgbAmplifier is RGB only].
Unfortunately the Wonkey_Monkey link in the Avisynth Usage / New Plugins thread doesn't work (404), but it's available here on his web page:
https://horman.net/avisynth/
Top link, AMP.

EDIT: Wonkey's post telling of the existence of the AMP plugin:
Quote:
Originally Posted by wonkey_monkey View Post
I had no luck getting RgbAmplifier to work - it just ran out of memory - so I implemented my understanding of the process in a plugin:

http://forum.doom9.org/showthread.ph...32#post1703332
He had lost interest before the script was turned into a CPP plugin, so he didn't know that it worked just fine.

Wonkey,
Quote:
Mine works in YV12 and YUY2, though.
__________________
I sometimes post sober.
StainlessS@MediaFire ::: AND/OR ::: StainlessS@SendSpace

"Some infinities are bigger than other infinities", but how many of them are infinitely bigger ???

Old 2nd March 2022, 19:33   #7  |  Link
Selur
Registered User
 
Join Date: Oct 2001
Location: Germany
Posts: 7,424
for the fun of it, I did a small test:
Code:
# Imports
import vapoursynth as vs
# getting Vapoursynth core
core = vs.core
# Loading Plugins
core.std.LoadPlugin(path="I:/Hybrid/64bit/vsfilters/SourceFilter/LSmashSource/vslsmashsource.dll")
# defining beforeDeCross-function - START
def beforeDeCross(clip):
  core.std.LoadPlugin(path="I:/Hybrid/64bit/vsfilters/Support/vs_average.dll")
  
  sourceFps = clip.fps
  divider = 5
  targetFPS  = clip.fps/divider
  
  #convert to YUV420P16
  clip = core.resize.Bicubic(clip=clip, format=vs.YUV420P16, range_s="limited")
  num = clip.num_frames;
  
  out = clip[0]
  count = int(num/divider)
  for i in range(count):
    clips = [None]*divider
    for j in range(divider):
      clips[j]=clip[i+j]
    i = i + divider;
    out =  out + core.average.Mean(clips)
  clip = out
  # colorformat YUV420P16
  # framerate 56
  return clip
# defining beforeDeCross-function - END

# source: 'C:\Users\Selur\Desktop\OMbBiTFQnwnFrNeBAbwb.mkv'
# current color space: YUV420P8, bit depth: 8, resolution: 1920x1080, fps: 280, color matrix: 709, yuv luminance scale: limited, scanorder: progressive
# Loading C:\Users\Selur\Desktop\OMbBiTFQnwnFrNeBAbwb.mkv using LWLibavSource
clip = core.lsmas.LWLibavSource(source="C:/Users/Selur/Desktop/OMbBiTFQnwnFrNeBAbwb.mkv", format="YUV420P8", cache=0, prefer_hw=0)
# Setting color matrix to 709.
clip = core.std.SetFrameProps(clip, _Matrix=1)
clip = clip if not core.text.FrameProps(clip,'_Transfer') else core.std.SetFrameProps(clip, _Transfer=1)
clip = clip if not core.text.FrameProps(clip,'_Primaries') else core.std.SetFrameProps(clip, _Primaries=1)
# Setting color range to TV (limited) range.
clip = core.std.SetFrameProp(clip=clip, prop="_ColorRange", intval=1)
# making sure frame rate is set to 280
clip = core.std.AssumeFPS(clip=clip, fpsnum=280, fpsden=1)
clip = beforeDeCross(clip)
# current meta; color space: YUV420P16, bit depth: 16, resolution: 1920x1080, fps: 56, color matrix: 709, yuv luminance scale: limited, scanorder: progressive
# adjusting output color from: YUV420P16 to YUV420P10 for x265Model
clip = core.resize.Bicubic(clip=clip, format=vs.YUV420P10, range_s="limited")
# set output frame rate to 56.000fps
clip = core.std.AssumeFPS(clip=clip, fpsnum=56, fpsden=1)
# Output
clip.set_output()
-> 56fps.mkv
Old 2nd March 2022, 20:14   #8  |  Link
richardpl
Guest
 
Posts: n/a
I made the tmix filter even faster than it was before.
Old 2nd March 2022, 22:11   #9  |  Link
Couleur
Registered User
 
Join Date: Jan 2022
Location: France
Posts: 21
Quote:
Originally Posted by StainlessS View Post
Wonkey_Monkey did an RgbAmplifier alternative working also in YUV
I'm still confused about why color range/spaces/bit depth is involved at all, so I thought it was related to something else
Old 2nd March 2022, 23:01   #10  |  Link
StainlessS
HeartlessS Usurer
 
Join Date: Dec 2009
Location: Over the rainbow
Posts: 10,992
Quote:
I thought it was related to something else
Because of this,
Quote:
RgbAmplifier (RGB24 and RGB32), requires VS2008 CPP Runtimes.
========================================
An Avisynth plugin to amplify color shifts

Given a clip, this filter examines every pixel of every frame and independently multiplies the difference of its specific
R, G and B values from the average R, G and B values of that same pixel location spanning a defined radius of adjacent frames.
If the new values are above or below the allowed RGB values, they are capped at those limits.
The revised RGB values replace the original values, and the amplified clip is returned.
If the Multiplier is set to a value of zero, the plugin acts as a temporal frame averager.
EDIT: I presume that Wonkey_Monkey also implemented that functionality in his Amp() plugin
(it was part of the base logic of RgbAmplifier, not a special case).
Old 13th March 2022, 16:36   #11  |  Link
Couleur
Registered User
 
Join Date: Jan 2022
Location: France
Posts: 21
Quote:
Originally Posted by Selur View Post
for the fun of it, I did a small test:
It seems to slow down the video and end prematurely; I can't wrap my head around how you made vs_average work
Old 13th March 2022, 20:25   #12  |  Link
Selur
Registered User
 
Join Date: Oct 2001
Location: Germany
Posts: 7,424
Not totally sure I used vs_average correctly, but here are some comments:
Code:
  sourceFps = clip.fps # fps the interpolated input has
  divider = 5 # set divider by which the frame rate should be lowered
  targetFPS  = clip.fps/divider # calculate target fps
  num = clip.num_frames; # <- set num to the number of frames the clip has
  
  out = clip[0] # <- set out to be the first frame of the clip (<- this is probably not needed, or the first frame should be dropped at the end)
  count = int(num/divider) # divide the number of frames by the amount the frame rate is lowered, giving the output frame count
  for i in range(count): # for each frame
    clips = [None]*divider # create an array which can hold divider times elements (=1-frame clips)
    for j in range(divider): # fill the array
      clips[j]=clip[i+j] # with the frames
    i = i + divider;
    out =  out + core.average.Mean(clips) # merge the 1-frame clips to one frame and add it to the out
  # now out contains all the merged frames (and the first frame of the input clip)
  clip = out
  return clip # return the clip that was just created
Hope this helps to understand what I thought should be done.
(This was the first thing that popped into my mind for how to use vs-average to achieve what you described.)
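Looking at the loop again: the `i = i + divider;` has no effect inside a Python for loop (range resets the counter on each pass), so the groups passed to average.Mean overlap (clip[i+j] with i stepping by 1) and only the first num/divider frames are ever covered, which would explain both the slowed-down look and the early ending. The non-overlapping index math I was aiming for, sketched in plain Python (my own naming):

```python
def group_indices(num_frames, divider):
    # frame indices feeding each output frame, stepping in non-overlapping
    # groups of `divider`: [0..d-1], [d..2d-1], ...
    return [[i * divider + j for j in range(divider)]
            for i in range(num_frames // divider)]
```

So for 10 frames with divider 5 the groups should be [0..4] and [5..9], not [0..4], [1..5], and so on.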

Cu Selur
Tags
averageframes, blending, frame interpolation, interpolation, motion blur
