
Blur

Blur is a native desktop application made for easily and efficiently adding motion blur to videos through frame blending, with the ability to utilise frame interpolation and more.

YouTube showcase

Download

macOS note*: After opening Blur on macOS for the first time you'll get a 'Blur is damaged and can't be opened.' error. To fix this, run xattr -dr com.apple.quarantine /Applications/blur.app to remove the quarantine attribute.

Features

The amount of motion blur is easily configurable, and there are additional options such as interpolating the video's fps, which can be used to generate 'fake' motion blur by frame blending the interpolated footage. This motion blur does not blur non-moving parts of the video, like the HUD in gameplay footage.

The program can also be used from the command line via blur-cli; use -h or --help for more information.

Sample output

600fps footage, blurred with 0.6 blur amount

60fps footage, interpolated to 600fps, blurred with 0.6 blur amount

As these comparisons show, the interpolated 60fps footage produces motion blur comparable to genuine 600fps footage.


Recommended settings for gameplay footage:

Most of the default settings are what I find work best, but some come down to personal preference.

Blur amount

For 60fps footage:

  • Maximum blur/smoothness: >1
  • Normal blur: 1
  • Medium blur: 0.5
  • Low blur: 0.2-0.3

To preserve the same apparent blur when changing framerate, use the following formula:

[new blur amount] = [old blur amount] × ([new fps] / [old fps])

So normal blur (1) at 60fps becomes 0.5 at 30fps, and so on.
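
As a quick sketch, here is that conversion in Python (the function is just for illustration, not part of blur):

    # illustrative only - not part of blur itself
    def convert_blur_amount(old_amount, old_fps, new_fps):
        # scale the blur amount so the blur looks the same
        # after switching to footage with a different framerate
        return old_amount * (new_fps / old_fps)

    # normal blur (1) tuned for 60fps footage, applied to 30fps footage
    print(convert_blur_amount(1, 60, 30))  # 0.5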

Interpolated fps

Results get worse if this is set too high. In general I recommend around 5x the input fps. SVP also only seems to be able to interpolate up to 10x the input fps, so don't bother trying anything higher than that.

Notes

Limiting smearing

Using blur on 60fps footage produces clean motion blur, but occasionally leaves some smearing artifacts. Higher framerate source footage removes these artifacts: recording with software such as OBS at framerates like 120/180fps greatly reduces artifacting.

Preventing unsmooth output

If your footage contains duplicate frames, blurred frames will occasionally look out of place, making the video seem unsmooth in places. The 'deduplicate' option automatically fills in duplicated frames with interpolated frames to prevent this.

Frameserver output

Blur supports rendering from frameservers. This means you can avoid having to run blur on your input videos before editing: when rendering, simply output your project (make sure it uses a high framerate) to the frameserver, then drag the generated AVI into blur. Note that some video editing software limits the maximum project framerate.

Config settings explained:

blur

  • blur - whether or not the output video file will have motion blur
  • blur amount - if blur is enabled, this is the amount of motion blur (0 = no blur, 1 = fully blend every frame together, 1+ = more blur/ghosting)
  • blur output fps - if blur is enabled, this is the fps the output video will be. can be a framerate (e.g. 600) or a multiplier (e.g. 5x)
  • blur weighting - weighting function to use when blending frames. options are listed below:
    • equal - each frame is blended equally
    • gaussian
    • gaussian_sym
    • pyramid
    • pyramid_sym
    • custom weights - custom frame weights, e.g. [5, 3, 3, 2, 1]. higher numbers make a frame more visible in the blend; lower numbers make it less visible.
    • custom function - generate weights from custom python code, evaluated for each frame 'x', e.g. -x**2+1 (see the sketch after this list)
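
To make the weighting options more concrete, here is a rough Python sketch of how custom weights and a custom function could be turned into normalised per-frame blend weights. It is only an illustration of the idea, not blur's actual implementation, and the assumption that 'x' runs from 0 to 1 across the blended frames is mine:

    # illustrative sketch only - not blur's actual implementation
    def normalise(raw):
        # scale weights so they sum to 1; higher weights make a frame
        # more visible in the final blend
        total = sum(raw)
        return [w / total for w in raw]

    # custom weights, e.g. [5, 3, 3, 2, 1]
    print(normalise([5, 3, 3, 2, 1]))

    # custom function, evaluated for each frame 'x'
    # (assumption: 'x' runs from 0 to 1 across the blended frames)
    frame_count = 5
    expr = "-x**2+1"
    raw = [eval(expr, {"x": i / (frame_count - 1)}) for i in range(frame_count)]
    print(normalise(raw))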

interpolation

  • interpolate - whether or not the input video file will be interpolated to a higher fps
  • interpolated fps - if interpolate is enabled, this is the fps that the input file will be interpolated to (before blending)

rendering

  • quality - crf of the output video (qp if using GPU encoding)
  • deduplicate - removes duplicate frames and generates new interpolated frames to take their place
  • preview - opens a render preview window
  • detailed filenames - adds blur settings to generated filenames

gpu acceleration

  • gpu decoding - uses gpu when decoding
  • gpu interpolation - uses gpu when interpolating
  • gpu encoding - uses gpu when rendering
  • gpu type (nvidia/amd/intel) - your gpu type

timescale

  • input timescale - timescale of the input video file (will be sped up/slowed down accordingly)
  • output timescale - timescale of the output video file
  • adjust timescaled audio pitch - will pitch shift audio when sped up/slowed down

filters

  • brightness - brightness of the output video
  • saturation - saturation of the output video
  • contrast - contrast of the output video

advanced rendering

  • video container - the output video container
  • custom ffmpeg filters - custom ffmpeg filters to be used when rendering (replaces gpu & quality options)
  • debug - shows debug window, prints commands used by blur

advanced blur

  • blur weighting gaussian std dev - standard deviation used in the gaussian weighting (see the sketch after this list)
  • blur weighting triangle reverse - reverses the direction of the triangle weighting
  • blur weighting bound - weighting bounds, spreads out weights more
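
For an idea of what the standard deviation controls, here is a generic symmetric gaussian weighting in Python. It is a textbook gaussian sketch, not necessarily the exact formula blur uses:

    import math

    # generic symmetric gaussian weighting - illustrative, not blur's exact formula
    def gaussian_weights(frame_count, std_dev):
        centre = (frame_count - 1) / 2
        raw = [math.exp(-((i - centre) ** 2) / (2 * std_dev ** 2))
               for i in range(frame_count)]
        total = sum(raw)
        return [w / total for w in raw]

    # a larger std dev spreads the weights out more evenly across frames
    print(gaussian_weights(5, 1))
    print(gaussian_weights(5, 3))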

advanced interpolation

  • interpolation preset - preset used for framerate interpolation, one of:

  • interpolation algorithm - algorithm used for framerate interpolation, one of:

  • interpolation block size - block size used for framerate interpolation. a higher block size renders faster but gives less accurate blur and can leave gaps around non-moving objects; a lower block size gives more accurate blur but renders slower and can introduce artifacting. for higher framerate input videos a lower block size can be better. options:

    • 4
    • 8 (default)
    • 16
    • 32
  • interpolation mask area - mask amount used when interpolating. higher values can mean static objects are blurred less, but can also result in less smooth output (moving parts of the image can be mistaken for static parts and don't get blurred)

manual svp override

You can customise the SVP interpolation settings even further by manually defining JSON parameters. See here for explanations of the settings.

  • manual svp: enables manual svp settings, true/false
  • super string: json string used as input in SVSuper
  • vectors string: json string used as input in SVAnalyse
  • smooth string: json string used as input in SVSmoothFps

These options are not visible by default; add them to your config and they will be used.
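
As a rough example, a manual override in the config might look something like the following. The key names are the ones listed above; the values are only placeholders written in the SVP style, and the exact file layout should match whatever config blur generates for you:

    manual svp: true
    super string: {pel: 2, gpu: 1}
    vectors string: {block: {w: 16, h: 16}}
    smooth string: {rate: {num: 5, den: 1}, algo: 13}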


*in the future I might buy a dev cert, but $99 a year atm doesn't seem worth it 😅