Blender Camera Tracking Test

A couple of years ago Blender added camera motion tracking of footage to help with adding VFX to a recorded scene. Last year, in between projects, I decided to see what Blender had to offer. I downloaded a plate and started the process of tracking a particular piece of footage.

Motion tracking is a very common task in VFX and motion graphics. I’ve done it many times before in After Effects, but this was my first go-round with Blender.

I find Blender’s implementation of camera tracking to be great, and for open-source software you can’t beat the price: free. If you own a copy of After Effects, you’ve already got Mocha bundled with it, and there are other trackers out there too.
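To give a feel for what a point tracker is actually doing under the hood, here is a toy sketch: take a small patch around a marker in one frame, slide it over the next frame, and keep the offset with the lowest sum-of-squared-differences. Blender’s real tracker is far more sophisticated (subpixel refinement, affine marker deformation, and so on), so treat this purely as an illustration of the core search, not as Blender’s algorithm.

```python
def ssd(patch_a, patch_b):
    """Sum of squared differences between two equally sized patches."""
    return sum((a - b) ** 2
               for row_a, row_b in zip(patch_a, patch_b)
               for a, b in zip(row_a, row_b))

def extract(frame, y, x, size):
    """Cut a size-by-size patch out of a 2D grid of pixel values."""
    return [row[x:x + size] for row in frame[y:y + size]]

def track(frame_a, frame_b, y, x, size, radius):
    """Find where the patch at (y, x) in frame_a moved to in frame_b.

    Brute-force search over offsets within `radius`; returns the
    top-left corner of the best-matching patch in frame_b.
    """
    target = extract(frame_a, y, x, size)
    best = None
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            ny, nx = y + dy, x + dx
            # Skip candidate positions that fall outside the frame.
            if 0 <= ny <= len(frame_b) - size and 0 <= nx <= len(frame_b[0]) - size:
                score = ssd(target, extract(frame_b, ny, nx, size))
                if best is None or score < best[0]:
                    best = (score, ny, nx)
    return best[1], best[2]
```

For example, a bright feature at (2, 2) in one frame that moves to (3, 4) in the next will be found by `track(frame_a, frame_b, 2, 2, 2, 2)`. A real camera solve then takes dozens of such 2D tracks and reconstructs the 3D camera motion from them, which is the part Blender automates.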

Here is the result of a few hours’ worth of work: modeling, texturing, lighting, tracking, and of course putting it all together in After Effects. If you want the slightly more technical side of this, you can read about it below.

About the scene:

I wasn’t going for anything mind-blowing. I just wanted to put some tests together and see how Blender does its tracking. I made the background from some basic models that I extruded and shaped, then slapped some textures on. I then did a quick sculpt of the figures on the pedestal and dropped it into the scene.

I lit the background and added some ambient occlusion but no global illumination; it took roughly 15-30 seconds per frame to render. I rendered the background and the foreground pedestal separately.

Once the renders were finished, I cleaned up the green-screen footage in After Effects, imported the background and foreground image sequences, added the fog, did some tone mapping and color correction, and called it done.