Doc:2.6/Manual/Motion Tracking

Motion Tracking

Introduction

Motion tracking is a new technique available in Blender. It is still under development; as of August 2013 it supports basic operations for 2D motion tracking and 3D camera motion solving. It is already ready for use in production, as demonstrated by the open movie "Tears of Steel".

Getting started

Motion tracking is included with Blender 2.61 and later releases. It is enabled by default on all platforms and can be used out of the box.

Here are brief descriptions of the motion tracking tools currently available in Blender.

Supervised 2D tracking

There is no single algorithm that works for all kinds of footage, feature points, and motion. Such an algorithm could be constructed, but it would be very slow and could still fail, so the only practical way to perform 2D tracking is to choose the tracking algorithm and its settings manually. The current defaults work well for general footage that is not very blurry and where feature points are not strongly deformed by perspective.

Improving 2D tracking is already on our TODO list, but it's not a high priority at the moment. If you aren't sure about algorithms and settings and don't want to read this document, you can simply experiment until you find settings that work for you.
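
For scripted setups, the same defaults can also be changed through Blender's Python API. A minimal sketch, assuming a clip is already loaded (the clip name and values are placeholders; property names follow the 2.6x bpy API):

 import bpy

 # Hypothetical clip name, used for illustration only.
 clip = bpy.data.movieclips["footage.mov"]
 settings = clip.tracking.settings

 # Larger pattern/search areas cope better with blurry footage,
 # at the cost of slower tracking.
 settings.default_pattern_size = 21
 settings.default_search_size = 91

 # Abandon a track when correlation with the reference pattern
 # drops below this threshold.
 settings.default_correlation_min = 0.75

 # 'Affine' or 'Perspective' motion models help when features are
 # deformed by perspective; 'Loc' is the fast default.
 settings.default_motion_model = 'Affine'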

Manual lens calibration using grease pencil and/or grid

All cameras record distorted video; nothing can be done about this because of the way optical lenses work. For accurate camera motion solving, the exact focal length and the "strength" of the distortion are needed.

Currently, the focal length can only be obtained from the camera's settings or from the EXIF information -- there are no tools inside Blender which can estimate it. There are, however, tools that help find approximate values to compensate for distortion. One is a manual grid tool: the grid is deformed by the distortion model, and you adjust the distortion values until its deformed cells match lines that should be straight in the footage. You can also use the grease pencil for this: draw a line that should be straight in the footage using the poly line brush, then adjust the distortion values until the grease pencil line matches the line in the footage.
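
If the distortion values are already known (for example from an earlier calibration), they can also be set from Python. A minimal sketch, with placeholder clip name and values:

 import bpy

 clip = bpy.data.movieclips["footage.mov"]  # placeholder clip name
 camera = clip.tracking.camera

 # Focal length from the camera's settings or EXIF data (millimetres).
 camera.sensor_width = 23.5
 camera.focal_length = 24.0

 # Polynomial distortion coefficients; tweak until the grid or
 # grease pencil lines match straight lines in the footage.
 camera.k1 = -0.05
 camera.k2 = 0.0
 camera.k3 = 0.0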

To calibrate your camera more accurately, use the grid calibration tool from OpenCV. OpenCV uses the same distortion model, so this shouldn't be a problem.
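
As a sketch of that workflow, the snippet below runs OpenCV's chessboard calibration over still frames grabbed from the footage (the directory name and board size are assumptions):

 import glob
 import cv2
 import numpy as np

 # Inner-corner count of the printed chessboard (assumed 9x6 here).
 pattern_size = (9, 6)

 # Object points: the board's corner grid at Z=0, in arbitrary units.
 objp = np.zeros((pattern_size[0] * pattern_size[1], 3), np.float32)
 objp[:, :2] = np.mgrid[0:pattern_size[0], 0:pattern_size[1]].T.reshape(-1, 2)

 obj_points, img_points = [], []
 image_size = None

 for path in glob.glob("calibration_frames/*.png"):  # assumed directory
     gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
     image_size = gray.shape[::-1]
     found, corners = cv2.findChessboardCorners(gray, pattern_size)
     if found:
         obj_points.append(objp)
         img_points.append(corners)

 # dist holds (k1, k2, p1, p2, k3); Blender's polynomial model uses
 # the radial terms k1, k2 and k3.
 rms, matrix, dist, _, _ = cv2.calibrateCamera(
     obj_points, img_points, image_size, None, None)
 print("RMS error:", rms)
 print("k1, k2, k3:", dist[0][0], dist[0][1], dist[0][4])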

Camera motion solving

Although there is no difference between solving camera motion and object motion from a mathematical point of view, only camera solving is currently supported. It also still has some limitations, such as no support for solving tripod motion or dominant-plane motion (where all trackable features lie on a single plane). These limitations are planned to be addressed in the future.
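
From Python, the solver is exposed as a Movie Clip Editor operator. A minimal sketch (the clip name is a placeholder, and the operator needs a Movie Clip Editor context, e.g. via a context override):

 import bpy

 clip = bpy.data.movieclips["footage.mov"]  # placeholder clip name

 # Reconstructs camera motion from the 2D tracks. Shots without
 # parallax (tripod or dominant-plane footage) will fail, as noted
 # above.
 bpy.ops.clip.solve_camera()

 # Average reprojection error in pixels; values below about 1px
 # usually indicate a usable solve.
 recon = clip.tracking.reconstruction
 print("valid:", recon.is_valid, "error:", recon.average_error)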

Basic tools for scene orientation and stabilization

After solving, you need to orient the reconstructed scene within the 3D scene for more convenient compositing. Tools are provided to define the floor, the scene origin, and the X/Y axes for scene orientation.
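
The orientation tools are also available as operators. A minimal sketch, again assuming a Movie Clip Editor context and that suitable tracks are selected beforehand:

 import bpy

 # With three tracks selected that lie on the ground:
 bpy.ops.clip.set_plane(plane='FLOOR')

 # With one track selected that should become the world origin:
 bpy.ops.clip.set_origin()

 # With two tracks selected that define a direction:
 bpy.ops.clip.set_axis(axis='X')

 # With two tracks selected a known distance apart:
 bpy.ops.clip.set_scale(distance=1.0)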

If the footage needs stabilizing to make the final result look nicer, 2D stabilization can help. It stabilizes the video from the camera, compensating for camera jumps and tilt.
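
Stabilization settings can likewise be configured in Python. A sketch, with the clip name as a placeholder:

 import bpy

 clip = bpy.data.movieclips["footage.mov"]  # placeholder clip name
 stab = clip.tracking.stabilization

 stab.use_2d_stabilization = True
 # Rotation compensation needs a dedicated track to measure tilt.
 stab.use_stabilize_rotation = True

 # Influence (0..1) of the computed correction on the result.
 stab.influence_location = 1.0
 stab.influence_rotation = 1.0

 # Autoscale zooms in just enough to hide the moving black borders.
 stab.use_autoscale = True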

Basic nodes for compositing the scene into real footage

Some new nodes were added to the Compositor to make it easier to composite the rendered scene into footage: there are easy-to-use nodes for 2D stabilization, distortion, and undistortion.
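
A sketch of wiring those nodes from Python (node type identifiers follow the 2.6x compositor API; the clip name is a placeholder):

 import bpy

 scene = bpy.context.scene
 scene.use_nodes = True
 tree = scene.node_tree

 clip = bpy.data.movieclips["footage.mov"]  # placeholder clip name

 # Footage input, stabilized with the clip's 2D stabilization data.
 clip_node = tree.nodes.new(type='CompositorNodeMovieClip')
 clip_node.clip = clip

 stabilize = tree.nodes.new(type='CompositorNodeStabilize')
 stabilize.clip = clip
 tree.links.new(clip_node.outputs['Image'], stabilize.inputs['Image'])

 # Distort the rendered CG layer so it matches the lens distortion
 # of the footage ('UNDISTORT' goes the other way).
 distort = tree.nodes.new(type='CompositorNodeMovieDistortion')
 distort.clip = clip
 distort.distortion_type = 'DISTORT'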

Tools not yet implemented

Some tools aren't available in Blender yet, but they are on our TODO list. There is currently no support for things such as rolling shutter filtering, object motion solving, or motion capture, but you can try to approximate them using the tools that are already implemented.

Tools and properties for Motion Tracking

See more details about the tools and properties for the motion tracking workflow here.