Dev:2.4/Source/Development/Projects/Motion tracking/MotionTracker2D

Objective

Bring simple 2D tracking capabilities to Blender.

Goals

  • Support semi-automatic tracking of video in blender, so that rendered items can follow the movement of elements in the video.

Non-goals

  • Automatic selection of track points.
  • Any kind of 3D support (that's for later). The tracking will only be useful for things moving in the video plane.
  • Tracking anything other than points (for example, no line tracking).

Only 2d?

The primary goal of libmv is to create a full matchmoving component for Blender. However, a matchmover is a large project. Creating a full matchmover and integrating it with a single massive merge is likely to cause headaches for all involved. A better approach is to build the functionality incrementally.

While there are many parts in a matchmoving program, most of them naturally fall into two categories: 2D and 3D. Before moving to a 3D reconstruction, first 2D tracks must be made of the video to be reconstructed. There are a variety of ways of tracking in 2D. The most basic method is manual tracking, where in each frame the tracked point is moved by the user into the right position. This method works, but is laborious. At the more advanced level, there are automatic methods that use image information (such as a distinct marking on a car) to track something on the screen. In between, there are semi-automatic methods where the user guides the automatic tracking process.
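
To make the automatic and semi-automatic cases concrete: the core operation is to take a small patch around the tracked point in one frame and search for the best-matching position of that patch in the next frame. Below is a minimal sketch of that idea using naive normalized cross-correlation over a search window, with frames as grayscale NumPy arrays. The function and parameter names are made up for illustration; this is not libmv's actual tracker, which does considerably more (e.g. sub-pixel refinement).

 import numpy as np

 def track_point(prev_frame, next_frame, x, y, patch=8, search=16):
     # Return the (x, y) in next_frame whose surrounding patch best matches
     # the patch around (x, y) in prev_frame (normalized cross-correlation).
     template = prev_frame[y - patch:y + patch, x - patch:x + patch].astype(np.float64)
     template -= template.mean()
     best_score, best_pos = -np.inf, (x, y)
     for dy in range(-search, search + 1):
         for dx in range(-search, search + 1):
             cy, cx = y + dy, x + dx
             candidate = next_frame[cy - patch:cy + patch, cx - patch:cx + patch].astype(np.float64)
             if candidate.shape != template.shape:
                 continue  # candidate window falls outside the frame
             candidate -= candidate.mean()
             denom = np.sqrt((template ** 2).sum() * (candidate ** 2).sum())
             if denom == 0:
                 continue
             score = (template * candidate).sum() / denom
             if score > best_score:
                 best_score, best_pos = score, (cx, cy)
     return best_pos, best_score  # a low score suggests the tracker got lost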

Some example uses of 2D tracking without 3D:

To see a possible 2D tracking interface, check out these tutorials from SynthEyes.

Proposed usage

Here's what I suggest for the workflow. I'd like experts in blender or in tracking to comment on this.

  • Open blender.
  • Open the video to track and select it.
  • Open the tracking dialog
    • Select the 'add tracker' button.
    • Click on something to track (e.g., a person's nose)
    • A small box is placed around the track point, indicating that the tracker is placed.
    • Click the 'track across frames' button and watch the tracker move with the object in the video.
    • Scrub through the video; note that on frame 25 the tracker gets lost.
    • Drag the tracker to the correct position in frame 25.
    • Advance to frame 26, and correct the position again.
    • With the tracker selected, press 're-track selected' in the tracker dialog. Watch as the tracker tracks correctly through the remainder of the sequence.
  • Click the 'export tracks to empties and make camera' button, which recreates the scene in blender. The track becomes an empty with IPO curves that make its movement match the scene movement. The camera is static, but positioned so that the empty projects to the same position it has in the video. (A scripted sketch of this step follows the list.)
  • Switch to 3D view, make a monkey.
  • Parent the monkey to the empty that was created by the tracker.
  • Compose the render with the video; render; watch the monkey move around with the video (in 2D!)
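
As a rough illustration of the 'export tracks to empties and make camera' step above, here is a hypothetical Python sketch that bakes per-frame 2D track positions into location keyframes on an Empty. It uses the modern bpy API purely for illustration (the 2.4x Python API is different), and the mapping of normalized image coordinates onto a plane in front of a static camera is an assumption, not a settled design.

 import bpy

 def bake_track_to_empty(track, name="Track", plane_scale=1.0):
     # track: list of (frame, u, v), with u and v in normalized image coordinates.
     empty = bpy.data.objects.new(name, None)
     bpy.context.scene.collection.objects.link(empty)
     for frame, u, v in track:
         # (0.5, 0.5) maps to the origin; the image plane becomes the world XZ plane.
         empty.location = ((u - 0.5) * plane_scale, 0.0, (v - 0.5) * plane_scale)
         empty.keyframe_insert(data_path="location", frame=frame)
     return empty

 # Example: a track that drifts slowly to the right over 25 frames.
 # bake_track_to_empty([(f, 0.3 + f * 0.01, 0.5) for f in range(1, 26)])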

UI Mocks

A mock for the 2D tracker panel
A mock for the video window for the 2D tracker. Note the list of frames at the top where we can display which frames are cached in memory and which are not.

Example session

François Tarlier made a demo showing some of the functionality that we may want to target for a 2D tracker in Blender.

This shows how to load footage and add some trackers. Roughly, the video shows the following:

  1. Load footage.
  2. Check the footage settings.
  3. Load the footage in RAM.
  4. Create a new tracker on video.
  5. Set the size of the tracker.
  6. Add some settings such as "stop on auto-key", "pre-roll on key smooth", and "pan to follow" (I can give you more info about these options if you need it).
  7. Start tracking (it will stop every 10 frames in this example).
  8. Correct the tracker position if needed at each created key, and continue tracking.
  9. Check the "Graph Editor" for weird spikes in the track curves. This can show problems with individual trackers. (A small sketch of this check follows the list.)

Open Questions

There are a number of open questions. Comments from blender developers and matchmovers are appreciated.

  • Where do the data structures for the tracker live? On one hand, they could probably be normal blender objects if tracking is a one-shot process where the movie is tracked and the results are baked out to empties, with no support for saving the intermediate tracks. Alternatively, it would be nice to support editing the tracks after they are put into 3D. I am not sure what the right approach is here; comments from blender coders are welcome.
  • Should the compositing steps be done automatically to make it easier for new users?
  • Where should the tracker dialog go? Should it be a dialog or a new button panel?
  • Should there be a separate tracking window which shows the video, the trackers, and the scrubber?
  • SynthEyes shows a closeup of the selected tracker to make it easier to adjust a tracker manually. This is a great idea; we should do the same. But where should this go? In the button panel?
  • Is this the most useful functional subset of 2D tracking?