Camera tracking integration: Proposal
Project description
Make the match moving workflow (point tracking, masking, camera tracking and so on) available in Blender. The libmv library would be used for the tracking calculations. My main area of work would be the Blender-side interface/operators.
The final result would be a new editor type: «Space Clip». It is needed to open movies in Blender and to provide comfortable navigation/playback (the current video viewer is not well suited for this). For this project it would mainly be used to work with tracker data such as masks and tracker points. The Space Clip editor would be connected with the compositor, so it could work on a pre-processed movie, and the clip could be used in the compositor/sequencer to combine 2D footage with the 3D scene. Some new compositor nodes would be implemented for lens distortion/undistortion, for supplying tracker data at the needed stage, and for motion solving.
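To make the distortion/undistortion nodes concrete, here is a minimal sketch of a standard polynomial radial distortion model in plain Python (the general kind of model libmv works with). The function names and the k1/k2/k3 coefficients are illustrative, not libmv's actual API; an undistortion node would evaluate something like undistort() for each pixel's normalized coordinates and resample the image accordingly.

```python
def distort(xn, yn, k1, k2, k3):
    """Apply polynomial radial distortion to normalized image
    coordinates (origin at the principal point, focal length 1)."""
    r2 = xn * xn + yn * yn
    factor = 1.0 + r2 * (k1 + r2 * (k2 + r2 * k3))
    return xn * factor, yn * factor

def undistort(xd, yd, k1, k2, k3, iterations=10):
    """Invert the distortion numerically by fixed-point iteration,
    which converges quickly for the mild distortion of real lenses."""
    xu, yu = xd, yd
    for _ in range(iterations):
        r2 = xu * xu + yu * yu
        factor = 1.0 + r2 * (k1 + r2 * (k2 + r2 * k3))
        xu, yu = xd / factor, yd / factor
    return xu, yu
```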
A bit of detail
Envisioned workflow:
- Load image data - video or sequence of images.
- [Optional] Input known camera parameters.
- [Optional] Provide hints to the camera lens undistortion solver (identify straight lines in the image) to help with undistorting the image.
- [Optional] Undistort the footage using the distortion solver.
- [Optional] Create a mask for areas you don't want tracked.
- [Optional] Adjust color/contrast/etc. so that the feature points to track have increased contrast and are easier for the tracker to find.
- [Optional] Select specific features you want tracked (either by mask or by placement of tracking points).
- If you specified specific features to the tracker, you may also want to specify how the points should be tracked: a bounding box for the tracker to look for the point in; whether the object being tracked is «rigid» or can deform; and knowledge of the types of camera/object motion (object translation/rotation, camera translation/rotation). A minimal search-box tracking sketch follows this list.
- Send the image data, specified track data, camera data, and mask data to the tracker.
- The tracker library can automatically identify feature points to track and/or use user-specified features, ignoring areas that are masked.
- Do a «solve» of the camera motion based on the track points, including statistics on how «good» the tracks are in contributing to the solve (see the reprojection error sketch after this list).
- Return the track point data and camera solve to the software, including the statistical analysis of the track points.
- Based on the statistical analysis, pick error thresholds that determine which track points to delete automatically.
- [Optional] Manually delete any track points.
- [Optional] Create a mask to «hide» unwanted track points.
- [Optional] A mask can be assigned to follow a set of track points, automatically masking a moving object from the tracker/solver.
- [Optional] A mask can be manually keyframed so that it moves and deforms over time to mask a moving object from the tracker/solver.
- [Optional] Provide a manually created camera curve to «hint» to the tracker/solver what you expect the actual curve to look like.
- Retrack if additional tracker points are now needed.
- Pass the tracker points, camera hints, etc. to the camera solver.
- Return the solved camera track and the 3d location of the track points to the software.
- Visualize the camera motion and track points in the 3d view.
- Define a ground plane relative to the 3d track points and camera.
- Define the scene origin relative to the 3d track points and camera.
- Define the world scale and orientation relative to the 3d track points and camera (a minimal alignment sketch follows this list).
- Add a test object into the 3d view and see if it stays in the proper location.
- [Optional] Stabilize the camera view based on the solved camera track.
- [Optional] Smooth the camera track curve (see the smoothing sketch after this list).
- Repeat until the error in the solve is good enough.
Items marked as optional have low priority and would be implemented only if there is time for them.
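As referenced in the tracking step above, a marker usually carries a pattern (the pixels being matched) and a search area (the bounding box scanned in the next frame). Below is a minimal sketch of such a search using normalized cross-correlation on grayscale images; it only illustrates the idea and is not libmv's actual tracker. All names are hypothetical.

```python
import numpy as np

def track_in_search_box(frame, pattern, box):
    """Scan box = (x0, y0, x1, y1) in `frame` for the best match of
    `pattern`, using normalized cross-correlation (NCC).
    Both images are 2D float grayscale arrays."""
    ph, pw = pattern.shape
    p = pattern - pattern.mean()
    p_norm = np.sqrt((p * p).sum())
    best_score, best_pos = -1.0, None
    x0, y0, x1, y1 = box
    for y in range(y0, y1 - ph + 1):
        for x in range(x0, x1 - pw + 1):
            w = frame[y:y + ph, x:x + pw]
            wz = w - w.mean()
            denom = np.sqrt((wz * wz).sum()) * p_norm
            if denom == 0.0:
                continue
            score = (wz * p).sum() / denom
            if score > best_score:
                best_score, best_pos = score, (x, y)
    return best_pos, best_score
```

The position with the highest score becomes the marker's new location; a low best score is a hint that the track was lost.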
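The «good»-ness statistic mentioned in the solve step is typically the reprojection error: each solved 3D point is projected through the solved camera on every frame, and the distance to the corresponding 2D track is measured. A minimal sketch under a simple pinhole model with the principal point at the origin; all names are hypothetical.

```python
import numpy as np

def reprojection_error(point3d, track2d, R, t, focal):
    """Pixel distance between a 2D track position and the solved 3D
    point projected through a pinhole camera with rotation R (3x3),
    translation t (3,) and focal length `focal` (in pixels)."""
    cam = R @ point3d + t            # world -> camera coordinates
    proj = focal * cam[:2] / cam[2]  # pinhole projection
    return np.linalg.norm(proj - track2d)
```

Tracks whose average error across frames exceeds a chosen threshold are natural candidates for automatic deletion.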
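Defining the ground plane, scene origin, scale, and orientation (the «Define …» steps above) amounts to building one rigid transform from user-picked track points. A minimal sketch, assuming the user picks three reconstructed points and knows the real-world distance between the first two; the function and its signature are hypothetical.

```python
import numpy as np

def scene_transform(p0, p1, p2, known_distance):
    """Build a 4x4 matrix that puts p0 at the origin, makes the plane
    through p0, p1, p2 the ground plane (normal = +Z) and scales the
    scene so that |p1 - p0| equals `known_distance`."""
    x = p1 - p0
    scale = known_distance / np.linalg.norm(x)
    x = x / np.linalg.norm(x)
    n = np.cross(x, p2 - p0)        # ground plane normal
    z = n / np.linalg.norm(n)
    y = np.cross(z, x)              # completes a right-handed basis
    rot = np.stack([x, y, z])       # rows express the new axes
    m = np.eye(4)
    m[:3, :3] = scale * rot
    m[:3, 3] = -scale * (rot @ p0)  # translate p0 to the origin
    return m
```

Applying the resulting matrix to the solved camera and track points places the reconstruction into a sensible scene space.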
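For the optional stabilization/smoothing steps, the simplest usable approach is a moving average over the solved camera positions (the same idea applies per channel to rotation). A small illustrative sketch:

```python
def smooth_path(positions, radius=2):
    """Moving-average smoothing of a camera path given as a list of
    (x, y, z) tuples, one per frame, averaging `radius` frames on
    each side of every frame."""
    smoothed = []
    n = len(positions)
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        window = positions[lo:hi]
        smoothed.append(tuple(sum(c) / len(window)
                              for c in zip(*window)))
    return smoothed
```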
Schedule
- Milestone 1: implement the Space Clip editor (space_clip) with a basic set of operators for opening image sequences, opening movies, playback, and masking support.
- Milestone 2: integrate libmv into the extern/ tree, add operators to manage trackers in the clip, and implement compositor nodes using existing libmv functionality: lens distortion and 2D tracking.
- Milestone 3: implement as many interface features and operators as possible (with stubs instead of real libmv calls) to cover the match moving workflow described above.