Humboldt-Universität zu Berlin - Mathematisch-Naturwissenschaftliche Fakultät - Visual Computing

Surface Tracking and Interaction in Texture Space

Elaborate video manipulation in post-production often requires means to move image overlays or corrections along with the apparent motion of an image sequence, a task termed match moving in the visual effects community. The most common commercial tracking tools for extracting the apparent motion include manual keyframe warpers, point trackers, dense vector generators (for optical flow), planar trackers, keypoint-match-based camera solvers (for rigid motion), and 3D model-based trackers. Many of these tools allow for some kind of user interaction to guide, assist, or improve automatically generated results. However, although increasingly discussed in the research community, user interaction with dense optical-flow-based estimation methods has not yet been adopted by visual effects artists. We believe this is due to the technical aims of most proposed tools, their relative complexity of use, and the difficulty of assessing tracking quality in established result visualizations.

In this work, we introduce the concept of assessment and interaction in texture space for surface tracking applications. We believe the quality of a tracking result can best be assessed on footage mapped to a common reference space. In such a common space, perfect motion estimation is reflected by a perfectly static sequence, while any apparent motion indicates errors in the underlying tracking. Furthermore, this representation allows for the design of tools that are much simpler to use, since even in the case of errors, visually related content is usually mapped in close spatial proximity throughout the sequence. Interacting with the tracking algorithms directly and improving the tracking results, instead of adjusting the overlay data, has the clear advantage of decoupling technical aspects from artistic expression.
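The assessment idea can be illustrated with a short sketch: given dense warp fields that map reference-space pixels into each frame (for instance accumulated from frame-to-frame optical flow), every frame is resampled into the common reference space; if the estimated motion were perfect, the resulting sequence would be perfectly static. The snippet below is a minimal illustration of this idea, not the paper's implementation; the names frames, warps, to_texture_space, and residual_motion are assumptions made for this sketch.

# Minimal sketch of texture-space assessment (illustrative, not the paper's code).
# Assumptions:
#   - frames[t] is an HxWx3 uint8 image.
#   - warps[t] is an HxWx2 float32 backward map: for every pixel (x, y) of the
#     reference space it gives the corresponding position in frame t, e.g.
#     accumulated from frame-to-frame optical flow.

import numpy as np
import cv2

def to_texture_space(frames, warps):
    """Warp every frame into the common reference (texture) space."""
    stabilized = []
    for frame, warp in zip(frames, warps):
        map_x = warp[..., 0].astype(np.float32)
        map_y = warp[..., 1].astype(np.float32)
        # cv2.remap samples frame at (map_x, map_y) for each reference pixel.
        stabilized.append(cv2.remap(frame, map_x, map_y, cv2.INTER_LINEAR))
    return stabilized

def residual_motion(stabilized):
    """Mean per-frame intensity change in reference space; near zero for good
    tracking (up to lighting changes and occlusions)."""
    grays = [cv2.cvtColor(f, cv2.COLOR_BGR2GRAY).astype(np.float32) for f in stabilized]
    return [np.abs(g1 - g0).mean() for g0, g1 in zip(grays[:-1], grays[1:])]

Scrubbing through such a stabilized sequence, or plotting the residual frame differences, makes drift and occlusion errors directly visible, which is what allows interaction tools in this representation to remain simple.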


img_tracking1.jpg
While the Nuke 10 Smart Vector tracker (right) is disturbed by occlusions, our tool (left) produces clean tracking and warping results, visualized by a warping grid.


Videos

These videos show a selection of visual effects created with our tracking assessment framework.


Publications

J. Furch, A. Hilsmann, P. Eisert,
Surface Tracking Assessment and Interaction in Texture Space, Computational Visual Media, June 2017. [URL]