“Synth2Track Editor for Efficient Match-Animation” by Moore, Buhmann, Borer and Guay
Title:
- Synth2Track Editor for Efficient Match-Animation
Session/Category Title:
- Character Animation: Make Some Noise
Abstract:
A critical step in VFX production is capturing the movement of actors so that 3D digital assets can be integrated into live-action footage. In recent years, advances in regression-based computer vision, such as human detection and motion models, have enabled new workflows in which parts of the match-animation process are automated. However, challenging shots containing ambiguous visual cues, occlusions, or complex lighting cause automated systems to fail, and users must revert to manual specification or to the previous generation of semi-automatic tools based on local feature tracking (Bregler et al. 2009). Our key insight is that regression models can be used not only at the beginning of the process, but throughout, by feeding them manually specified cues. For example, given a partially detected actor, the user can specify a few landmarks manually, which, once re-injected into a model, yield new detections for the rest of the body. Based on this insight, we developed new tools that significantly reduce the time required for complex shots, combining automation with human expertise to overcome the limitations of current computer vision systems. This talk will demonstrate how this approach streamlines VFX workflows and improves tracking accuracy in challenging scenarios.
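The detect/correct/re-detect loop described above can be illustrated with a minimal sketch. This is not the authors' implementation; all names are hypothetical, and a simple fixed-offset skeleton prior stands in for a learned regression model. The idea is the same: user-pinned landmarks override and augment the automatic detections, and the model then propagates positions to the joints it previously failed to detect.

```python
# Illustrative sketch (hypothetical names, not the authors' system):
# manually specified landmarks are merged with automatic detections,
# then missing joints are filled in from a skeleton prior.

# Toy "skeleton prior": fixed 2D offsets from a parent joint, standing in
# for what a learned pose-regression model would predict.
BONE_OFFSETS = {
    "head":   ("neck", (0.0, -20.0)),
    "neck":   ("hip",  (0.0, -60.0)),
    "l_hand": ("neck", (-40.0, 30.0)),
    "r_hand": ("neck", (40.0, 30.0)),
}

def refine_pose(detections, user_landmarks):
    """Merge automatic detections with user-specified landmarks, then
    propagate positions to undetected joints via the skeleton prior."""
    pose = dict(detections)
    pose.update(user_landmarks)      # manual cues override the model
    changed = True
    while changed:                   # iterate until no new joint is filled
        changed = False
        for joint, (parent, (dx, dy)) in BONE_OFFSETS.items():
            if joint not in pose and parent in pose:
                px, py = pose[parent]
                pose[joint] = (px + dx, py + dy)
                changed = True
    return pose

# Usage: the detector found only the hip; the artist pins one hand.
auto = {"hip": (100.0, 200.0)}
manual = {"l_hand": (55.0, 170.0)}
full = refine_pose(auto, manual)
```

In a real system the propagation step would be a model inference conditioned on the pinned landmarks rather than rigid offsets, but the control flow, alternating automatic inference with sparse manual correction, is the point of the sketch.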
References:
[1] C. Bregler, K. Bhat, J. Saltzman, and B. Allen. 2009. ILM’s Multitrack: A new visual tracking framework for high-end VFX production. SIGGRAPH Talk (2009).
[2] MoveAI. 2025. https://move-ai-v2.webflow.io/
[3] S. Sullivan, C. Davidson, M. Sanders, and K. Wooley. 2006. Three-dimensional motion capture. US Patent US7848564B2 (2006).


