Kinect-based Facial Animation
Conference:
Experience Type(s):
Title:
- Kinect-based Facial Animation
Description:
In this demo we present our system for performance-based character animation that enables any user to control the facial expressions of a digital avatar in real time. Compared to existing technologies, our system is easy to deploy and does not require face markers, intrusive lighting, or complex scanning hardware. Instead, the user is recorded in a natural environment with the non-intrusive, commercially available Microsoft Kinect 3D sensor. Because high noise levels in the acquired data prevent conventional tracking methods from working well, we developed a method that combines a database of existing animations with facial tracking to generate compelling animations. Realistic facial tracking enables a range of new applications, e.g., in digital gameplay, telepresence, or social interaction.
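The core idea of combining noisy per-frame tracking data with an animation prior can be sketched as a regularized least-squares fit of blendshape weights: the data term pulls the weights toward the observed geometry, while the prior term pulls them toward plausible expressions from the database. This is an illustrative sketch only, not the authors' implementation; the function name, matrix shapes, and the simple Tikhonov-style prior (`w_prior`, `lam`) are assumptions for the example.

```python
import numpy as np

def fit_blendshape_weights(B, observed, w_prior, lam=0.1):
    """Solve min_w ||B w - observed||^2 + lam * ||w - w_prior||^2.

    B        : (n_points, n_blendshapes) blendshape displacement basis
    observed : (n_points,) noisy per-frame observation (e.g. depth offsets)
    w_prior  : (n_blendshapes,) weights suggested by the animation prior
    lam      : regularization strength; larger values trust the prior more
               (all names here are hypothetical, for illustration only)
    """
    n = B.shape[1]
    # Normal equations of the data term plus the Tikhonov prior term
    A = B.T @ B + lam * np.eye(n)
    b = B.T @ observed + lam * w_prior
    return np.linalg.solve(A, b)

# Toy usage: 2 blendshapes sampled at 4 points, observation corrupted by noise
rng = np.random.default_rng(0)
B = rng.standard_normal((4, 2))
true_w = np.array([0.8, 0.2])
observed = B @ true_w + 0.05 * rng.standard_normal(4)
w = fit_blendshape_weights(B, observed, w_prior=np.array([0.5, 0.5]), lam=0.1)
```

With a high noise level, increasing `lam` keeps the recovered expression close to the database prior instead of overfitting the depth noise; the actual system uses a considerably richer probabilistic prior than this one-term penalty.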
References:
[1] Li, H., Weise, T., and Pauly, M. 2010. Example-based facial rigging. ACM Trans. Graph. 29, 32:1–32:6.
[2] Weise, T., Bouaziz, S., Li, H., and Pauly, M. 2011. Real-time performance-based facial animation. ACM Trans. Graph. 30, 77:1–77:10.


