“My Digital Face” by Shapiro, Suma, Wang, Debevec, Ichikari, et al. …

  • ©Ari Shapiro, Evan A. Suma, Ruizhe Wang, Paul E. Debevec, Ryosuke Ichikari, Graham Fyffe, Andrew W. Feng, Oleg Alexander, and Dan Casas

Conference:


Title:


    My Digital Face

Developer(s):


    Ari Shapiro, Evan A. Suma, Ruizhe Wang, Paul E. Debevec, Ryosuke Ichikari, Graham Fyffe, Andrew W. Feng, Oleg Alexander, and Dan Casas

Project Affiliation:


    USC Institute for Creative Technologies

Description:


    This project puts the capability of producing a photorealistic face into the hands of nearly anyone, without an expensive rig, special hardware, or 3D expertise.

    Using a single commodity depth sensor (Intel RealSense) and a laptop computer, the research team captures several scans of a single face in different expressions. From those scans, a near-automatic pipeline creates a set of blendshapes, which are puppeteered in real time using face-tracking software. An important stage of the pipeline automatically identifies and establishes correspondences between the geometry and textures of the different scans, greatly reducing texture drift between blendshapes. To extend control beyond the individual shapes, the system automatically applies blendshape masks to different regions of the face so that effects from separate parts can be mixed, giving independent control over blinks and lip shapes.
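
    The sketch below illustrates the kind of masked blendshape blending described above: each blendshape stores per-vertex offsets from a neutral scan, tracker-driven weights mix them, and per-region masks confine each shape's effect so that a blink and a lip shape can be controlled independently. The function names, array shapes, and region indices are illustrative assumptions, not the team's actual pipeline or API.

        import numpy as np

        def blend_face(neutral, blendshapes, weights, masks=None):
            """Deform a neutral mesh by a weighted sum of blendshape deltas.

            neutral:      (V, 3) neutral-pose vertex positions
            blendshapes:  dict name -> (V, 3) per-vertex offsets from neutral
            weights:      dict name -> float in [0, 1], e.g. from a face tracker
            masks:        optional dict name -> (V,) values in [0, 1] confining
                          a shape to one facial region (eyes, mouth, ...)
            """
            result = neutral.copy()
            for name, delta in blendshapes.items():
                w = weights.get(name, 0.0)
                if w == 0.0:
                    continue
                if masks is not None and name in masks:
                    # The per-vertex mask limits this shape's effect to its
                    # region, so shapes from different regions combine
                    # without interfering with one another.
                    result += w * masks[name][:, None] * delta
                else:
                    result += w * delta
            return result

        # Example: mix a blink with a smile (all values are hypothetical).
        V = 5000
        neutral = np.zeros((V, 3))
        blendshapes = {"blink_L": np.random.randn(V, 3) * 0.01,
                       "smile":   np.random.randn(V, 3) * 0.01}
        masks = {"blink_L": np.zeros(V), "smile": np.zeros(V)}
        masks["blink_L"][:1000] = 1.0   # illustrative eye-region vertex indices
        masks["smile"][3000:] = 1.0     # illustrative mouth-region vertex indices
        frame = blend_face(neutral, blendshapes, {"blink_L": 1.0, "smile": 0.6}, masks)

    In a live puppeteering loop, the weights dictionary would simply be refreshed each frame from the tracking software's expression estimates before calling the blend.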

    The results are photorealistic and sufficiently representative of the capture subjects that they could be used in social media, video conferencing, business communications, and other settings where an accurate representation (as opposed to an artistic or stylized one) is desired or appropriate.

    During the demo, the team scans two people who then puppeteer their own faces in real time.

