“Data driven cloth animation” by White, Crane and Forsyth

  • ©Ryan White, Keenan Crane, and David A. Forsyth




    Data driven cloth animation



    We present a new method for cloth animation based on data-driven synthesis. In contrast to approaches that focus on physical simulation, we animate cloth by manipulating short sequences of existing cloth animation. While our source of data is cloth animation captured using video cameras [White et al. 2007], the method is equally applicable to simulation data. The approach has benefits in both cases: current cloth capture is limited because small tweaks to the data require filming an entirely new sequence. Likewise, simulation suffers from long computation times and complications such as tangling. In this sketch we create new animations by fitting cloth animation to human motion capture data, i.e., we drive the cloth with a skeleton.
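    The core idea above, reusing short captured cloth sequences that best match a query skeletal motion, can be sketched as a simple nearest-neighbor lookup. The function name, the pose representation, and the sum-of-squared-differences cost are illustrative assumptions, not the paper's actual matching criterion:

    ```python
    import numpy as np

    # Hypothetical sketch: pick the stored cloth-animation snippet whose
    # recorded skeletal poses best match the query motion. The distance
    # measure and data layout are assumptions for illustration only.

    def select_snippet(query_poses, snippets):
        """Return the index of the best-matching snippet.

        query_poses: (T, D) array of skeletal pose features for T frames.
        snippets: list of (poses, cloth) pairs, where poses is a (T, D)
                  array and cloth is the associated payload (e.g. meshes).
        """
        best_idx, best_cost = -1, np.inf
        for i, (poses, _cloth) in enumerate(snippets):
            # Sum of squared pose differences over all frames.
            cost = float(np.sum((poses - query_poses) ** 2))
            if cost < best_cost:
                best_idx, best_cost = i, cost
        return best_idx

    # Tiny usage example with two stored snippets of 4 frames each.
    snippets = [
        (np.zeros((4, 3)), "cloth_A"),
        (np.ones((4, 3)), "cloth_B"),
    ]
    query = np.full((4, 3), 0.9)   # query motion close to snippet 1
    best = select_snippet(query, snippets)
    ```

    In practice a method like this would blend between the selected snippets at their boundaries rather than play one snippet verbatim, but the lookup step captures the data-driven flavor of the approach.
    
    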


    1. Sumner, R., and Popović, J. 2004. Deformation transfer for triangle meshes. In SIGGRAPH.
    2. White, R., Crane, K., and Forsyth, D. 2007. Capturing and animating occluded cloth. In SIGGRAPH.
