“Performance-based control interface for character animation” by Ishigaki, White, Zordan and Liu


Conference:

    SIGGRAPH 2009

Title:

    Performance-based control interface for character animation

Presenter(s)/Author(s):

    Satoru Ishigaki, Timothy White, Victor B. Zordan, and C. Karen Liu

Abstract:


    Most game interfaces today are largely symbolic, translating simplified input such as keystrokes into the choreography of full-body character movement. In this paper, we describe a system that directly uses human motion performance to provide a radically different and much more expressive interface for controlling virtual characters. Our system takes a data feed from a motion capture system as input, and in real-time translates the performance into corresponding actions in a virtual world. The difficulty with such an approach arises from the need to manage the discrepancy between the real and virtual world, leading to two important subproblems: 1) recognizing the user’s intention, and 2) simulating the appropriate action based on the intention and virtual context. We solve this issue by first enabling the virtual world’s designer to specify possible activities in terms of prominent features of the world along with associated motion clips depicting interactions. We then integrate the prerecorded motions with online performance and dynamic simulation to synthesize seamless interaction of the virtual character in a simulated virtual world. The result is a flexible interface through which a user can make freeform control choices while the resulting character motion maintains both physical realism and the user’s personal style.
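
    The abstract outlines a pipeline of designer-specified activities, online intention recognition, and blending of prerecorded clips with the live performance. The sketch below is a minimal, illustrative Python rendition of that control loop, not the authors' implementation: the Activity class, the nearest-pose matching rule, and the linear blend are assumed placeholders, and the paper's dynamic-simulation stage is omitted.

        # Minimal sketch (illustrative only) of the control loop described in the
        # abstract: the world designer registers activities as (world feature,
        # exemplar motion clip) pairs, the live mocap pose is matched against the
        # exemplars to infer the user's intention, and the selected clip is blended
        # with the online performance. Matching and blending rules here are
        # assumptions, not details taken from the paper.
        import numpy as np

        class Activity:
            """A designer-specified activity: a prominent world feature plus an
            associated prerecorded clip depicting the interaction."""
            def __init__(self, name, feature_position, clip):
                self.name = name
                self.feature_position = np.asarray(feature_position, dtype=float)
                self.clip = np.asarray(clip, dtype=float)   # shape: (frames, dofs)

        def recognize_intention(live_pose, activities):
            """Pick the activity whose clip contains the frame closest to the
            live pose (a simple stand-in for the intention-recognition step)."""
            best, best_dist = None, np.inf
            for act in activities:
                d = np.linalg.norm(act.clip - live_pose, axis=1).min()
                if d < best_dist:
                    best, best_dist = act, d
            return best

        def blend_with_performance(live_pose, clip_frame, weight):
            """Linear blend between the user's online performance and the
            prerecorded interaction frame; a higher weight favors the clip."""
            return (1.0 - weight) * live_pose + weight * clip_frame

        if __name__ == "__main__":
            rng = np.random.default_rng(0)
            dofs = 12
            activities = [
                Activity("sit_on_chair", [1.0, 0.0, 0.0], rng.normal(size=(30, dofs))),
                Activity("climb_ledge", [0.0, 2.0, 0.0], rng.normal(size=(45, dofs))),
            ]
            live_pose = rng.normal(size=dofs)   # one frame from the mocap feed
            intent = recognize_intention(live_pose, activities)
            frame_idx = int(np.argmin(np.linalg.norm(intent.clip - live_pose, axis=1)))
            output_pose = blend_with_performance(live_pose, intent.clip[frame_idx], weight=0.6)
            print(intent.name, output_pose.shape)

    In the full system, the blended pose would additionally be reconciled with the virtual context through dynamic simulation, which this sketch does not attempt to reproduce.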

References:


    1. Arikan, O., and Forsyth, D. A. 2002. Synthesizing constrained motions from examples. ACM Trans. on Graphics 21, 3, 483–490.
    2. Chai, J., and Hodgins, J. K. 2005. Performance animation from low-dimensional control signals. ACM Trans. on Graphics 24, 3, 686–696.
    3. CMU Motion Capture Database. http://mocap.cs.cmu.edu/.
    4. Darrell, T., and Pentland, A. 1993. Space-time gestures. In CVPR, 335–340.
    5. Dontcheva, M., Yngve, G., and Popović, Z. 2003. Layered acting for character animation. ACM Trans. on Graphics 22, 3, 409–416.
    6. Gleicher, M. 1997. Motion editing with spacetime constraints. In Symposium on Interactive 3D Graphics, 139–148.
    7. Grochow, K., Martin, S. L., Hertzmann, A., and Popović, Z. 2004. Style-based inverse kinematics. ACM Trans. on Graphics 23, 3, 522–531.
    8. Igarashi, T., Moscovich, T., and Hughes, J. F. 2005. Spatial keyframing for performance-driven animation. In Eurographics, 107–115.
    9. Keogh, E., Palpanas, T., Zordan, V., Gunopulos, D., and Cardle, M. 2004. Indexing large human-motion databases. In VLDB.
    10. Kovar, L., Gleicher, M., and Pighin, F. 2002. Motion graphs. ACM Trans. on Graphics 21, 3, 473–482.
    11. Laszlo, J., van de Panne, M., and Fiume, E. L. 2000. Interactive control for physically-based animation. In ACM SIGGRAPH, 201–208.
    12. Lee, J., and Shin, S. Y. 1999. A hierarchical approach to interactive motion editing for human-like figures. In ACM SIGGRAPH.
    13. Lee, J., Chai, J., Reitsma, P. S. A., Hodgins, J. K., and Pollard, N. S. 2002. Interactive control of avatars animated with human motion data. ACM Trans. on Graphics 21, 3, 491–500.
    14. Müller, M., Röder, T., and Clausen, M. 2005. Efficient content-based retrieval of motion capture data. ACM Trans. on Graphics 24, 3, 677–685.
    15. Neff, M., Albrecht, I., and Seidel, H.-P. 2007. Layered performance animation with correlation maps. Computer Graphics Forum 26, 3, 675–684.
    16. Oore, S., Terzopoulos, D., and Hinton, G. 2002. Local physical models for interactive character animation. Computer Graphics Forum 21, 3, 337–346.
    17. Popović, Z., and Witkin, A. 1999. Physically based motion transformation. In ACM SIGGRAPH, 11–20.
    18. Shin, H. J., Lee, J., Gleicher, M., and Shin, S. Y. 2001. Computer puppetry: An importance-based approach. ACM Trans. on Graphics 20, 2, 67–94.
    19. Shiratori, T., and Hodgins, J. K. 2008. Accelerometer-based user interfaces for the control of a physically simulated character. ACM Trans. on Graphics 27, 5, 123:1–123:9.
    20. Thorne, M., Burke, D., and van de Panne, M. 2004. Motion doodles: an interface for sketching character motion. ACM Trans. on Graphics 23, 3, 424–431.
    21. Witkin, A., and Popović, Z. 1995. Motion warping. In ACM SIGGRAPH.
    22. Yamato, J., Ohya, J., and Ishii, K. 1992. Recognizing human action in time-sequential images using hidden Markov model. In ICCV, 379–385.
    23. Yin, K., and Pai, D. 2003. FootSee: an interactive animation system. In ACM SIGGRAPH/Eurographics Symposium on Computer Animation, 329–338.
    24. Zhao, P., and van de Panne, M. 2005. User interfaces for interactive control of physics-based 3D characters. In Symposium on Interactive 3D Graphics and Games, 87–94.

