“Interactive control of avatars animated with human motion data” – ACM SIGGRAPH HISTORY ARCHIVES

Conference:

    SIGGRAPH 2002

Type(s):

    Paper

Title:

    Interactive control of avatars animated with human motion data

Presenter(s)/Author(s):

    Jehee Lee, Jinxiang Chai, Paul S. A. Reitsma, Jessica K. Hodgins, Nancy S. Pollard

Abstract:


    Real-time control of three-dimensional avatars is an important problem in the context of computer games and virtual environments. Avatar animation and control is difficult, however, because a large repertoire of avatar behaviors must be made available, and the user must be able to select from this set of behaviors, possibly with a low-dimensional input device. One appealing approach to obtaining a rich set of avatar behaviors is to collect an extended, unlabeled sequence of motion data appropriate to the application. In this paper, we show that such a motion database can be preprocessed for flexibility in behavior and efficient search and exploited for real-time avatar control. Flexibility is created by identifying plausible transitions between motion segments, and efficient search through the resulting graph structure is obtained through clustering. Three interface techniques are demonstrated for controlling avatar motion using this data structure: the user selects from a set of available choices, sketches a path through an environment, or acts out a desired motion in front of a video camera. We demonstrate the flexibility of the approach through four different applications and compare the avatar motion to directly recorded human motion.
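
    As a rough illustration of the preprocessing the abstract describes, the sketch below builds a small transition graph from an unlabeled frame sequence: consecutive frames stay connected, and extra edges are added wherever two poses are similar enough that a transition between them is plausible. Everything here is an assumption made for illustration (the MotionFrame type, the Euclidean pose_distance, the threshold value, and the available_choices helper); the paper's own similarity measure, transition blending, and cluster-based search are not reproduced.

        # Illustrative sketch only: names and the distance metric are assumptions,
        # not definitions taken from the paper.
        from dataclasses import dataclass
        from typing import Dict, List

        import numpy as np


        @dataclass
        class MotionFrame:
            # One captured pose, with its joint angles flattened into a vector
            # of shape (num_dofs,).
            joint_angles: np.ndarray


        def pose_distance(a: MotionFrame, b: MotionFrame) -> float:
            # Plain Euclidean distance between joint-angle vectors, standing in
            # for the paper's pose/velocity similarity measure.
            return float(np.linalg.norm(a.joint_angles - b.joint_angles))


        def build_transition_graph(frames: List[MotionFrame],
                                   threshold: float = 0.5) -> Dict[int, List[int]]:
            # Adjacency list: frame i -> frames that may plausibly follow it.
            graph: Dict[int, List[int]] = {i: [] for i in range(len(frames))}
            for i in range(len(frames)):
                if i + 1 < len(frames):
                    graph[i].append(i + 1)  # keep the original recorded ordering
                for j in range(len(frames)):
                    # Non-adjacent frames with similar poses become extra
                    # "plausible transition" edges.
                    if abs(i - j) > 1 and pose_distance(frames[i], frames[j]) < threshold:
                        graph[i].append(j)
            return graph


        def available_choices(graph: Dict[int, List[int]], current: int) -> List[int]:
            # Toy analogue of the choice-based interface: the outgoing edges are
            # the options a user could pick from at the current frame.
            return graph[current]

    The paper additionally clusters the resulting graph so that it can be searched efficiently at runtime; the quadratic frame-comparison loop above is only practical for short clips.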

