“Generalizing wave gestures from sparse examples for real-time character control” by Rhodin, Tompkin, Kim, de Aguiar, Pfister, et al. … – ACM SIGGRAPH HISTORY ARCHIVES

  • 2015 SA Technical Papers_Rhodin_Generalizing Wave Gestures from Sparse Examples

Conference:

    SIGGRAPH Asia 2015


Type(s):

    Technical Papers

Title:

    Generalizing wave gestures from sparse examples for real-time character control

Session/Category Title:   Faces and Characters


Presenter(s)/Author(s):

    Rhodin, Tompkin, Kim, de Aguiar, Pfister, et al.

Abstract:


    Motion-tracked real-time character control is important for games and VR, but current solutions are limited: retargeting is hard for non-human characters, with locomotion bound to the sensing volume; and pose mappings are ambiguous, with difficult dynamic motion control. We robustly estimate wave properties—amplitude, frequency, and phase—for a set of interactively defined gestures by mapping user motions to a low-dimensional independent representation. The mapping separates simultaneous or intersecting gestures, and extrapolates gesture variations from single training examples. For animations such as locomotion, wave properties map naturally to stride length, step frequency, and progression, and allow smooth transitions from standing, to walking, to running. Interpolating out-of-phase locomotions is hard, e.g., quadruped legs switch phase between walks and runs, so we introduce a new time-interpolation scheme to reduce artifacts. These improvements to real-time motion-tracked character control are important for common cyclic animations. We validate this in a user study, and show versatility by applying our method to part- and full-body motions across a variety of sensors.
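    The core idea of the abstract—recovering amplitude, frequency, and phase from a roughly periodic user-motion signal—can be illustrated with a minimal sketch. This is not the paper's estimator (which handles interactively defined gestures robustly in real time); it is a simple offline approximation over one 1-D signal, using peak-to-peak excursion for amplitude, zero crossings for frequency, and a least-squares sine/cosine fit for phase. The function name `estimate_wave_properties` is hypothetical.

    ```python
    import math

    def estimate_wave_properties(samples, dt):
        """Estimate amplitude, frequency (Hz), and phase (rad) of a roughly
        sinusoidal 1-D signal sampled at interval dt. Illustrative only."""
        n = len(samples)
        mean = sum(samples) / n
        centered = [s - mean for s in samples]

        # Amplitude: half the peak-to-peak excursion.
        amplitude = (max(centered) - min(centered)) / 2.0

        # Frequency: count sign changes; a full cycle has two zero crossings.
        crossings = sum(1 for a, b in zip(centered, centered[1:]) if a * b < 0)
        frequency = crossings / (2.0 * (n - 1) * dt)

        # Phase: project onto sin/cos at the estimated frequency;
        # A*sin(wt + p) decomposes into A*cos(p)*sin(wt) + A*sin(p)*cos(wt).
        w = 2.0 * math.pi * frequency
        s = sum(c * math.sin(w * i * dt) for i, c in enumerate(centered))
        co = sum(c * math.cos(w * i * dt) for i, c in enumerate(centered))
        phase = math.atan2(co, s)
        return amplitude, frequency, phase
    ```

    In the paper's setting these three properties then drive animation parameters directly, e.g., amplitude to stride length, frequency to step frequency, and phase to progression through the gait cycle.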


Submit a story:

If you would like to submit a story about this presentation, please contact us: historyarchives@siggraph.org