“Pumping Life: Embodied Virtual Companion for Enhancing Immersive Experience with Multisensory Feedback” by Huang, Hung, Hsu, Liao and Han – ACM SIGGRAPH HISTORY ARCHIVES

  • 2019 SA VR_Huang_Pumping Life-Virtual Companion with Multisensory Feedback

Conference:

    SIGGRAPH Asia 2019

Experience Type(s):

    Virtual Reality

Title:


    Pumping Life: Embodied Virtual Companion for Enhancing Immersive Experience with Multisensory Feedback

Presenter(s):

    Huang, Hung, Hsu, Liao and Han


Description:


    With the advance of virtual reality (VR) head-mounted displays, the appearance of a virtual companion can be more realistic and full of vitality, with features such as breathing and facial expressions. However, users cannot interact physically with these companions because they do not have a physical body. In this work, our goal is to give the virtual companion multisensory feedback in VR, allowing users to play with it physically in the immersive environment. We present Pumping Life, a dynamic flow system that enhances a virtual companion with multisensory feedback, using water pumps and a heater to provide shape deformation and thermal feedback. To demonstrate interactive gameplay with our system, we deploy it in a teddy bear and design a VR role-playing game, in which the player collaborates with the teddy bear to complete a mission while perceiving the bear's vitality and expression through multiple tactile sensations.
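    The description above outlines pump-driven breathing and heater-based warmth. The authors' control code is not part of this record; as a purely illustrative sketch (hypothetical parameter values and function names, not the published implementation), such actuation could be driven by a periodic pump duty cycle and a simple thermostat:

    ```python
    import math

    # Hypothetical sketch: rhythmic "breathing" via pump duty cycle,
    # plus bang-bang control of a heater for body-like warmth.
    # Constants are illustrative, not taken from the paper.
    BREATH_PERIOD_S = 4.0   # one inhale/exhale cycle, in seconds
    TARGET_TEMP_C = 36.5    # target surface temperature

    def pump_duty(t: float) -> float:
        """Sinusoidal pump duty in [0, 1]: inflate on inhale, deflate on exhale."""
        phase = 2 * math.pi * t / BREATH_PERIOD_S
        return 0.5 * (1 + math.sin(phase))

    def heater_on(current_temp_c: float, hysteresis_c: float = 0.5) -> bool:
        """Switch the heater on only while below the target, with hysteresis."""
        return current_temp_c < TARGET_TEMP_C - hysteresis_c

    if __name__ == "__main__":
        # Sample one breathing cycle at 1 s intervals.
        for t in (0.0, 1.0, 2.0, 3.0):
            print(f"t={t:.0f}s pump={pump_duty(t):.2f} heat={heater_on(30.0)}")
    ```

    A real deployment would map the duty value onto PWM for the pump and read the temperature from a sensor inside the plush body; the sketch only shows the control shape, not the hardware interface.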

References:


    [1] Yuki Ban, Hiroyuki Karasawa, Rui Fukui, and Shin’ichi Warisawa. 2018. Relaxushion: controlling the rhythm of breathing for relaxation by overwriting somatic sensation. In SIGGRAPH Asia 2018 Emerging Technologies. ACM, 10.
    [2] Lung-Pan Cheng, Li Chang, Sebastian Marwecki, and Patrick Baudisch. 2018. iTurk: Turning Passive Haptics into Active Haptics by Making Users Reconfigure Props in Virtual Reality. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems. ACM, 89.
    [3] Lung-Pan Cheng, Sebastian Marwecki, and Patrick Baudisch. 2017. Mutual Human Actuation. In Proceedings of the 30th Annual ACM Symposium on User Interface Software and Technology. ACM, 797–805.
    [4] Ping-Hsuan Han, Yang-Sheng Chen, Kong-Chang Lee, Hao-Cheng Wang, Chiao-En Hsieh, Jui-Chun Hsiao, Chien-Hsing Chou, and Yi-Ping Hung. 2018. Haptic around: multiple tactile sensations for immersive environment and interaction in virtual reality. In Proceedings of the 24th ACM Symposium on Virtual Reality Software and Technology. ACM, 35.
    [5] Rex Hsieh, Yuya Mochizuki, Takaya Asano, Marika Higashida, and Akihiko Shirai. 2017. “Real baby – real family”: VR entertainment baby interaction system. In ACM SIGGRAPH 2017 Emerging Technologies. ACM, 20.
    [6] Ken Nakagaki, Artem Dementyev, Sean Follmer, Joseph A Paradiso, and Hiroshi Ishii. 2016. Chainform: A linear integrated modular hardware system for shape changing interfaces. In Proceedings of the 29th Annual Symposium on User Interface Software and Technology. ACM, 87–96.



Submit a story:

If you would like to submit a story about this experience or presentation, please contact us: historyarchives@siggraph.org