“Lotus: Enhancing the Immersive Experience in Virtual Environment with Mist-based Olfactory Display” – ACM SIGGRAPH HISTORY ARCHIVES

Title:


    Lotus: Enhancing the Immersive Experience in Virtual Environment with Mist-based Olfactory Display

Description:


    With advances in virtual reality (VR) headsets and haptic technologies, users can enjoy highly immersive experiences in virtual environments (VEs). Although much research has enabled users to perceive visual, auditory, and haptic feedback in immersive VR, far less work allows users to simultaneously perceive odors from the VE. Among the five senses, olfaction is the one that perceives chemical information from the environment, which makes it important for recreating a VE. Several research groups have demonstrated olfactory display techniques. However, delivering olfactory feedback in immersive VR while the user moves around the tracking area requires either a steerable display or a lightweight portable device, because the user's nose is the only receptor for perceiving scent. Our main concept is therefore a steerable mist-based olfactory display combined with an airflow guiding module mounted below the HMD, which allows the user to face different directions without carrying heavy scent liquid. We present Lotus, a mist-based olfactory display with an airflow guiding module that can simulate two kinds of scented VEs simultaneously, enhancing the immersive experience.
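    A steerable display like the one described must continuously aim its mist nozzle at the user's nose as the tracked HMD moves around the play area. The sketch below is only an illustration of that aiming step, not the authors' implementation: the function name, coordinate convention (y up, z forward), and pose values are all assumptions.

    ```python
    import math

    def nozzle_yaw_pitch(nozzle_pos, hmd_pos):
        """Hypothetical aiming helper: yaw/pitch (degrees) that point a fixed
        nozzle at the tracked HMD position. Assumes y-up, z-forward coordinates."""
        dx = hmd_pos[0] - nozzle_pos[0]
        dy = hmd_pos[1] - nozzle_pos[1]
        dz = hmd_pos[2] - nozzle_pos[2]
        yaw = math.degrees(math.atan2(dx, dz))                 # rotation about the vertical axis
        pitch = math.degrees(math.atan2(dy, math.hypot(dx, dz)))  # elevation toward the nose
        return yaw, pitch

    # Example: nozzle on the floor at the origin, HMD 1.5 m up and 2 m ahead.
    yaw, pitch = nozzle_yaw_pitch((0.0, 0.0, 0.0), (0.0, 1.5, 2.0))
    ```

    In a real system these angles would drive pan/tilt servos each tracking frame, with the airflow guiding module handling the final centimeters of delivery below the HMD.
    
    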
