“Virtual Whiskers: Cheek Haptic-Based Spatial Directional Guidance in a Virtual Space” by Nakamura, Verhulst, Sakurada, Fukuoka and Sugimoto – ACM SIGGRAPH HISTORY ARCHIVES

  • 2021 SA VR_Nakamura_Virtual Whiskers-Cheek Haptic-Based Guidance

Conference:


    SIGGRAPH Asia 2021
Experience Type(s):


    Virtual Reality
Title:


    Virtual Whiskers: Cheek Haptic-Based Spatial Directional Guidance in a Virtual Space

Presenter(s):


    Nakamura, Verhulst, Sakurada, Fukuoka, Sugimoto

Description:


    In spatial navigation, adding haptic cues to visual information helps users understand spatial information better. Most haptic devices stimulate various body parts, but few target the head, even though it is sensitive to mechanical stimuli. This paper presents Virtual Whiskers, a spatial directional guidance technique that uses cheek haptics in a virtual space. We built a cheek haptic stimulation device by attaching two small robot arms to a head-mounted display. The robot arms trace the cheek with proximity sensors to estimate the cheek surface. A target's azimuthal and elevational directions are translated into a point on the estimated cheek surface, and the robot arms touch that point to present the directional cue. We demonstrate the technique in two applications.
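    The direction-to-point mapping described above can be sketched in a few lines. This is not the authors' implementation; it is a minimal illustration that assumes the azimuth/elevation ranges, the bilinear cheek patch, and all function names, and stands in for the real proximity-sensor surface estimate.

    ```python
    import math

    # Assumed angular ranges for the target direction (radians).
    AZ_RANGE = (-math.pi, math.pi)
    EL_RANGE = (-math.pi / 2, math.pi / 2)

    def _normalize(value, lo, hi):
        """Clamp value to [lo, hi] and map it linearly to [0, 1]."""
        value = max(lo, min(hi, value))
        return (value - lo) / (hi - lo)

    def direction_to_cheek_point(azimuth, elevation, patch_corners):
        """Return a 3D contact point on a cheek patch for a target direction.

        patch_corners: four 3D points (p00, p10, p01, p11), e.g. estimated
        from a proximity-sensor trace of the cheek; bilinear interpolation
        stands in here for the real cheek-surface estimate.
        """
        u = _normalize(azimuth, *AZ_RANGE)      # left-right on the cheek
        v = _normalize(elevation, *EL_RANGE)    # up-down on the cheek
        p00, p10, p01, p11 = patch_corners
        # Bilinear blend of the four corner points at (u, v).
        return tuple(
            (1 - u) * (1 - v) * a + u * (1 - v) * b
            + (1 - u) * v * c + u * v * d
            for a, b, c, d in zip(p00, p10, p01, p11)
        )
    ```

    For example, a target straight ahead (azimuth 0, elevation 0) maps to the center of the patch, and out-of-range angles are clamped to the patch border.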

References:


    [1] Cristy Ho and Charles Spence. 2007. Head orientation biases tactile localization. Brain Research 1144 (2007), 136–141.
    [2] Alexander Wilberz, Dominik Leschtschow, Christina Trepkowski, Jens Maiero, Ernst Kruijff, and Bernhard Riecke. 2020. FaceHaptics: Robot Arm Based Versatile Facial Haptics for Immersive Environments. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems. 1–14.


ACM Digital Library Publication:


Overview Page:



Submit a story:

If you would like to submit a story about this experience or presentation, please contact us: historyarchives@siggraph.org