“Head Gaze Target Selection for Redirected Interaction” by Matthews and Smith – ACM SIGGRAPH HISTORY ARCHIVES


  • 2019 SA VR_Matthews_Head Gaze Target Selection for Redirected Interaction

Conference:


    SIGGRAPH Asia 2019

Experience Type(s):


    VR

Title:


    Head Gaze Target Selection for Redirected Interaction

Presenter(s):


    Matthews and Smith

Description:


    Haptic interaction in virtual reality poses an ongoing research challenge: how should touch sensations be delivered in applications? A simple solution is to give virtual objects matching physical counterparts, e.g. a physical switch provides perfect haptic feedback for a virtual switch with matching geometry. Redirection illusions take this further, allowing many virtual objects to be mapped to one physical object. In many systems that utilise redirection, however, the interaction sequence is predetermined. This prevents users from selecting their own targets, and a reset action is required to provide an origin for the redirection. This paper overcomes these limitations with a novel application of head gaze that enables users to determine their own sequence of interactions with a remapped physical-virtual interface. We also introduce a technique that provides an optimal mapping between physical and virtual components using multiple physical targets, removing the need for the reset action.
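    The abstract combines two building blocks that are standard in this literature: selecting a virtual target by head gaze (a cone test against the head's forward ray) and a body-warping style of hand redirection, in which the rendered hand is offset toward the chosen virtual target in proportion to the real hand's progress toward the shared physical prop (cf. Azmandian et al. [1]). The sketch below is a minimal illustration under those assumptions, not the authors' implementation; the function names, the cone threshold, and the linear interpolation scheme are hypothetical.

    ```python
    import numpy as np

    def gaze_target(head_pos, head_forward, targets, max_angle_deg=10.0):
        """Return the index of the virtual target closest to the head-gaze
        ray, or None if no target falls inside the selection cone."""
        best, best_angle = None, np.radians(max_angle_deg)
        f = head_forward / np.linalg.norm(head_forward)
        for i, t in enumerate(targets):
            d = t - head_pos
            d = d / np.linalg.norm(d)
            angle = np.arccos(np.clip(np.dot(f, d), -1.0, 1.0))
            if angle < best_angle:
                best, best_angle = i, angle
        return best

    def retargeted_hand(real_hand, origin, physical_target, virtual_target):
        """Linearly warp the rendered hand position so that reaching the
        physical prop coincides with touching the selected virtual target.
        At the origin the offset is zero; at the prop it is the full
        physical-to-virtual displacement."""
        total = np.linalg.norm(physical_target - origin)
        progress = np.clip(np.linalg.norm(real_hand - origin) / total, 0.0, 1.0)
        offset = virtual_target - physical_target
        return real_hand + progress * offset
    ```

    Using head gaze for the first step is what removes the predetermined sequence: whichever virtual target the user looks at becomes the warp destination, and the current hand position can serve as the redirection origin in place of an explicit reset action.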

References:


    [1] Mahdi Azmandian, Mark Hancock, Hrvoje Benko, Eyal Ofek, and Andrew D. Wilson. 2016. Haptic Retargeting: Dynamic Repurposing of Passive Haptics for Enhanced Virtual Reality Experiences. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems – CHI ’16. ACM Press, 1968–1979. https://doi.org/10.1145/2858036.2858226
    [2] Lung-Pan Cheng, Eyal Ofek, Christian Holz, Hrvoje Benko, and Andrew D. Wilson. 2017. Sparse Haptic Proxy: Touch Feedback in Virtual Environments Using a General Passive Prop. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems – CHI ’17. ACM Press, 3718–3728. https://doi.org/10.1145/3025453.3025753
    [3] Robert J. K. Jacob. 1990. What You Look at is What You Get: Eye Movement-based Interaction Techniques. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ’90). ACM, New York, NY, USA, 11–18. https://doi.org/10.1145/97243.97246
    [4] Luv Kohli. 2013. Redirected Touching. Ph.D. Dissertation. University of North Carolina at Chapel Hill. https://dl.acm.org/citation.cfm?id=2519692
    [5] B. J. Matthews, B. H. Thomas, S. Von Itzstein, and R. T. Smith. 2019. Remapped Physical-Virtual Interfaces with Bimanual Haptic Retargeting. In 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR). 19–27. https://doi.org/10.1109/VR.2019.8797974
    [6] Yuan Yuan Qian and Robert J. Teather. 2017. The Eyes Don’t Have It: An Empirical Comparison of Head-based and Eye-based Selection in Virtual Reality. In Proceedings of the 5th Symposium on Spatial User Interaction (SUI ’17). ACM, 91–98. https://doi.org/10.1145/3131277.3132182
    [7] A. Zenner and A. Krüger. 2019. Estimating Detection Thresholds for Desktop-Scale Hand Redirection in Virtual Reality. In 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR). 47–55. https://doi.org/10.1109/VR.2019.8798143


ACM Digital Library Publication:


Overview Page:



Submit a story:

If you would like to submit a story about this experience or presentation, please contact us: historyarchives@siggraph.org