“MRTouch: Adding Touch Input to Head-Mounted Mixed Reality” by Xiao, Schwarz, Throm, Wilson and Benko


Conference:


Type(s):


Title:

    MRTouch: Adding Touch Input to Head-Mounted Mixed Reality

Session/Category Title:   IEEE TVCG Session on Virtual and Augmented Reality


Presenter(s)/Author(s):

    Xiao, Schwarz, Throm, Wilson and Benko


Abstract:


    We present MRTouch, a novel multitouch input solution for head-mounted mixed reality systems. Our system enables users to reach out and directly manipulate virtual interfaces affixed to surfaces in their environment, as though they were touchscreens. Touch input offers precise, tactile and comfortable user input, and naturally complements existing popular modalities, such as voice and hand gesture. Our research prototype combines depth and infrared camera streams together with real-time detection and tracking of surface planes to enable robust finger-tracking even when both the hand and head are in motion. Our technique is implemented on a commercial Microsoft HoloLens without requiring any additional hardware or any user or environmental calibration. Through our performance evaluation, we demonstrate high input accuracy with an average positional error of 5.4 mm and 95% button size of 16 mm, across 17 participants, 2 surface orientations and 4 surface materials. Finally, we demonstrate the potential of our technique to enable on-world touch interactions through 5 example applications.
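
    To make the pipeline described above concrete, the sketch below illustrates one plausible building block: fitting a surface plane to depth points with a RANSAC procedure (in the spirit of Fischler and Bolles [17]) and reporting a touch when a tracked fingertip comes within a small hover threshold of that plane. It is not the authors' implementation; the function names, thresholds and use of NumPy are assumptions made purely for illustration.

    # Illustrative sketch only (not the MRTouch implementation): fit a surface
    # plane to depth points with RANSAC, then report a touch when a fingertip
    # lies within a small hover threshold of that plane. Names and thresholds
    # are assumptions for illustration.
    import numpy as np

    def fit_plane_ransac(points, iters=200, inlier_dist=0.005, rng=None):
        """Fit a plane (unit normal n, offset d, with n·x + d = 0) to Nx3 points."""
        rng = rng if rng is not None else np.random.default_rng(0)
        best_inliers, best_plane = 0, None
        for _ in range(iters):
            sample = points[rng.choice(len(points), 3, replace=False)]
            n = np.cross(sample[1] - sample[0], sample[2] - sample[0])
            norm = np.linalg.norm(n)
            if norm < 1e-9:
                continue  # degenerate (collinear) sample; try again
            n /= norm
            d = -np.dot(n, sample[0])
            inliers = np.count_nonzero(np.abs(points @ n + d) < inlier_dist)
            if inliers > best_inliers:
                best_inliers, best_plane = inliers, (n, d)
        return best_plane

    def is_touch(fingertip, plane, touch_threshold=0.010):
        """True when the fingertip (x, y, z in metres) is within touch_threshold of the plane."""
        n, d = plane
        return abs(np.dot(n, fingertip) + d) < touch_threshold

    # Synthetic usage example: noisy points on the plane z = 0, fingertip 5 mm above it.
    rng = np.random.default_rng(1)
    surface = np.column_stack([rng.uniform(-0.5, 0.5, (1000, 2)),
                               rng.normal(0.0, 0.002, 1000)])
    plane = fit_plane_ransac(surface, rng=rng)
    print(is_touch(np.array([0.10, 0.10, 0.005]), plane))  # expected: True

    The actual system additionally fuses the infrared stream with depth to keep finger tracking robust while both the hand and head move; that fusion step is omitted from the sketch for brevity.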

References:


    [1]
    M. Adcock, M. Hutchins and C. Gunn. “Haptic Collaboration with Augmented Reality,” ACM SIGGRAPH 2004 Posters (SIGGRAPH ’04), p. 41, 2004.

    [2]
    B. Araujo, R. Jota, V. Perumal, J.X. Yao, K. Singh and D. Wigdor. “Snake Charmer: Physically Enabling Virtual Objects,” Proc. 10th Int. Conf. Tangible, Embedded and Embodied Interaction (TEI ’16), pp. 218–226, 2016.

    [3]
    M. Azmandian, M. Hancock, H. Benko, E. Ofek and A.D. Wilson. “Haptic Retargeting: Dynamic Repurposing of Passive Haptics for Enhanced Virtual Reality Experiences,” Proc. SIGCHI Conf. Human Factors in Computing Systems (CHI ’16), pp. 1968–1979, 2016.

    [4]
    M. Bâce, T. Leppänen, D.G. de Gomez and A.R. Gomez. “ubiGaze: ubiquitous augmented reality messaging using gaze gestures,” Proc. SIGGRAPH ASIA 2016 Mobile Graphics and Interactive Applications (SA ’16), Article 11, 5 pages, 2016.

    [5]
    H. Benko, C. Holz, M. Sinclair and E. Ofek. “NormalTouch and TextureTouch: High-fidelity 3D Haptic Shape Rendering on Handheld Virtual Reality Controllers,” Proc. 29th Ann. Symp. User Interface Software and Technology (UIST ’16), pp. 717–728, 2016.

    [6]
    H. Benko, E.W. Ishak and S. Feiner. “Cross-Dimensional Gestural Interaction Techniques for Hybrid Immersive Environments,” Proc. IEEE Virtual Reality (VR 2005), pp. 209–216, 2005.

    [7]
    M. Billinghurst, H. Kato and I. Poupyrev. “Collaboration with tangible augmented reality interfaces,” Proc. HCI Int., pp. 234–241, 2004.

    [8]
    R.A. Bolt. “Put-that-there: Voice and gesture at the graphics interface,” Proc. 7th Ann. Conf. Computer Graphics and Interactive Techniques (SIGGRAPH ’80), pp. 262–270, 1980.

    [9]
    G. Burdea, J. Zhuang, E. Roskos, D. Silver and N. Langrana. “A portable dextrous master with force feedback,” Presence: Teleoper. Virtual Environ., Volume 1, Issue 1 (1992), pp. 18–28.

    [10]
    M.C. Cabral, C.H. Morimoto and M.K. Zuffo. “On the usability of gesture interfaces in virtual reality environments,” Proc. Lat. Am. Conf. Human-Computer Interaction (CLIHC ’05), pp. 100–108, 2005.

    [11]
    J. Canny. “A Computational Approach to Edge Detection,” IEEE Trans. Pattern Anal. Mach. Intell., Volume 8, Issue 6 (1986), pp. 679–698.

    [12]
    T. Carter, S.A. Seah, B. Long, B. Drinkwater and S. Subramanian. “UltraHaptics: multi-point mid-air haptic feedback for touch surfaces,” Proc. 26th Ann. Symp. User Interface Software and Technology (UIST ’13), pp. 505–514, 2013.

    [13]
    J.S. Chang, E.Y. Kim, K.C. Jung and H.J. Kim. “Real time hand tracking based on active contour model,” Proc. Int. Conf. Computational Science and Applications (ICCSA ’05), pp. 999–1006, 2005.

    [14]
    I. Chatterjee, R. Xiao and C. Harrison. “Gaze+Gesture: Expressive, Precise and Targeted Free-Space Interactions,” Proc. ACM Int. Conf. Multimodal Interaction (ICMI ’15), pp. 131–138, 2015.

    [15]
    X. Chen, J. Schwarz, C. Harrison, J. Mankoff and S.E. Hudson. “Air+touch: interweaving touch & in-air gestures,” Proc. 27th Ann. Symp. User Interface Software and Technology (UIST ’14), pp. 519–525, 2014.

    [16]
    K. Dorfmüller-Ulhaas and D. Schmalstieg. “Finger tracking for interaction in augmented environments,” Proc. IEEE and ACM Int. Symp. Augmented Reality, pp. 55–64, 2001.

    [17]
    M.A. Fischler and R.C. Bolles. “Random sample consensus: a paradigm for model fitting with applications to image analysis and automated cartography,” Commun. ACM, Volume 24, Issue 6 (1981), pp. 381–395.

    [18]
    T.B. Fitzpatrick. “Soleil et peau,” J. Med. Esthet., Volume 2, Issue 7 (1975), pp. 33–34.

    [19]
    J. Gugenheimer, D. Dobbelstein, C. Winkler, G. Haas and E. Rukzio. “FaceTouch: Enabling Touch Interaction in Display Fixed UIs for Mobile Virtual Reality,” Proc. 29th Ann. Symp. User Interface Software and Technology (UIST ’16), pp. 49–60, 2016.

    [20]
    M. Hachet, B. Bossavit, A. Cohé and J-B. de la Rivière. “Toucheo: multitouch and stereo combined in a seamless workspace,” Proc. 24th Ann. Symp. User Interface Software and Technology (UIST ’11), pp. 587–592, 2011.

    [21]
    J.Y. Han. “Low-cost multi-touch sensing through frustrated total internal reflection,” Proc. 18th Ann. Symp. User Interface Software and Technology (UIST ’05), pp. 115–118, 2005.

    [22]
    C. Harrison, H. Benko and A.D. Wilson. “OmniTouch: wearable multitouch interaction everywhere,” Proc. 24th Ann. Symp. User Interface Software and Technology (UIST ’11), pp. 441–450, 2011.

    [23]
    A. Hettiarachchi and D. Wigdor. “Annexing Reality: Enabling Opportunistic Use of Everyday Objects as Tangible Proxies in Augmented Reality,” Proc. SIGCHI Conf. Human Factors in Computing Systems (CHI ’16), pp. 1957–1967, 2016.

    [24]
    J.D. Hincapié-Ramos, X. Guo, P. Moghadasian and P. Irani. “Consumed endurance: a metric to quantify arm fatigue of mid-air interactions,” Proc. SIGCHI Conf. Human Factors in Computing Systems (CHI ’14), pp. 1063–1072, 2014.

    [25]
    K. Hinckley, S. Heo, M. Pahud, C. Holz, H. Benko, A. Sellen, R. Banks, K. O’Hara, G. Smyth and W. Buxton. “Pre-Touch Sensing for Mobile Interaction,” Proc. SIGCHI Conf. Human Factors in Computing Systems (CHI ’16), pp. 2869–2881, 2016.

    [26]
    C. Holz and P. Baudisch. “The generalized perceived input point model and how to double touch accuracy by extracting fingerprints,” Proc. SIGCHI Conf. Human Factors in Computing Systems (CHI ’10), pp. 581–590, 2010.

    [27]
    HTC Vive Controller. https://www.vive.com/us/accessory/controller/.

    [28]
    W. Hürst and C. van Wezel. “Gesture-based Interaction via Finger Tracking for Mobile Augmented Reality,” Multimedia Tools and Applications, Volume 62, Issue 1, pp. 233–258.

    [29]
    H. Ishii and B. Ullmer. “Tangible bits: towards seamless interfaces between people, bits and atoms,” Proc. SIGCHI Conf. Human Factors in Computing Systems (CHI ’97), pp. 234–241, 1997.

    [30]
    S. Izadi, D. Kim, O. Hilliges, D. Molyneaux, R. Newcombe, P. Kohli, J. Shotton, S. Hodges, D. Freeman, A. Davison and A. Fitzgibbon. “KinectFusion: real-time 3D reconstruction and interaction using a moving depth camera,” Proc. 24th Ann. Symp. User Interface Software and Technology (UIST ’11), pp. 559–568, 2011.

    [31]
    R. Jota, A. Ng, P. Dietz and D. Wigdor. “How fast is fast enough?: a study of the effects of latency in direct-touch pointing tasks,” Proc. SIGCHI Conf. Human Factors in Computing Systems (CHI ’13), pp. 2291–2300, 2013.

    [32]
    H. Koike, Y. Sato and Y. Kobayashi. “Integrating paper and digital information on EnhancedDesk: a method for realtime finger tracking on an augmented desk system,” ACM Trans. Comput.-Hum. Interact., Volume 8, Issue 4 (2001), pp. 307–322.

    [33]
    O. Kreylos. Vrui VR Toolkit. http://idav.ucdavis.edu/~okreylos/ResDev/Vrui.

    [34]
    Leap Motion Mobile VR Platform. https://www.leapmotion.com/product/vr.

    [35]
    S.K. Lee, W. Buxton and K.C. Smith. “A multi-touch three dimensional touch-sensitive tablet,” Proc. SIGCHI Conf. Human Factors in Computing Systems (CHI ’85), pp. 21–25, 1985.

    [36]
    T. Lee and T. Höllerer. “Handy AR: Markerless Inspection of Augmented Reality Objects Using Fingertip Tracking,” Proc. 11th IEEE Int. Symp. Wearable Computers (ISWC ’07), pp. 1–8, 2007.

    [37]
    J. Letessier and F. Bérard. “Visual tracking of bare fingers for interactive surfaces,” Proc. 17th Ann. Symp. User Interface Software and Technology (UIST ’04), pp. 119–122, 2004.

    [38]
    R.W. Lindeman, R. Page, Y. Yanagida and J.L. Sibert. “Towards full-body haptic feedback: the design and deployment of a spatialized vibrotactile feedback system,” Proc. ACM Symp. Virtual Reality Software and Technology (VRST ’04), pp. 146–149, 2004.

    [39]
    W.R. Mark, L. McMillan and G. Bishop. “Post-Rendering 3D Warping,” Proc. Symp. on Interactive 3D Graphics (I3D ’97), pp. 7–16, 1997.

    [40]
    T.H. Massie and J.K. Salisbury. “The PHANToM Haptic Interface: A Device for Probing Virtual Objects,” ASME Winter Annual Meeting, DSC-Vol. 55-1, pp. 295–300, 1994.

    [41]
    D. Medeiros, L. Teixeira, F. Carvalho, I. Santos and A. Raposo. “A tablet-based 3D interaction tool for virtual engineering environments,” Proc. 12th ACM SIGGRAPH Int. Conf. on Virtual-Reality Continuum and Its Applications in Industry (VRCAI ’13), pp. 211–218, 2013.

    [42]
    Kinect hardware. https://developer.microsoft.com/en-us/windows/kinect/hardware.

    [43]
    Microsoft HoloLens. https://www.microsoft.com/microsoft-hololens/.

    [44]
    Mixed Reality - Voice Input. https://developer.microsoft.com/windows/mixed-reality/voice_input.

    [45]
    Use the HoloLens clicker. https://support.microsoft.com/help/12646.

    [46]
    Use your Xbox Wireless Controller on Samsung Gear VR. https://support.xbox.com/en-US/xbox-one/accessories/use-samsung-gear-vr-with-xbox-controller.

    [47]
    A. Ng, J. Lepinski, D. Wigdor, S. Sanders and P. Dietz. “Designing for low-latency direct-touch input,” Proc. 25th Ann. Symp. User Interface Software and Technology (UIST ’12), pp. 453–464, 2012.

    [48]
    Oculus VR. “Asynchronous TimeWarp (ATW).” https://developer.oculus.com/documentation/mobilesdk/latest/concepts/mobile-time-warp-overview/.

    [49]
    J.A. Paradiso, K. Hsiao, J. Strickon, J. Lifton and A. Adler. “Sensor systems for interactive surfaces,” IBM Syst. J., Volume 39, Issue 3–4 (2000), pp. 892–914.

    [50]
    J.A. Paradiso, C.K. Leo, N. Checka and K. Hsiao. “Passive acoustic sensing for tracking knocks atop large interactive displays,” Proc. IEEE Sensors ’02, pp. 521–527, 2002.

    [51]
    H.M. Park, S.H. Lee and J.S. Choi. “Wearable augmented reality system using gaze interaction,” Proc. IEEE/ACM Int. Symp. Mixed and Augmented Reality (ISMAR ’08), pp. 175–176, 2008.

    [52]
    E.N. Saba, E.C. Larson and S.N. Patel. “Dante vision: In-air and touch gesture sensing for natural surface interaction with combined depth and thermal cameras,” Proc. IEEE Emerging Signal Processing Applications (ESPA ’12), pp. 167–170, 2012.

    [53]
    Gear VR: About the Touchpad. http://www.samsung.com/au/support/skp/faq/1073201.

    [54]
    R. Sodhi, I. Poupyrev, M. Glisson and A. Israr. “AIREAL: interactive tactile experiences in free air,” ACM Trans. Graph., Volume 32, Issue 4, Article 134 (2013), 10 pages.

    [55]
    J.A. Walsh, S. von Itzstein and B.H. Thomas. “Ephemeral Interaction using Everyday Objects,” Proc. 15th Australasian User Interface Conference (AUIC ’14), pp. 29–37, 2014.

    [56]
    D. Wei, S.Z. Zhou and D. Xie. “MTMR: A conceptual interior design framework integrating Mixed Reality with the Multi-Touch tabletop interface,” Proc. IEEE Int. Symp. Mixed and Augmented Reality (ISMAR ’10), pp. 279–280, 2010.

    [57]
    A.D. Wilson. “Depth sensing video cameras for 3D tangible tabletop interaction,” Proc. 2nd Int. Wkshp. Horizontal Interactive Human-Computer Systems (Tabletop ’07), pp. 201–204, 2007.

    [58]
    R. Xiao, C. Harrison and S.E. Hudson. “WorldKit: rapid and easy creation of ad-hoc interactive applications on everyday surfaces,” Proc. SIGCHI Conf. Human Factors in Computing Systems (CHI ’13), pp. 879–888, 2013.

    [59]
    R. Xiao, S.E. Hudson and C. Harrison. “DIRECT: Making Touch Tracking on Ordinary Surfaces Practical with Hybrid Depth-Infrared Sensing,” Proc. 2016 ACM Int. Conf. Interactive Surfaces and Spaces (ISS ’16), pp. 85–94, 2016.

    [60]
    R. Xiao, G. Lew, J. Marsanico, D. Hariharan, S.E. Hudson and C. Harrison. “Toffee: enabling ad hoc, around-device interaction with acoustic time-of-arrival correlation,” Proc. 16th Int. Conf. Human-Computer Interaction with Mobile Devices & Services (MobileHCI ’14), pp. 67–76, 2014.

    [61]
    F. Zhou, H.B-L. Duh and M. Billinghurst. “Trends in augmented reality tracking, interaction and display: A review of ten years of ISMAR,” Proc. 7th IEEE/ACM Int. Symp. Mixed and Augmented Reality (ISMAR ’08), pp. 193–202, 2008.


ACM Digital Library Publication:



Overview Page:



Submit a story:

If you would like to submit a story about this presentation, please contact us: historyarchives@siggraph.org