“A Brain-computer Interface for Extended Reality Interfaces” by Jantz, Molnar and Alcaide


Entry Number: 03

Title:


    A Brain-computer Interface for Extended Reality Interfaces

Presenter(s):


    Jantz, Molnar, and Alcaide

Description:


    Extended reality (XR) technologies, such as augmented reality (AR) and virtual reality (VR), remain limited in their interaction modalities. Prevailing input methods such as hand gestures and voice recognition prove awkward in XR environments, even for common tasks (e.g., object selection and menu navigation). An ideal interaction method would instead robustly and naturally translate a user’s intention into both 2D and 3D environmental controls. A direct brain-computer interface (BCI) is well suited to this role. Neurable’s technology provides a solution that maximizes XR’s potential, affording users real-time mental selection via dry electroencephalography (EEG).
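    The abstract does not describe how the mental selection is implemented; Neurable's actual algorithm is not specified here. One common scheme for EEG-based item selection is event-related-potential (ERP) classification: each candidate item is flashed, the EEG epoch following each flash is recorded, and the item whose averaged epochs show the strongest response in the roughly 250–500 ms (P300) window is selected. The sketch below illustrates only that general idea on synthetic single-channel data; every name, rate, and amplitude is an assumption for illustration.

    ```python
    import random
    import statistics

    # Illustrative ERP-style selection on synthetic data; not Neurable's method.
    random.seed(0)

    FS = 250                                   # sampling rate in Hz (typical for dry EEG)
    N_SAMPLES = int(FS * 0.8)                  # 0.8 s epoch after each stimulus flash
    P300_WINDOW = range(int(0.25 * FS), int(0.50 * FS))  # samples in ~250-500 ms

    def synth_epoch(is_target: bool) -> list[float]:
        """One single-channel epoch: Gaussian noise, plus a positive
        deflection in the P300 window when the flashed item was attended."""
        epoch = [random.gauss(0.0, 1.0) for _ in range(N_SAMPLES)]
        if is_target:
            for i in P300_WINDOW:
                epoch[i] += 2.0
        return epoch

    def select_item(epochs_per_item: dict[str, list[list[float]]]) -> str:
        """Average each item's epochs sample-by-sample, then pick the item
        with the largest mean amplitude inside the P300 window."""
        def p300_score(epochs: list[list[float]]) -> float:
            averaged = [statistics.fmean(samples) for samples in zip(*epochs)]
            return statistics.fmean(averaged[i] for i in P300_WINDOW)
        return max(epochs_per_item, key=lambda k: p300_score(epochs_per_item[k]))

    # Simulate a three-item XR menu where the user attends to "open".
    items = ["open", "close", "back"]
    trials = {item: [synth_epoch(item == "open") for _ in range(10)]
              for item in items}
    print(select_item(trials))  # prints "open"
    ```

    Averaging across repeated flashes is what makes this robust: the ERP deflection accumulates while the background EEG noise averages toward zero.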

References:


    1. Amazon.com. 2017. Alexa. (2017). https://developer.amazon.com/alexa
    2. Nicola Bizzotto, Alessandro Costanzo, Leonardo Bizzotto, Dario Regis, Andrea Sandri, and Bruno Magnan. 2014. Leap Motion Gesture Control With OsiriX in the Operating Room to Control Imaging. Surgical Innovation 21, 6 (2014), 655–656.
    3. Facebook. 2017. Oculus Rift. (2017). https://www.oculus.com/
    4. FOVE. 2017. FOVE One. (2017). https://www.getfove.com/
    5. Chris Grayson. 2016. Waveguides + Eye-Tracking + EEG. giganti.co (2016). http://www.giganti.co/DigiLens
    6. Frank Honold, Pascal Bercher, Felix Richter, Florian Nothdurft, Thomas Geier, Roland Barth, Thilo Hörnle, Felix Schüssel, Stephan Reuter, and Matthias Rau. 2014. Companion-technology: towards user-and situation-adaptive functionality of technical systems. Intelligent Environments (IE), 2014 International Conference on (2014), 378–381. 
    7. HTC. 2017. Vive. (2017). https://www.vive.com/us/
    8. Robert J K Jacob. 1995. Eye Tracking in Advanced Interface Design. Virtual Environments and Advanced Interface Design (1995), 258–290. 
    9. Robert J K Jacob and Keith S. Karn. 2003. Eye Tracking in Human-Computer Interaction and Usability Research: Ready to Deliver the Promises. The Mind’s Eye: Cognitive and Applied Aspects of Eye Movement Research (2003), 531–553.
    10. Microsoft. 2017. Hololens. (2017). https://www.microsoft.com/en-us/hololens
    11. Reinhold Scherer, Mike Chung, Johnathan Lyon, Willy Cheung, and Rajesh P. N. Rao. 2010. Interaction with Virtual and Augmented Reality Environments using Non-Invasive Brain-Computer Interfacing. 2010 1st International Conference on Applied Bionics and Biomechanics (ICABB-2010), Venice, Italy, October 14–16, 2010 (2010).
    12. Piotr Stawicki, Felix Gembler, Aya Rezeika, and Ivan Volosyak. 2017. A novel hybrid mental spelling application based on eye tracking and SSVEP-based BCI. Brain Sciences 7, 4 (2017).
    13. Chih-Hsiang Yu, Wen-Wei Peng, Shys-Fan Yang-Mao, Yuan Wang, Winyu Chinthammit, and Henry Been-Lirn Duh. 2015. A hand gesture control framework on smart glasses. SIGGRAPH Asia 2015 Mobile Graphics and Interactive Applications (SA ’15) (2015), 1–1.
