“Nachtalb: A multisensory Neurofeedback VR-Interface” by Morat, Schwerdtfeger and Heidmann

  • © Paul Morat, Aaron Schwerdtfeger, and Frank Heidmann

Entry Number: 14

Title:


    Nachtalb: A multisensory Neurofeedback VR-Interface

Program Title:


    Immersive Pavilion

Presenter(s):

    Paul Morat, Aaron Schwerdtfeger, and Frank Heidmann

Description:


    Nachtalb is an immersive interface that enables brain-to-brain interaction through multisensory feedback. Using the g.tec Unicorn Hybrid Black brain-computer interface (BCI), brain-activity data is measured and rendered visually on the Oculus Quest 2, tactilely through the bHaptics TactSuit, and auditorily via 3D sound. The aim is a closed feedback loop: brain activity enters the system as data input, becomes sensory output, and that output in turn influences the brain activity being measured.
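
    To make the loop concrete, the sketch below shows one minimal way such a neurofeedback pipeline could be wired up in Python. It assumes the Unicorn's EEG signal is exposed as a Lab Streaming Layer (LSL) stream and that alpha-band power drives the feedback intensity; the send_to_vr and send_to_haptics handlers are hypothetical stubs standing in for the Quest 2 and TactSuit outputs, and the normalization is an illustrative choice, not the authors' actual mapping.

        # Minimal closed-loop neurofeedback sketch (assumptions: EEG arrives
        # via LSL; alpha-band power drives feedback; output handlers are stubs).
        import numpy as np
        from pylsl import StreamInlet, resolve_stream  # pip install pylsl

        FS = 250              # Unicorn Hybrid Black sampling rate (Hz)
        WINDOW = FS * 2       # 2-second analysis window
        ALPHA = (8.0, 12.0)   # alpha band (Hz)

        def band_power(signal, fs, lo, hi):
            """Mean spectral power of one channel within [lo, hi] Hz."""
            spectrum = np.abs(np.fft.rfft(signal)) ** 2
            freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
            return spectrum[(freqs >= lo) & (freqs <= hi)].mean()

        def send_to_vr(intensity):       # hypothetical stub: Quest 2 visuals
            print(f"visual intensity: {intensity:.2f}")

        def send_to_haptics(intensity):  # hypothetical stub: TactSuit motors
            print(f"haptic intensity: {intensity:.2f}")

        streams = resolve_stream('type', 'EEG')  # find the EEG stream
        inlet = StreamInlet(streams[0])
        buf = []
        while True:
            sample, _ = inlet.pull_sample()       # one multi-channel sample
            buf.append(sample)
            if len(buf) >= WINDOW:
                data = np.asarray(buf[-WINDOW:])  # shape: (WINDOW, n_channels)
                # Average alpha power across channels, squashed to 0..1.
                power = np.mean([band_power(data[:, ch], FS, *ALPHA)
                                 for ch in range(data.shape[1])])
                intensity = float(np.tanh(power / 1e3))  # ad-hoc scaling
                send_to_vr(intensity)       # sensory output closes the loop:
                send_to_haptics(intensity)  # it shapes the next EEG input
                buf = buf[FS:]              # slide the window by one second

    Keeping the acquisition, analysis, and rendering stages decoupled behind a stream interface like this matches the multi-device setup described above, where headset, haptic suit, and spatial audio all consume the same brain-activity signal.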

References:


    1. Emanuele Argento, George Papagiannakis, Eva Baka, Michail Maniadakis, Panos Trahanias, Michael Sfakianakis, and Ioannis Nestoros. 2017. Augmented Cognition via Brainwave Entrainment in Virtual Reality: An Open, Integrated Brain Augmentation in a Neuroscience System Approach. Augmented Human Research 2 (2017), 1–14.
    2. Christian Breitwieser, Vera Kaiser, Christa Neuper, and Gernot R Müller-Putz. 2012. Stability and distribution of steady-state somatosensory evoked potentials elicited by vibro-tactile stimulation. Medical and Biological Engineering and Computing 50, 4 (2012), 347–357.
    3. Zitong Chen, Jing Liao, Jianqiao Chen, Chuyi Zhou, Fangbing Chai, Yang Wu, and Preben Hansen. 2021. Paint with Your Mind: Designing EEG-Based Interactive Installation for Traditional Chinese Artworks. In Proceedings of the Fifteenth International Conference on Tangible, Embedded, and Embodied Interaction. Association for Computing Machinery, New York, NY, USA.
    4. Judith Amores Fernandez, Anna Fusté, Robert Richer, and Pattie Maes. 2019. Deep Reality: An Underwater VR Experience to Promote Relaxation by Unconscious HR, EDA, and Brain Activity Biofeedback. In ACM SIGGRAPH 2019 Virtual, Augmented, and Mixed Reality (Los Angeles, California) (SIGGRAPH ’19). Association for Computing Machinery, New York, NY, USA.
    5. Yisi Liu, Olga Sourina, and Minh Khoa Nguyen. 2010. Real-Time EEG-Based Human Emotion Recognition and Visualization. In 2010 International Conference on Cyberworlds. IEEE Press, Piscataway, NJ, 262–269.
    6. Paul L. Nunez and Ramesh Srinivasan. 2006. Electric Fields of the Brain: The Neurophysics of EEG. Oxford University Press, Oxford, UK.
    7. Marc Parenthoen, Fred Murie, and Flavien Thery. 2015. The Sea is Your Mirror. In Proceedings of the 8th ACM SIGGRAPH Conference on Motion in Games (Paris, France) (MIG ’15). Association for Computing Machinery, New York, NY, USA, 159–165.
    8. Roland Sigrist, Georg Rauter, Robert Riener, and Peter Wolf. 2013. Augmented visual, auditory, haptic, and multimodal feedback in motor learning: a review. Psychonomic Bulletin and Review 20, 1 (2013), 21–53.
    9. Ranganatha Sitaram and Tomas Ros. 2017. Closed-loop brain training: the science of neurofeedback. Nature Reviews Neuroscience 18, 2 (2017), 86–100.
    10. Tom De Smedt and Lieven Menschaert. 2012. VALENCE: affective visualisation using EEG. Digital Creativity 23, 3-4 (2012), 272–277.
    11. Nina Sobell. 2002. Streaming the Brain. IEEE MultiMedia 9, 3 (2002), 4–9.
    12. Eleftherios Triantafyllidis, Christopher Mcgreavy, Jiacheng Gu, and Zhibin Li. 2020. Study of Multimodal Interfaces and the Improvements on Teleoperation. IEEE Access 8 (2020), 78213–78227.
    13. Bram van de Laar, Hayrettin Gürkök, Danny Plass-Oude Bos, Mannes Poel, and Anton Nijholt. 2013. Experiencing BCI Control in a Popular Computer Game. IEEE Transactions on Computational Intelligence and AI in Games 5, 2 (2013), 176–184.
