“Walking Balance Assessment With Eye-tracking and Spatial Data Visualization” by Wang, Lubetzky and Perlin

  • ©Zhu Wang, Anat Lubetzky, and Ken Perlin


Entry Number: 10



Program Title:

    Immersive Medicine



    Virtual reality (VR)-based assessment systems can simulate diverse real-life scenarios and help clinicians assess participants’ performance in controlled functional contexts. Our previous work demonstrated an assessment paradigm that provided multi-sensory stimuli and cognitive load, and quantified walking balance with obstacle negotiation via motion capture and pressure sensing. However, two gaps must be filled to make it more clinically relevant: (1) it required complex offline data processing with external statistical analysis software, and (2) it used motion tracking but overlooked eye movement. We therefore present a novel walking balance assessment system with eye tracking, to investigate the role of eye movement in walking balance, and spatial data visualization, to better interpret and understand the experimental data. The spatial visualization includes instantaneous in-situ VR replay of the gaze, head, and feet, as well as data plots of the outcome measures. The system fills a need to provide eye tracking and intuitive real-time feedback in VR to experimenters, clinicians, and participants.
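The abstract does not include implementation details for the in-situ replay, so the following is only a minimal sketch of one plausible approach: buffering time-stamped gaze, head, and foot samples during a trial, then fetching the sample nearest to a requested replay time. All class and field names here are hypothetical, not from the authors’ system.

```python
from bisect import bisect_left
from dataclasses import dataclass
from typing import List, Tuple

Vec3 = Tuple[float, float, float]  # x, y, z in tracking space

@dataclass
class Sample:
    t: float          # timestamp in seconds since trial start
    gaze: Vec3        # gaze direction (or fixation point)
    head: Vec3        # head position
    left_foot: Vec3   # left-foot tracker position
    right_foot: Vec3  # right-foot tracker position

class ReplayBuffer:
    """Records time-stamped tracker samples during a walking trial and
    serves the sample closest to a requested time during VR replay."""

    def __init__(self) -> None:
        self.samples: List[Sample] = []

    def record(self, s: Sample) -> None:
        # Assumes samples arrive in increasing time order (one per frame).
        self.samples.append(s)

    def at(self, t: float) -> Sample:
        """Return the recorded sample whose timestamp is nearest to t."""
        times = [s.t for s in self.samples]
        i = bisect_left(times, t)
        if i == 0:
            return self.samples[0]
        if i == len(self.samples):
            return self.samples[-1]
        before, after = self.samples[i - 1], self.samples[i]
        return before if t - before.t <= after.t - t else after
```

During replay, the renderer would call `at(t)` each frame with the scrubbed playback time and draw gaze, head, and foot markers at the returned positions; a real system would likely interpolate between neighboring samples rather than snap to the nearest one.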



