“ShapeSonic: Sonifying Fingertip Interactions for Non-Visual Virtual Shape Perception” by Huang, Hanocka, Siu and Gingold
Session/Category Title: Multidisciplinary Fusion
Presenter(s)/Author(s): Huang, Hanocka, Siu, and Gingold
Abstract:
Computer graphics and virtual reality allow sighted users to model and perceive imaginary objects and worlds. However, these approaches are inaccessible to blind and visually impaired (BVI) users, since they rely primarily on visual feedback. To address this, we introduce ShapeSonic, a system designed to convey vivid 3D shape perception using purely audio feedback, or sonification. ShapeSonic tracks users’ fingertips in 3D and provides real-time sound feedback. The shape’s geometry and sharp features (edges and corners) are expressed as sounds whose volumes modulate according to fingertip distance. ShapeSonic runs on a mass-produced, commodity hardware platform (Oculus Quest). In a study with 15 sighted and 6 BVI users, we demonstrate the value of ShapeSonic for shape landmark localization and recognition. ShapeSonic users were able to quickly and relatively accurately “touch” points on virtual 3D shapes in the air.
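To make the distance-to-volume mapping concrete, here is a minimal sketch of the idea the abstract describes: the fingertip’s distance to the shape drives a sound’s gain, loudest at the surface and fading with distance. This is an illustration under assumed parameters, not the authors’ implementation; the sphere stand-in for a mesh, the exponential falloff, and the helper names sdf_sphere and distance_to_volume are all hypothetical.

```python
# Hypothetical sketch of distance-driven sonification gain.
# A sphere stands in for the tracked 3D shape; the falloff constant is assumed.
import numpy as np

def sdf_sphere(p, center, radius):
    """Unsigned distance from point p to the surface of a sphere."""
    return abs(np.linalg.norm(p - center) - radius)

def distance_to_volume(d, falloff=0.05, max_gain=1.0):
    """Map fingertip-to-shape distance (meters) to a gain in [0, max_gain].

    Gain is loudest at the surface (d = 0) and decays exponentially
    as the fingertip moves away from the shape.
    """
    return max_gain * np.exp(-d / falloff)

# Example: a fingertip 2 cm outside a 10 cm sphere centered at the origin.
fingertip = np.array([0.0, 0.0, 0.12])
d = sdf_sphere(fingertip, center=np.zeros(3), radius=0.10)
print(f"distance = {d:.3f} m, gain = {distance_to_volume(d):.3f}")
```

Per the abstract, the surface and the sharp features (edges and corners) are expressed as distinct sounds; the same falloff mapping could plausibly be applied per feature set, with each sound’s gain driven by the fingertip’s distance to that feature.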