“Sights, Sounds, and Sensors: Where Visualization, Sonification, MEMS, HMDs, and 3D Converge”

Moderated by:

  • Cynthia Traeger, Mark R. Mine, Chris Mayhew, Yuval Boger, and Becky Oh

Entry Number: 03


Abstract:


    The confluence of animation, parallel visualization, augmented and virtual reality, MEMS and sensor fusion, 3D visualization, head-mounted displays, autostereo devices, and sonification technologies offers rich possibilities for unique user experiences, but marketing hype is more confusing than clarifying. In this panel, five experts discuss the challenges of visualizing information across mobile and other devices; current and future uses for visualization, collaboration, sonification, interactive, and autostereo technologies; and how sensors, MEMS, and wearables may play a role. Topics include:

    • How do we decide today which technologies to employ to ensure our customers, clients, and businesses are amazed? 
    • These technologies are hot prospects right now, but delivering truly amazing experiences takes imagination, innovation, visualization, technology thought leadership, and the ability to analyze and aggregate data.
    • Best practices for data visualization and why “information in context” matters when these technologies are used. 
    • What role do these technologies play, and how do we cut through their hype, promises, and occasional disasters to leverage them effectively? Can we couple virtual and/or augmented reality with various applications, whether mobile or in other settings? Where and how can 3D or autostereo play a role, and do sensors, MEMS, and wearables factor in?
    • Success stories, lessons learned, and some forward thinking about how we can leverage the potential of these technologies.
