“Emotion & Affect In VR” Moderated by Denise Quesnel

  • ©Jacquelyn (Jacki) Ford Morie, Kent Bye, Tabitha Peck, and Jonathan Gratch

Entry Number: 03


Abstract:


    CURATED CONTENT 

    There is tremendous interest in the potential of virtual reality (VR) and virtual environments to facilitate changes in a participant’s feelings, or emotional state. Feelings are a subjective experience, and emotions are the display of feelings. For measurability and user testing, researchers often rely on instances of “affect”, a conscious subjective feeling that may connect to a designed feature of a virtual system. For creators of immersive experiences, it is important to understand the psychological and cognitive properties of emotions and affect if we wish to reliably evoke specific feelings and replicate them across users.

    This panel discusses human psychology and cognition as they relate to how people interact with virtual systems, and how emotional states and affect can be collected and analyzed. It focuses on the concept of “embodiment” and why it is key to the facilitation of emotion and affect. A convincing state of embodiment is essential to evoking virtual-body-ownership illusions and a sense of “being there” (presence), which are goals of many VR games, therapeutic applications, training applications, and immersive films. Designers and researchers confront several challenges as they create realistic virtual embodiment, including believability, system design, and image quality.

    The panelists are pioneers in virtual-environment research and creation. Their work began long before consumer VR was available, and it emphasizes the need to understand human behavior and communication. In conversation and a Q&A session, they review the tools designers can use to measure and collect data on emotion and affect, how to facilitate feelings and emotions ethically, and the role of “empathy” and “sympathy” in a VR context.
