“InTheMix” by Chapin

  • © William L. Chapin

Conference:


Entry Number: 15

Title:


    InTheMix

Presenter(s):


Collaborator(s):



Description:


    Imagine standing in the middle of a music mix, quite literally! Your aural environment reacts to your every move, everything you look at, and everything you say. InTheMix employs real-time digital computation of 3D sound-field models to synthesize an aural environment presented over headphones. The aural environment reacts to the listener's interest, which is inferred from their position and orientation relative to the sounds they hear. The context-dependent content evolves from attention-coaxing sound effects and voices to a selection of production mixes of original multi-track music. Each audio track represents an instrument or vocalist, with an independent, dynamic position and orientation in the sound space.
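    The description does not specify how interest is computed; the Python sketch below shows one plausible way a per-source attention weight could be derived from head position and orientation. The function name, the facing-times-proximity weighting, and the two-meter range are assumptions for illustration only, not details taken from the installation.

        import numpy as np

        def interest_weight(listener_pos, listener_forward, source_pos,
                            max_distance=2.0):
            """Hypothetical attention score: a nearby source the listener
            faces directly scores near 1.0; a distant source or one behind
            the head scores near 0.0. listener_forward is a unit vector."""
            to_source = np.asarray(source_pos) - np.asarray(listener_pos)
            distance = np.linalg.norm(to_source)
            if distance < 1e-6:
                return 1.0                                # standing "inside" the source
            direction = to_source / distance
            facing = np.dot(listener_forward, direction)  # -1 (behind) .. 1 (ahead)
            facing = max(0.0, facing)                     # ignore sources behind the head
            proximity = max(0.0, 1.0 - distance / max_distance)
            return facing * proximity

        # Example: a vocalist about one meter ahead and slightly to the right
        w = interest_weight(listener_pos=[0.0, 0.0, 0.0],
                            listener_forward=np.array([0.0, 0.0, -1.0]),
                            source_pos=[0.3, 0.0, -1.0])
        print(f"attention weight: {w:.2f}")               # roughly 0.46 here

    Such a weight could then scale each track's gain in the mix, so sources the listener turns toward come forward while the rest recede.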

    Listeners are allowed to physically roam within a two-meter-radius circle. If they venture past the edge of the virtual space, voices and sounds encourage them to return. While each node of InTheMix may be compelling on its own, several nodes can be linked to form a shared environment for remote participants. Microphones in the headsets allow participants to communicate and share their experiences as if they were physically in the same space.
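    As a rough illustration of the two-meter roaming boundary described above, the sketch below checks the tracked listener position against the circle and reports when a coaxing cue should play. The cue name, center point, and function are hypothetical, not part of the actual system.

        import math

        ROAM_RADIUS = 2.0   # meters; listeners may roam within this circle

        def boundary_prompt(listener_x, listener_z, center=(0.0, 0.0)):
            """Hypothetical boundary check: returns a cue name when the
            listener strays outside the roaming circle, else None."""
            dx = listener_x - center[0]
            dz = listener_z - center[1]
            distance = math.hypot(dx, dz)
            if distance > ROAM_RADIUS:
                # A coaxing voice or sound effect would be positioned back
                # toward the center of the space to draw the listener inward.
                return "play_return_voice"
            return None

        print(boundary_prompt(1.5, 0.5))   # None: still inside the circle
        print(boundary_prompt(2.3, 0.4))   # "play_return_voice": past the edge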


Other Information:


    Executive Producer
    William L. Chapin

    Performance
    John Tindel
    Geoff Rutledge
    Lennox Smith
    Rachel Wilkinson

    Production
    Agnieszka Roginska
    Lennox Smith
    Rachel Wilkinson

    Programming
    William Chapin
    Agnieszka Roginska
    Rachel Wilkinson
    Hua Zheng
    AuSIM, Incorporated

    Parametric Synthesis
    Robin Bargar
    Hua Zheng
    National Center for Supercomputing Applications

