“An immersive, multi-user, musical stage environment” by Reynolds, Schoner, Richards, Dobson and Gershenfeld

  • © Matt Reynolds, Bernd Schoner, Joey Richards, Kelly Dobson, and Neil Gershenfeld

Abstract:


    A multi-user, polyphonic sensor stage environment is presented that maps the positions and gestures of up to four performers to the pitch and articulation of distinct notes. The design seeks to give multiple players on a stage the feel of a traditional acoustic instrument by providing complete control over the instrument’s expressive parameters and a clear causal connection between their actions and the resulting sound. The positions of the performers are determined by a custom ultrasonic tracking system, while hand motions are measured by custom-made gloves containing accelerometer units. In addition, juggling clubs are illuminated dynamically to make complex juggling patterns more apparent. The system is currently on tour with the Flying Karamazov Brothers juggling troupe.
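    The kind of mapping the abstract describes — stage position controlling pitch, hand gesture controlling articulation — could be sketched as follows. This is a hypothetical illustration, not the authors' implementation: the stage dimensions, the C-major quantization, and the acceleration thresholds are all invented for the example.

    ```python
    # Hypothetical sketch of a position/gesture-to-note mapping.
    # All constants (stage width, scale, thresholds) are illustrative only.

    C_MAJOR = [0, 2, 4, 5, 7, 9, 11]  # scale degrees in semitones

    def position_to_pitch(x, stage_width=8.0, base_note=48, octaves=3):
        """Quantize a performer's x position (meters) to a MIDI pitch in C major."""
        frac = min(max(x / stage_width, 0.0), 1.0)   # clamp to the stage
        steps = octaves * len(C_MAJOR)
        index = int(frac * (steps - 1))              # scale step across the range
        octave, degree = divmod(index, len(C_MAJOR))
        return base_note + 12 * octave + C_MAJOR[degree]

    def gesture_to_velocity(accel_g, threshold=1.2, max_g=4.0):
        """Map glove acceleration magnitude (in g) to a MIDI velocity (0-127).

        Below the threshold no note is articulated; harder strikes play louder.
        """
        if accel_g < threshold:
            return 0
        frac = min((accel_g - threshold) / (max_g - threshold), 1.0)
        return int(frac * 127)

    if __name__ == "__main__":
        # A performer at stage left plays the lowest note softly ignored;
        # a sharp gesture at stage right plays the highest note at full velocity.
        print(position_to_pitch(0.0), gesture_to_velocity(1.0))   # 48 0
        print(position_to_pitch(8.0), gesture_to_velocity(4.0))   # 83 127
    ```

    In a real system each of the four performers would own one such voice, giving the polyphony the abstract mentions.
    
    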

