
“Pushing the Limits: Crafting an Immersive Mega-Canvas for Phish’s Music Shows at Sphere™”


Conference:

    SIGGRAPH 2024

Type(s):

    Production Session

Title:

    Pushing the Limits: Crafting an Immersive Mega-Canvas for Phish’s Music Shows at Sphere™

Presenter(s)/Author(s):



Abstract:


    Moment Factory is excited to invite attendees to a production session at SIGGRAPH 2024, where we’ll delve into the innovative production behind the American rock band Phish’s concert series at Sphere™, a massive entertainment venue located near the Las Vegas Strip. This multimedia mega-canvas is unlike any other, and our presentation will cover the collaborative effort, creative challenges, and technical breakthroughs achieved within a demanding three-month timeline.

    Collaborative Efforts Behind the Scenes
    Central to the success of this project was the tight collaboration between artists and technologists. This multidisciplinary cooperation was crucial in tackling the complexities of producing content for one of the largest and most sophisticated screen surfaces ever built. Our session will showcase the collective ingenuity and problem-solving that brought our vision to life.

    Creative Challenges
    In just three months, our team managed to conceptualize, design, produce, and integrate over 8 hours of custom content, working under the motto of “building the plane as we fly it.” This required a rapid and flexible production approach. We’ll discuss the creative and logistical hurdles of crafting four distinct shows that captured the band’s spontaneous “jam” essence. These shows required a production style that could adapt dynamically to the band’s improvisational performances, without relying on traditional time-coded cues.

    Content Strategy
    Sphere™’s dome-like screen demanded content with exceptional detail at ultra-high resolution, pushing our team to new creative heights. We developed a modular show-flow incorporating various thematic elements, from takeover and graphic moments to abstract ones, designed to maximize use of the available pixel space. This strategy aimed to deliver a range of sensations and create immersive, illusionary effects. We’ll outline how we balanced real-time and pre-rendered scenes and the pivotal role of Generative AI (Gen-AI) in accelerating our production pace and enhancing the visual narrative.
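
    A minimal sketch of how such a modular show-flow could be represented is shown below. It is an illustration under our own assumptions, not Moment Factory’s actual tooling; every name (SegmentType, RenderMode, ShowSegment, the example segments and layers) is hypothetical.

        from dataclasses import dataclass, field
        from enum import Enum

        class SegmentType(Enum):
            TAKEOVER = "takeover"   # full-dome takeover moments
            GRAPHIC = "graphic"     # graphic, illustrative scenes
            ABSTRACT = "abstract"   # abstract, texture-driven moments

        class RenderMode(Enum):
            REALTIME = "realtime"         # driven live by a real-time engine
            PRERENDERED = "prerendered"   # baked ultra-high-resolution clip

        @dataclass
        class ShowSegment:
            name: str
            segment_type: SegmentType
            render_mode: RenderMode
            layers: list[str] = field(default_factory=list)  # layers available for live mixing

        # Hypothetical show-flow: an ordered pool of segments an operator can jump
        # between, rather than a fixed, time-coded playlist.
        show_flow = [
            ShowSegment("opening_takeover", SegmentType.TAKEOVER, RenderMode.PRERENDERED,
                        layers=["base_plate", "particle_overlay"]),
            ShowSegment("jam_abstract_01", SegmentType.ABSTRACT, RenderMode.REALTIME,
                        layers=["noise_field", "color_wash"]),
            ShowSegment("graphic_interlude", SegmentType.GRAPHIC, RenderMode.PRERENDERED,
                        layers=["hero_animation"]),
        ]

        def segments_by_type(segment_type: SegmentType) -> list[ShowSegment]:
            """Return every segment of a given thematic type, e.g. all abstract moments."""
            return [s for s in show_flow if s.segment_type == segment_type]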

    Technical Challenges
    The project required pushing the limits of what’s possible with cutting-edge technology. We’ll explain how we utilized real-time engines like Notch and Unreal Engine to deliver smooth content at 16K resolution and 29.97 fps. Our session will also cover the integration of Gen-AI-generated assets into our workflow, particularly through a multi-module diffusion pipeline built in ComfyUI. We faced numerous challenges with Gen-AI, including managing input types (images, text, video), structural control, upscaling techniques, and using detailers/refiners to enhance the detail of generated outputs.
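
    To make the ComfyUI integration more concrete, the sketch below shows one way Gen-AI asset generation can be queued programmatically against a ComfyUI server. This is a hedged illustration, not the production pipeline described in the session: the workflow file name, the node id "6", the prompt text, and the local server address are assumptions; only the HTTP /prompt endpoint and the API-format workflow export are standard ComfyUI features.

        """Queue a pre-built ComfyUI workflow over its HTTP API (illustrative only)."""
        import json
        import urllib.request

        COMFYUI_URL = "http://127.0.0.1:8188/prompt"  # default local ComfyUI endpoint

        def queue_asset(prompt_text: str, workflow_path: str = "dome_asset_workflow.json") -> dict:
            # Load a node graph exported from the ComfyUI editor in API format; the
            # structural-control, upscaling, and detailer/refiner stages live inside
            # this graph rather than in the Python code.
            with open(workflow_path, "r", encoding="utf-8") as f:
                graph = json.load(f)

            # Swap in the per-asset prompt text; node id "6" is hypothetical and
            # depends entirely on how the graph was authored.
            graph["6"]["inputs"]["text"] = prompt_text

            payload = json.dumps({"prompt": graph}).encode("utf-8")
            request = urllib.request.Request(
                COMFYUI_URL, data=payload, headers={"Content-Type": "application/json"}
            )
            with urllib.request.urlopen(request) as response:
                return json.load(response)  # response includes the queued prompt id

        if __name__ == "__main__":
            print(queue_asset("swirling aurora textures, ultra-wide dome composition"))

    In practice, separate graphs (or branches within one graph) would likely handle image, text, and video inputs, but the queueing mechanism stays the same.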

    Visual ‘Jamming’
    To align with the band’s improvisational style, we developed a ‘VJing-style’ control toolbox. This toolbox, controllable externally through the grandMA lighting console, facilitated visual ‘jamming’ in sync with the band’s live performances. We will discuss the effective use of this toolbox with layered pre-rendered content.
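
    The abstract does not name the transport between the grandMA console and the toolbox, so the sketch below assumes OSC (which grandMA3 consoles can output) and the python-osc package; the address patterns (/layer/<n>/opacity, /layer/<n>/speed), the port, and the layer parameters are invented for illustration.

        """Listen for console messages and map them onto live-mixable content layers."""
        # Requires: pip install python-osc
        from pythonosc.dispatcher import Dispatcher
        from pythonosc.osc_server import BlockingOSCUDPServer

        # State for two hypothetical layers of pre-rendered content.
        layer_state = {1: {"opacity": 0.0, "speed": 1.0},
                       2: {"opacity": 0.0, "speed": 1.0}}

        def set_layer_param(address: str, value: float) -> None:
            """Handle a message such as '/layer/2/opacity 0.8' from the console."""
            _, _, layer_id, param = address.split("/")
            layer_state[int(layer_id)][param] = float(value)
            # A real toolbox would forward this to the media servers driving the dome.
            print(f"layer {layer_id} {param} -> {float(value):.2f}")

        dispatcher = Dispatcher()
        for layer_id in layer_state:
            dispatcher.map(f"/layer/{layer_id}/opacity", set_layer_param)
            dispatcher.map(f"/layer/{layer_id}/speed", set_layer_param)

        if __name__ == "__main__":
            # Port 9000 is arbitrary; the console would be configured to send here.
            BlockingOSCUDPServer(("0.0.0.0", 9000), dispatcher).serve_forever()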

    Conclusion
    Our session aims to highlight Moment Factory’s dedication to innovation in location-based and large-scale multimedia experiences. We hope to inspire fellow creators and technologists to explore new creative territories and push beyond the familiar.



Submit a story:

If you would like to submit a story about this presentation, please contact us: historyarchives@siggraph.org