“The Edge of History” by Starobin
Conference:
- SIGGRAPH Video Review
Track:
- 11
Title:
- The Edge of History
Length:
- 2:00
Director(s):
Company / Institution / Agency:
- GST – NASA/Goddard Space Flight Center
Description:
The disciplines of earth science are just now crossing the threshold of a new era. In almost all aspects of research about our home planet, space-based data collection is beginning to play a principal role, a role that was impossible prior to the still-dawning information revolution.
Scientific visualization reveals in data what would otherwise be invisible. But unlike tangible or directly observable data collected by researchers in situ, remotely collected data present conceptual challenges to non-experts. To the casual viewer, uncontextualized scientific visualization can seem arcane at best and irrelevant at worst. In an effort to broaden mainstream understanding of and enthusiasm for this kind of work, NASA commissioned this video. Here we visualize the Earth using real data from an orbiting fleet of powerful instruments. Each visualization is based on actual scientific research; nothing here is mere “window dressing.”
Source media for this video were originally delivered in high definition. Visualizers ingested satellite data into Maya or Lightwave and rendered with RenderMan and other tools on a UNIX render farm. Satellite and rocket models were designed in both Lightwave and Maya. Post-production used After Effects and Final Cut Pro to composite and edit the piece.
Hardware:
- Workstations: Apple, SGI, and IBM (single, dual, and multi-CPU), 250 MHz to 2.8 GHz, 1 GB to 16 GB RAM
- Render farm: up to 50 CPUs
Software:
- Modeling and animation: Lightwave 5.6, Maya 4/5
- Rendering: RenderMan 10/11, Lightwave 5.6
- Dynamics: Satellite Tool Kit
- Compositing: Final Cut Pro, After Effects
- Additional software: RSI Interactive Data Language, Erdas Imagine, Photoshop
- Custom software: stand-alone applications or embedded software to translate original scientific data into textures and models; one example is custom IDL code that converts satellite data into formats suitable for modeling
- OS: Apple OS X, IRIX, Red Hat Linux
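The custom data-translation step described above (IDL code converting satellite measurements into texture-ready formats) can be illustrated with a minimal sketch. This is not the production code, and the function name, value range, and synthetic data below are illustrative assumptions; it only shows the general idea of scaling a 2-D grid of physical measurements into an 8-bit image suitable for use as a texture in a modeling package.

```python
import numpy as np

def data_to_texture(data, vmin=None, vmax=None):
    """Scale a 2-D array of satellite measurements (e.g. brightness
    temperatures) into an 8-bit grayscale texture in [0, 255].

    Values outside [vmin, vmax] are clipped before scaling.
    Hypothetical helper, analogous in spirit to IDL's byte-scaling.
    """
    data = np.asarray(data, dtype=float)
    vmin = np.nanmin(data) if vmin is None else vmin
    vmax = np.nanmax(data) if vmax is None else vmax
    scaled = (np.clip(data, vmin, vmax) - vmin) / (vmax - vmin)
    return np.round(scaled * 255).astype(np.uint8)

# Tiny synthetic grid of brightness temperatures (kelvin, illustrative)
grid = np.array([[210.0, 250.0],
                 [290.0, 330.0]])
texture = data_to_texture(grid, vmin=210.0, vmax=330.0)
print(texture)  # [[  0  85] [170 255]]
```

In practice the resulting array would be written out as an image file and mapped onto geometry in Maya or Lightwave; the production pipeline performed this kind of translation per data channel before modeling and rendering.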
Additional Contributors:
Producer: Michael Starobin
Contributors: Michael Starobin, Mark Malanoski, Tom Bridgman, Randy Jones, Alex Kekesi, Kevin Mahoney, Horace Mitchell, Marte Newcombe, Lori Perkins, Greg Shirah, Stuart Snodgrass, Eric Sokolowsky, Cindy Starr, Joycelyn K. Thomson, James Williams, Marit Jentoft-Nilsen, Robert Simmon, Jesse Allen, Reto Stockli, Barbara Summey, Fritz Hassler
Editor: Mark Malanoski
Additional Information:
PRODUCTION
Modeling: Satellite sensors captured multiple wavelengths of reflected and emitted light. NASA science teams converted the raw signals into data, and visualizers then turned the data into pictures. Scenes using GOES cloud data relied on an automated rotoscoping technique: infrared and visible-light data were rotoscoped in a custom-designed process that synchronized the two channels.
Rendering techniques used most: RenderMan, Lightwave, and Mental Ray on Linux and SGI systems.
Average CPU time for rendering per frame: 10 seconds to three days, depending on data complexity and treatment.
Total production time: approximately two weeks, following months of principal R&D.
Production highlight: These visualizations began their creative development as individual elements that could be understood by national news audiences in 20 seconds or less. One or two visualizers worked in partnership with scientists and a television producer to create these images, often under heavy constraints on R&D resources. Though challenging, these limitations regularly propelled the development of innovative technical and aesthetic treatments. The final sequence in this production begins with the visualization of a launch from Cape Canaveral, Florida, using actual satellite data of the Earth, and then proceeds to recreate two famous photos taken respectively from the Apollo 8 and 17 missions to the moon.