“Relativistic ultrafast rendering using time-of-flight imaging” by Velten, Wu, Jarabo, Masia, Barsi, Lawson, Joshi, Gutierrez, Bawendi, and Raskar

  • © Andreas Velten, Di Wu, Adrian Jarabo, Belen Masia, Christopher Barsi, Everett Lawson, Chinmaya Joshi, Diego Gutierrez, Moungi G. Bawendi, and Ramesh Raskar


Abstract:


    We capture ultrafast movies of light in motion and synthesize physically valid visualizations. The effective exposure time for each frame is under two picoseconds (ps). Capturing a 2D video with this time resolution is highly challenging, given the low signal-to-noise ratio (SNR) associated with ultrafast exposures, as well as the absence of 2D cameras that operate at this time scale. We re-purpose modern imaging hardware to record an average of ultrafast repeatable events that are synchronized to a streak tube, and we introduce reconstruction methods to visualize propagation of light pulses through macroscopic scenes.
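    The sketch below is a minimal, illustrative Python mock-up (not the authors' pipeline) of the acquisition principle the abstract describes: a streak tube records one spatial line at a time with picosecond time resolution, many synchronized repetitions of the same pulse are averaged to recover SNR, and the averaged scanlines are stacked into an (y, x, t) data cube whose time slices are the ~2 ps "frames". All function names, parameters, and the synthetic capture routine are assumptions made for illustration.

    import numpy as np

    # Minimal sketch (illustrative only, not the authors' pipeline): assemble a
    # (y, x, t) data cube by averaging many repeatable, synchronized 1D streak
    # captures per scanline. All names and parameters here are assumptions.

    C_MM_PER_PS = 0.2998     # light travels roughly 0.3 mm per picosecond
    EXPOSURE_PS = 2.0        # effective per-frame exposure quoted in the abstract

    def average_streak_scans(scans):
        """Average repeated (x, t) streak images of the same synchronized event.

        Averaging N captures of a repeatable pulse raises SNR by roughly
        sqrt(N), which is what makes picosecond exposures usable despite the
        very low signal per exposure.
        """
        return np.mean(np.stack(scans, axis=0), axis=0)

    def build_cube(capture_line, num_lines, repeats):
        """Build a (y, x, t) cube by scanning one spatial line at a time.

        `capture_line(y)` stands in for hardware that returns one noisy
        (x, t) streak image for scanline y; it is assumed, not real.
        """
        cube = []
        for y in range(num_lines):
            scans = [capture_line(y) for _ in range(repeats)]
            cube.append(average_streak_scans(scans))
        return np.stack(cube, axis=0)

    if __name__ == "__main__":
        rng = np.random.default_rng(0)

        def fake_capture_line(y, nx=64, nt=512):
            # Synthetic stand-in: a pulse front that arrives later at larger x,
            # buried in heavy noise.
            x = np.arange(nx)
            t = np.arange(nt) * EXPOSURE_PS            # time axis in ps
            arrival = x[:, None] * 1.5 + y * 0.5       # arrival time per pixel, ps
            signal = np.exp(-0.5 * ((t[None, :] - arrival) / 4.0) ** 2)
            return signal + rng.normal(scale=2.0, size=(nx, nt))

        cube = build_cube(fake_capture_line, num_lines=32, repeats=200)
        print("data cube shape (y, x, t):", cube.shape)
        # Each time slice cube[:, :, k] is one ~2 ps "frame" of light in motion.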

References:


    1. Naik, N., Zhao, S., Velten, A., Raskar, R., and Bala, K. 2011. Single view reflectance capture using multiplexed scattering and time-of-flight imaging. ACM Trans. Graph. 30 (Dec.), 171:1–171:10.
    2. Velten, A., Willwacher, T., Gupta, O., Veeraraghavan, A., Bawendi, M. G., and Raskar, R. 2012. Recovering three-dimensional shape around a corner using ultrafast time-of-flight imaging. Nature Communications 3, 745.
    3. Wu, D., O’Toole, M., Velten, A., Agrawal, A., and Raskar, R. 2012. Decomposing global light transport using time of flight imaging. In IEEE Conference on Computer Vision and Pattern Recognition (CVPR).

