“4D view synthesis: navigating through time and space” by Sun, Schindler, Kang and Dellaert
Presenter(s)/Author(s): Sun, Schindler, Kang, and Dellaert
Abstract:
In this sketch, we present a 4D view synthesis technique for rendering large-scale 3D structures evolving in time, given a sparse sample of historical images. We built a system that visualizes urban structure as a function of the selected time, thereby allowing virtual navigation in both space and time. While there is a rich literature on image-based rendering of static 3D environments, e.g., the Facade system [Debevec et al. 1996] and Photo Tourism [Snavely et al. 2006], little has been done to address the temporal aspect (e.g., occlusion due to temporal change). We construct time-dependent geometry to handle the sparse sampling. To render, we use time- and view-dependent texture mapping and reason about visibility in both time and space. Figure 1 shows the result of view synthesis based on time-dependent geometry.
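As an illustration of what time- and view-dependent rendering might look like in code, here is a minimal Python sketch. The `Photo` and `TimedGeometry` structures, the exponential time falloff with its `time_scale` parameter, and the cosine view weighting are our own illustrative assumptions, not the authors' implementation: each input photograph is weighted by how close its capture time is to the query time and by how closely its viewing direction matches the novel view, and a geometry proxy valid at that time is selected.

```python
from dataclasses import dataclass
from typing import List, Tuple
import math


@dataclass
class Photo:
    """A historical photograph with its capture time and camera viewing direction."""
    name: str
    year: float                      # capture time
    view_dir: Tuple[float, float, float]  # unit viewing direction


@dataclass
class TimedGeometry:
    """A geometry proxy valid over a time interval [start, end)."""
    name: str
    start: float
    end: float


def select_geometry(proxies: List[TimedGeometry], t: float) -> TimedGeometry:
    """Pick the geometry proxy whose validity interval contains the query time t."""
    for g in proxies:
        if g.start <= t < g.end:
            return g
    raise ValueError(f"no geometry defined for time {t}")


def dot(a, b):
    return sum(x * y for x, y in zip(a, b))


def blend_weights(photos: List[Photo], t: float,
                  novel_view_dir: Tuple[float, float, float],
                  time_scale: float = 5.0) -> List[float]:
    """Weight each photo by closeness in time to t and by angular closeness of
    its viewing direction to the novel view (time- and view-dependent blending)."""
    weights = []
    for p in photos:
        w_time = math.exp(-abs(p.year - t) / time_scale)        # temporal proximity
        w_view = max(dot(p.view_dir, novel_view_dir), 0.0)      # angular proximity
        weights.append(w_time * w_view)
    total = sum(weights) or 1.0
    return [w / total for w in weights]


if __name__ == "__main__":
    proxies = [TimedGeometry("pre-1950 facade", 1900, 1950),
               TimedGeometry("post-1950 tower", 1950, 2010)]
    photos = [Photo("a.jpg", 1930, (0.0, 0.0, 1.0)),
              Photo("b.jpg", 1965, (0.7071, 0.0, 0.7071))]
    t, view = 1940.0, (0.0, 0.0, 1.0)
    print(select_geometry(proxies, t).name)   # geometry active at the query time
    print(blend_weights(photos, t, view))     # normalized blending weights
```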
References:
1. Debevec, P. E., Taylor, C. J., and Malik, J. 1996. Modeling and rendering architecture from photographs: A hybrid geometry- and image-based approach. In Proceedings of SIGGRAPH 96, ACM Press / ACM SIGGRAPH, 11–20.
2. Schindler, G., Dellaert, F., and Kang, S. B. 2007. Inferring temporal order of images from 3D structure. In IEEE Conf. on Computer Vision and Pattern Recognition (CVPR), IEEE.
3. Snavely, N., Seitz, S., and Szeliski, R. 2006. Photo tourism: Exploring photo collections in 3D. In Proceedings of SIGGRAPH 2006, ACM Press / ACM SIGGRAPH, 835–846.