“View-Dependent Textured Splatting for Rendering Live Scenes” by Guinnip, Lai and Yang

  • ©David T. Guinnip, Shuhua Lai, and Ruigang Yang



Entry Number: 051


    View-Dependent Textured Splatting for Rendering Live Scenes



    This sketch presents a novel approach for rendering low-resolution point clouds with multiple high-resolution textures, the type of data commonly generated by real-time vision systems. The low-precision, noisy, and sometimes incomplete nature of such data makes it unsuitable for existing point-based rendering techniques, which are designed for high-precision, high-density point clouds.
    Our new algorithm, View-Dependent Textured Splatting (VDTS), combines traditional splatting with a view-dependent texturing strategy to increase the rendering quality of low-resolution data sets with high-resolution textures. VDTS requires no pre-processing, handles texture visibility and anti-aliasing on the fly, and can be efficiently accelerated by commodity graphics hardware. It is therefore well suited for real-time online rendering of dynamic scenes.
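    The core of any view-dependent texturing scheme is choosing, per surface point, which camera textures to blend and with what weights. The sketch does not spell out its weighting function, but a common heuristic (used, e.g., in unstructured lumigraph rendering) favors the cameras whose viewing rays best agree with the current view ray. The sketch below illustrates that angular-weighting idea in NumPy; the function name, the choice of cosine similarity, and the top-k selection are illustrative assumptions, not the authors' exact method.

```python
import numpy as np

def view_dependent_weights(point, view_pos, cam_positions, k=2):
    """Blend weights over the k source cameras whose rays to `point`
    best match the current view ray (an assumed angular heuristic,
    not necessarily the weighting used by VDTS)."""
    # Unit direction from the surface point toward the viewer.
    view_dir = view_pos - point
    view_dir = view_dir / np.linalg.norm(view_dir)
    # Unit directions from the surface point toward each camera.
    cam_dirs = cam_positions - point
    cam_dirs = cam_dirs / np.linalg.norm(cam_dirs, axis=1, keepdims=True)
    # Cosine of the angle between the view ray and each camera ray.
    cos_angles = cam_dirs @ view_dir
    # Keep the k best-aligned cameras; weight by similarity, normalize.
    best = np.argsort(cos_angles)[-k:]
    w = np.clip(cos_angles[best], 0.0, None)
    total = w.sum()
    weights = np.zeros(len(cam_positions))
    if total > 0:
        weights[best] = w / total
    return weights
```

    In a GPU implementation, per-splat weights like these would be computed in a shader and used to blend the projectively mapped source images, so the texture selection changes smoothly as the viewpoint moves.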




