The WarpEngine: an architecture for the post-polygonal age

Voicu Popescu, John G. Eyles, Anselmo Lastra, Joshua Steinhurst, Nick England, and Lars Nyland



    We present the WarpEngine, an architecture designed for real-time image-based rendering of natural scenes from arbitrary viewpoints. The modeling primitives are real-world images with per-pixel depth. Currently they are acquired and stored off-line; in the near future real-time depth-image acquisition will be possible, and the WarpEngine is designed to render in immediate mode from such data sources.
    The depth-image resolution is locally adapted by interpolation to match the resolution of the output image. 3D warping can occur either before or after the interpolation; the resulting warped/interpolated samples are forward-mapped into a warp buffer, with their precise locations recorded using an offset. Warping processors are integrated on-chip with the warp buffer, allowing efficient, scalable implementation of very high performance systems. Each chip will be able to process 100 million samples per second and provide 4.8 gigabytes per second of bandwidth to the warp buffer.
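    The forward-mapping step described above can be sketched in software as follows. This is a minimal illustrative sketch, not the chip's pipeline: the function name, the pinhole camera model (intrinsics K, rigid transform R, t), and the z-buffer resolution policy are all assumptions introduced here. Each depth-image sample is unprojected to 3D, reprojected into the output view, splatted into a warp buffer, and its precise sub-pixel position is recorded as an offset from the integer pixel it lands in.

```python
import numpy as np

def forward_warp(depth, color, K_src, K_dst, R, t, out_h, out_w):
    """Forward-map a depth image into a new view (illustrative sketch).

    depth, color: (H, W) per-pixel depth and sample values in the source view
    K_src, K_dst: 3x3 pinhole intrinsics; R (3x3), t (3,) map source to target
    Returns the warp buffer, a z-buffer, and per-pixel sub-pixel offsets.
    """
    h, w = depth.shape
    us, vs = np.meshgrid(np.arange(w), np.arange(h))
    # Unproject every source pixel to a 3D point in the source camera frame.
    rays = np.linalg.inv(K_src) @ np.stack([us.ravel(), vs.ravel(),
                                            np.ones(h * w)])
    pts = rays * depth.ravel()
    # Transform into the target camera frame and project into the output view.
    proj = K_dst @ (R @ pts + t[:, None])
    x, y, z = proj[0] / proj[2], proj[1] / proj[2], proj[2]

    warp_buf = np.zeros((out_h, out_w))
    z_buf = np.full((out_h, out_w), np.inf)
    offsets = np.zeros((out_h, out_w, 2))  # precise location as an offset
    for xi, yi, zi, c in zip(x, y, z, color.ravel()):
        ix, iy = int(round(xi)), int(round(yi))
        # Keep the nearest sample per output pixel (simple z-buffer test).
        if 0 <= ix < out_w and 0 <= iy < out_h and zi < z_buf[iy, ix]:
            z_buf[iy, ix] = zi
            warp_buf[iy, ix] = c
            offsets[iy, ix] = (xi - ix, yi - iy)
    return warp_buf, z_buf, offsets
```

    Recording the fractional offset rather than only the rounded pixel address is what lets a later reconstruction pass use the sample's exact projected position.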
    The WarpEngine is significantly less complex than our previous efforts, incorporating only a single ASIC design. Small configurations can be packaged as a PC add-in card, while larger deskside configurations will provide HDTV resolutions at 50 Hz, enabling radical new applications such as 3D television.
    The WarpEngine will be highly programmable, facilitating its use as a test-bed for experimental IBR algorithms.
