“Reconstructionless Airborne Radiance Fields” by Praschl, Böss and Schedl – ACM SIGGRAPH HISTORY ARCHIVES

Title:

    Reconstructionless Airborne Radiance Fields

Session/Category Title:   Images, Video & Computer Vision


Presenter(s)/Author(s):

    Praschl, Böss and Schedl


Abstract:


    In this study, we present a pipeline that converts log files recorded during UAV flights into training data for NeRF-like models, harnessing the logged image and sensor data to prepare a scene in only two seconds and eliminating the need for computationally intensive image-based reconstruction, which takes over 500 minutes for the same scene.

    INVITED TO THE FIRST ROUND OF THE STUDENT RESEARCH COMPETITION
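The abstract's core idea, that camera poses can come directly from flight-log sensor data rather than from a slow structure-from-motion reconstruction, can be illustrated with a minimal sketch. The log format below (local `x`/`y`/`z` positions in metres plus `yaw`/`pitch`/`roll` in degrees) and the pinhole intrinsics are illustrative assumptions, not the authors' actual format; the output follows the Nerfstudio-style `transforms.json` layout:

```python
import json
import math

def rotation_from_ypr(yaw, pitch, roll):
    """Build a 3x3 rotation matrix from yaw/pitch/roll in radians
    (Z-Y-X intrinsic order: R = Rz(yaw) @ Ry(pitch) @ Rx(roll))."""
    cy, sy = math.cos(yaw), math.sin(yaw)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cr, sr = math.cos(roll), math.sin(roll)
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp,     cp * sr,                cp * cr],
    ]

def pose_from_log_entry(entry):
    """Convert one hypothetical log entry to a 4x4 camera-to-world matrix.
    A real pipeline would also convert GPS coordinates to a local frame."""
    angles = (math.radians(entry[k]) for k in ("yaw", "pitch", "roll"))
    R = rotation_from_ypr(*angles)
    t = [entry["x"], entry["y"], entry["z"]]
    return [R[i] + [t[i]] for i in range(3)] + [[0.0, 0.0, 0.0, 1.0]]

def write_transforms(entries, path="transforms.json",
                     fx=2000.0, fy=2000.0, w=4000, h=3000):
    """Write a Nerfstudio-style transforms.json from a list of log entries."""
    frames = [{"file_path": e["image"],
               "transform_matrix": pose_from_log_entry(e)} for e in entries]
    meta = {"fl_x": fx, "fl_y": fy, "cx": w / 2, "cy": h / 2,
            "w": w, "h": h, "frames": frames}
    with open(path, "w") as f:
        json.dump(meta, f, indent=2)
```

Because each pose is read straight from the flight log, this conversion runs in seconds, which is what removes the multi-hour SfM reconstruction step from the training loop.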



