“Optimizing vision and visuals: lectures on cameras, displays and perception” Chaired by

  • Koray Kavakli, David R. Walton, Nick Antipa, Rafal K. Mantiuk, Douglas Lanman, and Kaan Akşit

Conference:


Type:

    Course

Entry Number: 16

Title:

    Optimizing vision and visuals: lectures on cameras, displays and perception

Presenter(s)/Author(s):

    Koray Kavakli, David R. Walton, Nick Antipa, Rafal K. Mantiuk, Douglas Lanman, and Kaan Akşit

Abstract:


    The evolution of the internet is underway: immersive virtual 3D environments (commonly known as the metaverse or telelife) will replace flat 2D interfaces. Crucial ingredients in this transformation are next-generation displays and cameras that represent genuinely 3D visuals while meeting the perceptual requirements of the human visual system.

    This course will provide a fast-paced introduction to optimization methods for next-generation interfaces geared towards immersive virtual 3D environments. Firstly, we will introduce lensless cameras for high-dimensional compressive sensing (e.g., single-exposure capture of a video or one-shot 3D). By the end of this part, our audience will know how to reconstruct images captured with a lensless camera. Secondly, we will introduce holographic displays as a potential candidate for next-generation displays; attendees will learn to create their own 3D images that can be viewed using a standard holographic display. Lastly, we will introduce perceptual guidance that can become an integral part of the optimization routines of displays and cameras, and our audience will gain experience in integrating perception into display and camera optimization.
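
    To give a flavour of the first part, the minimal sketch below reconstructs a scene from a lensless measurement by plain gradient descent on a circular-convolution image-formation model. It is not taken from the course toolkit, and the step size and iteration count are illustrative; it only assumes a calibrated point spread function and a raw sensor image of the same size.

        import numpy as np

        def reconstruct(measurement, psf, iterations=200, step=1.8):
            # Forward model: measurement = PSF (*) scene, with (*) a circular
            # convolution; real pipelines also pad, crop, and add stronger priors.
            psf = psf / psf.sum()                          # unit-gain forward model
            otf = np.fft.fft2(psf)                         # optical transfer function
            x = np.zeros_like(measurement, dtype=float)
            for _ in range(iterations):
                residual = np.real(np.fft.ifft2(otf * np.fft.fft2(x))) - measurement
                # Gradient of 0.5 * ||A x - b||^2 is A^T (A x - b); the adjoint of a
                # convolution is a multiplication by conj(OTF) in the Fourier domain.
                gradient = np.real(np.fft.ifft2(np.conj(otf) * np.fft.fft2(residual)))
                x = np.clip(x - step * gradient, 0.0, None)  # non-negativity prior
            return x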
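
    For the second part, a phase-only hologram for a single target plane can be optimized directly with stochastic gradient descent, as sketched below using PyTorch and an angular spectrum propagator. The wavelength, pixel pitch, propagation distance, and learning rate are placeholder values; the course examples themselves rely on our in-house toolkit.

        import math
        import torch

        def angular_spectrum(field, distance, wavelength, pixel_pitch):
            # Free-space propagation of a complex field with the angular spectrum method.
            ny, nx = field.shape
            fy, fx = torch.meshgrid(torch.fft.fftfreq(ny, d=pixel_pitch),
                                    torch.fft.fftfreq(nx, d=pixel_pitch), indexing='ij')
            argument = (1.0 / wavelength) ** 2 - fx ** 2 - fy ** 2
            kernel = torch.exp(2j * math.pi * distance * torch.sqrt(argument.clamp(min=0.0)))
            kernel = torch.where(argument > 0, kernel, torch.zeros_like(kernel))  # drop evanescent waves
            return torch.fft.ifft2(torch.fft.fft2(field) * kernel)

        def optimize_phase_hologram(target, distance=0.15, wavelength=515e-9,
                                    pixel_pitch=8e-6, iterations=200, lr=0.1):
            # Optimize the phase pattern shown on the spatial light modulator so that
            # the propagated intensity matches the target image.
            phase = torch.zeros_like(target, requires_grad=True)
            optimizer = torch.optim.Adam([phase], lr=lr)
            for _ in range(iterations):
                optimizer.zero_grad()
                hologram = torch.exp(1j * phase)           # unit-amplitude field on the modulator
                image = angular_spectrum(hologram, distance, wavelength, pixel_pitch)
                loss = torch.nn.functional.mse_loss(image.abs() ** 2, target)
                loss.backward()
                optimizer.step()
            return phase.detach()

    Calling optimize_phase_hologram with a single-channel target image in [0, 1] returns a phase pattern that, once displayed on a phase-only modulator and illuminated coherently, reconstructs that image at the chosen distance.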
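
    For the last part, one simple way to fold perception into either optimization above is to weight the per-pixel loss by retinal eccentricity, so that errors far from the viewer’s gaze cost less. The sketch below is only a crude stand-in for the metameric and foveation models discussed in the course (see references 5 and 6); the gaze position, pixel pitch, viewing distance, and falloff constant are all illustrative.

        import torch

        def eccentricity_in_degrees(height, width, gaze, pixel_pitch, viewing_distance):
            # Angular distance of every pixel from the gaze point (given in pixel
            # coordinates), assuming a flat screen viewed head-on from
            # `viewing_distance` metres.
            ys = (torch.arange(height) - gaze[0]) * pixel_pitch
            xs = (torch.arange(width) - gaze[1]) * pixel_pitch
            yy, xx = torch.meshgrid(ys, xs, indexing='ij')
            return torch.rad2deg(torch.atan(torch.sqrt(xx ** 2 + yy ** 2) / viewing_distance))

        def foveated_loss(estimate, target, gaze, pixel_pitch=0.25e-3,
                          viewing_distance=0.6, falloff=0.05):
            # Down-weight the error with eccentricity so the optimizer spends its
            # budget where the viewer is most sensitive.
            height, width = target.shape
            eccentricity = eccentricity_in_degrees(height, width, gaze,
                                                   pixel_pitch, viewing_distance)
            weights = 1.0 / (1.0 + falloff * eccentricity)
            return torch.mean(weights * (estimate - target) ** 2)

    Swapping the mean squared error in the hologram sketch for foveated_loss, for example, yields a gaze-contingent variant of the same optimization.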

    This course targets a wide range of audiences, from domain experts to newcomers. To that end, the examples in this course are built on our in-house toolkit so that they remain replicable for future use. The course material provides example code and a broad survey with crucial information on cameras, displays, and perception.

References:


    1. K. Kavakli, H. Urey, and K. Akşit. Learned holographic light transport. Applied Optics, 61(5): B50–B55, 2022.
    2. K. Kavakli, Y. Itoh, H. Urey, and K. Akşit. Realistic defocus blur for multiplane computer-generated holography. arXiv preprint arXiv:2205.07030, 2022. URL https://arxiv.org/abs/2205.07030.
    3. O. Kingshott, N. Antipa, E. Bostan, and K. Akşit. Unrolled primal-dual networks for lensless cameras. arXiv preprint arXiv:2203.04353, 2022.
    4. J. Orlosky, M. Sra, K. Bektaş, H. Peng, J. Kim, N. Kos’myna, T. Höllerer, A. Steed, K. Kiyokawa, and K. Akşit. Telelife: The future of remote living. Frontiers in Virtual Reality, 2, 2021. ISSN 2673-4192. URL https://www.frontiersin.org/article/10.3389/frvir.2021.763340.
    5. D. R. Walton, R. K. Dos Anjos, S. Friston, D. Swapp, K. Akşit, A. Steed, and T. Ritschel. Beyond blur: Real-time ventral metamers for foveated rendering. ACM Transactions on Graphics, 40(4): 1–14, 2021.
    6. D. R. Walton, K. Kavakli, R. K. Dos Anjos, D. Swapp, T. Weyrich, H. Urey, A. Steed, T. Ritschel, and K. Akşit. Metameric varifocal holography. In 2022 IEEE Conference on Virtual Reality and 3D User Interfaces (VR). IEEE, 2022.

ACM Digital Library Publication:



Overview Page: