“Real-Time Omnidirectional Stereo Rendering: Generating 360° Surround-View Panoramic Images for Comfortable Immersive Viewing” by Marrinan and Papka

Interest Area:


    New Technologies

Title:

    Real-Time Omnidirectional Stereo Rendering: Generating 360° Surround-View Panoramic Images for Comfortable Immersive Viewing

Session/Category Title: TVCG Session on VR


Presenter(s)/Author(s):

    Thomas Marrinan and Michael E. Papka

Abstract:


    Surround-view panoramic images and videos have become a popular form of media for interactive viewing on mobile devices and virtual reality headsets. Viewing such media provides a sense of immersion by allowing users to control their view direction and experience an entire environment. When using a virtual reality headset, the level of immersion can be improved by leveraging stereoscopic capabilities. Stereoscopic images are generated in pairs, one for the left eye and one for the right eye, providing an important depth cue for the human visual system. For computer-generated imagery, rendering proper stereo pairs for a fixed view direction is well understood. However, it is much more difficult to create omnidirectional stereo pairs for a surround-view projection that work well when looking in any direction. One major drawback of traditional omnidirectional stereo images is that they suffer from binocular misalignment in the peripheral vision as a user’s view direction approaches the zenith or nadir (north or south pole) of the projection sphere. This paper presents a real-time geometry-based approach for omnidirectional stereo rendering that fits into the standard rendering pipeline. Our approach includes tunable parameters that enable pole merging: a reduction in the stereo effect near the poles that can minimize binocular misalignment. Results from a user study indicate that pole merging reduces the visual fatigue and discomfort associated with binocular misalignment without inhibiting depth perception.
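
    As a rough illustration of the technique the abstract describes, the sketch below computes a per-vertex eye offset for geometry-based omnidirectional stereo with pole merging. It is a minimal Python sketch, not the authors' implementation: the 65 mm IPD default, the smoothstep falloff, and the parameter names (merge_start_deg, merge_end_deg) are illustrative assumptions standing in for the paper's tunable parameters.

    import numpy as np

    def ods_eye_offset(vertex, ipd=0.065, merge_start_deg=60.0,
                       merge_end_deg=90.0, eye=+1):
        """Per-vertex eye offset for omnidirectional stereo (ODS).

        Places the eye on a circle of diameter `ipd` tangent to the
        viewing ray, then shrinks that circle near the poles ("pole
        merging") so the stereo effect fades out where binocular
        misalignment would otherwise occur.

        vertex : (3,) position relative to the viewer (x, y, z), y up
        eye    : +1 for one eye, -1 for the other
        """
        x, y, z = vertex
        horiz = np.array([x, 0.0, z])            # horizontal part of the ray
        dist = np.linalg.norm(horiz)
        if dist < 1e-8:
            return np.zeros(3)                   # straight at a pole: no offset

        tangent = np.array([z, 0.0, -x]) / dist  # perpendicular to the ray

        # Pole merging: fade the interpupillary distance to zero between
        # merge_start_deg and merge_end_deg of latitude (assumed falloff).
        lat = np.degrees(np.arctan2(abs(y), dist))   # 0 at equator, 90 at poles
        t = np.clip((lat - merge_start_deg) / (merge_end_deg - merge_start_deg),
                    0.0, 1.0)
        falloff = 1.0 - t * t * (3.0 - 2.0 * t)      # 1 - smoothstep(t)

        return eye * 0.5 * ipd * falloff * tangent

    For a vertex at eye level the offset has the full half-IPD magnitude; as the vertex approaches the zenith or nadir the offset fades smoothly to zero, so both eyes see the pole from the same position and the misalignment described in the abstract cannot arise.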

