“Motion parallax in stereo 3D: model and applications”

  • 2016 SA Technical Papers_Kellnhofer_Motion Parallax in Stereo 3D-Model and Applications

Conference:

    SIGGRAPH Asia 2016

Type(s):

    Technical Papers

Title:

    Motion parallax in stereo 3D: model and applications

Session/Category Title: All About Seeing


Presenter(s)/Author(s):



Abstract:


    Binocular disparity is the main depth cue that makes stereoscopic images appear 3D. However, in many scenarios, the range of depth that can be reproduced by this cue is greatly limited and typically fixed due to constraints imposed by displays. For example, because of the low angular resolution of current automultiscopic screens, they can reproduce only a shallow depth range. In this work, we study the motion parallax cue, which is a relatively strong depth cue and can be freely reproduced even on a 2D screen without such limits. We exploit the fact that in many practical scenarios motion parallax provides sufficiently strong depth information that the presence of binocular depth cues can be reduced through aggressive disparity compression. To assess the strength of the effect, we conduct psychovisual experiments that measure the influence of motion parallax on depth perception and relate it to the depth resulting from binocular disparity. Based on the measurements, we propose a joint disparity-parallax computational model that predicts the apparent depth resulting from both cues. We demonstrate how this model can be applied in the context of stereo and multiscopic image processing, and we propose new disparity manipulation techniques that first quantify the depth obtained from motion parallax and then adjust the binocular disparity information accordingly. This allows us to manipulate the disparity signal according to the strength of motion parallax and thus improve the overall depth reproduction. The technique is validated in additional experiments.
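
    The abstract outlines a pipeline that first quantifies the depth conveyed by motion parallax and then compresses the binocular-disparity signal where that cue is already strong. The following is a minimal NumPy sketch of that idea, not the authors' implementation: the parallax-strength mapping (parallax_strength), the compression weight (max_compression), and the disparity budget (budget_px) are illustrative assumptions, not the paper's measured joint disparity-parallax model.

    # Minimal sketch (assumed names and constants) of disparity compression
    # guided by the strength of the motion-parallax cue.
    import numpy as np

    def parallax_strength(flow_mag_px, saturation_px=8.0):
        """Map per-pixel motion-parallax magnitude (optical-flow pixels/frame)
        to a [0, 1] strength value; saturation_px is an assumed constant."""
        return np.clip(flow_mag_px / saturation_px, 0.0, 1.0)

    def adjust_disparity(disparity_px, flow_mag_px,
                         max_compression=0.5, budget_px=12.0):
        """Scale disparity down where parallax is strong, then map the result
        into an assumed display disparity budget (simple global remap)."""
        s = parallax_strength(flow_mag_px)
        # Strong parallax -> allow more aggressive disparity compression.
        scale = 1.0 - max_compression * s
        compressed = disparity_px * scale
        # Keep the compressed signal inside the assumed comfort budget.
        peak = np.abs(compressed).max()
        if peak > budget_px:
            compressed *= budget_px / peak
        return compressed

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        disparity = rng.uniform(-30.0, 30.0, size=(480, 640))  # input disparity map (px)
        flow_mag = rng.uniform(0.0, 10.0, size=(480, 640))     # parallax magnitude (px/frame)
        out = adjust_disparity(disparity, flow_mag)
        print("input range: %.1f px, output range: %.1f px"
              % (np.ptp(disparity), np.ptp(out)))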



ACM Digital Library Publication:



Overview Page:



Submit a story:

If you would like to submit a story about this presentation, please contact us: historyarchives@siggraph.org