“Foveated AR: dynamically-foveated augmented reality display” by Kim, Jeong, Stengel, Akşit, Albert, et al.
Session: VR and AR
Abstract:
We present a near-eye augmented reality display whose resolution and focal depth are dynamically driven by gaze tracking. The display combines a traveling microdisplay, relayed off a concave half-mirror magnifier, for the high-resolution foveal region with a wide field-of-view peripheral display: a projector-based Maxwellian-view display whose nodal point is translated by a traveling holographic optical element to follow the viewer’s pupil during eye movements. The same optics relay an image of the eye to an infrared camera used for gaze tracking, which in turn drives the foveal display location and the peripheral nodal point. Our display supports accommodation cues by varying the focal depth of the microdisplay in the foveal region, and by rendering simulated defocus on the “always in focus” scanning laser projector used for the peripheral display. The resulting family of displays significantly improves on the field-of-view, resolution, and form-factor tradeoff of previous augmented reality designs. We show prototypes supporting 30, 40, and 60 cpd foveal resolution at a net 85° × 78° field of view per eye.
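To make the data flow concrete, below is a minimal sketch of the gaze-contingent loop the abstract describes: tracked gaze drives (1) the foveal inset position, (2) the peripheral display’s nodal point, and (3) the varifocal depth, while simulated defocus for the always-in-focus peripheral projector is approximated with a thin-lens circle-of-confusion model. This is not the authors’ implementation; every name (`GazeSample`, `peripheral_nodal_point`, …) and constant (eye relief, eye focal length, pupil aperture) is an illustrative assumption.

```python
# Sketch (assumed, not from the paper) of a gaze-contingent control loop
# for a foveated AR display with a steerable foveal inset, a pupil-tracked
# Maxwellian peripheral display, and rendered defocus blur.

import math
from dataclasses import dataclass

@dataclass
class GazeSample:
    yaw_deg: float    # horizontal gaze angle from the eye tracker
    pitch_deg: float  # vertical gaze angle
    depth_m: float    # fixation depth, e.g. estimated from vergence

def foveal_display_target(gaze: GazeSample):
    """Steer the traveling microdisplay so the high-resolution inset
    stays centered on the tracked gaze direction."""
    return (gaze.yaw_deg, gaze.pitch_deg)

def peripheral_nodal_point(gaze: GazeSample, eye_relief_m=0.02):
    """Translate the holographic element's nodal point to follow the
    pupil, approximating pupil offset from gaze angle at a fixed
    (assumed) eye relief."""
    dx = eye_relief_m * math.tan(math.radians(gaze.yaw_deg))
    dy = eye_relief_m * math.tan(math.radians(gaze.pitch_deg))
    return (dx, dy)

def circle_of_confusion_m(scene_depth_m, focus_depth_m,
                          focal_len_m=0.017, aperture_m=0.004):
    """Thin-lens blur-circle diameter used to render simulated defocus
    on the 'always in focus' scanning-laser peripheral display."""
    s, p, f, A = scene_depth_m, focus_depth_m, focal_len_m, aperture_m
    return abs(A * f * (s - p) / (s * (p - f)))

# One iteration of the loop: the viewer fixates a point 0.5 m away.
gaze = GazeSample(yaw_deg=8.0, pitch_deg=-3.0, depth_m=0.5)
print(foveal_display_target(gaze))           # where to move the inset
print(peripheral_nodal_point(gaze))          # where to move the nodal point
print(circle_of_confusion_m(2.0, gaze.depth_m))  # blur for a 2 m object
```

In the prototype these outputs would correspond to physical actuation (the traveling microdisplay and holographic optical element) and to the microdisplay’s varifocal setting; the sketch only illustrates how one gaze sample fans out to all three subsystems.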