“Unsynchronized structured light”
Session/Category Title: 3D Scanning
Abstract:
Structured Light (SL) methods capture 3D range images by sequentially projecting a number of binary or continuous light patterns onto a scene of interest while a digital camera captures images of the illuminated scene; a 3D range image is then computed from the captured images. All existing SL methods require the projector and camera to be synchronized, in hardware or in software, with one image captured per projected pattern. Both synchronization approaches have disadvantages that limit the use of SL methods to niche industrial and low-quality consumer applications. Unsynchronized Structured Light (USL) is a novel SL method that does not require synchronization of pattern projection and image capture: the light patterns are projected and the images are captured independently, at constant but possibly different frame rates. USL synthesizes the binary images that would be decoded from images captured by a camera synchronized to the projector, reducing the subsequent computation to standard SL. USL works with both global and rolling shutter cameras, and it enables most burst-mode-capable cameras, such as modern smartphones, tablets, DSLRs, and point-and-shoot cameras, to function as high-quality 3D snapshot cameras. Beyond the software, which can run on the devices themselves, a separate SL Flash, able to project the sequence of patterns cyclically during the acquisition time, is needed to enable this functionality.
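To make the timing model concrete: when patterns cycle at a constant projector rate while the camera exposes at a different constant rate, each captured frame integrates a weighted mix of consecutive patterns. The sketch below (not the authors' implementation; the frame rates, exposure, and phase offset `t0` are hypothetical parameters) computes, for one camera frame, the fraction of its exposure illuminated by each projected pattern.

```python
def pattern_mix(frame, fp=60.0, fc=24.0, exposure=None, t0=0.0, n_patterns=10):
    """Return {pattern_index: fraction of exposure} for one camera frame.

    fp: projector pattern rate (patterns/s), fc: camera frame rate (frames/s),
    exposure: shutter-open time (defaults to the full frame period),
    t0: unknown phase offset between projector and camera clocks.
    """
    if exposure is None:
        exposure = 1.0 / fc
    start = t0 + frame / fc            # exposure interval [start, end)
    end = start + exposure
    mix = {}
    k = int(start * fp)                # first pattern slot overlapping exposure
    while k / fp < end:
        s = max(start, k / fp)         # overlap of exposure with slot k
        e = min(end, (k + 1) / fp)
        if e > s:
            idx = k % n_patterns       # patterns are projected cyclically
            mix[idx] = mix.get(idx, 0.0) + (e - s) / exposure
        k += 1
    return mix
```

For example, with `fp=60` and `fc=24` the first frame mixes patterns 0, 1, and 2 in proportions 0.4, 0.4, and 0.2; recovering the underlying binary patterns from such mixtures is the synthesis step the abstract describes.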


