“Prakash: lighting aware motion capture using photosensing markers and multiplexed illuminators” by Raskar, Nii, deDecker, Hashimoto, Summet, et al.



Abstract:


    In this paper, we present a high-speed optical motion capture method that can measure three-dimensional motion, orientation, and incident illumination at tagged points in a scene. We use tracking tags that work in natural lighting conditions and can be imperceptibly embedded in attire or other objects. Our system supports an unlimited number of tags in a scene, with each tag uniquely identified to eliminate marker reacquisition issues. Our tags also provide incident illumination data which can be used to match scene lighting when inserting synthetic elements. The technique is therefore ideal for on-set motion capture or real-time broadcasting of virtual sets.

    Unlike previous methods that employ high-speed cameras or scanning lasers, we capture the scene appearance using the simplest possible optical devices – a light-emitting diode (LED) with a passive binary mask used as the transmitter and a photosensor used as the receiver. We strategically place a set of optical transmitters to spatio-temporally encode the volume of interest. Photosensors attached to scene points demultiplex the coded optical signals from multiple transmitters, allowing us to compute not only receiver location and orientation but also their incident illumination and the reflectance of the surfaces to which the photosensors are attached. We use our untethered tag system, called Prakash, to demonstrate methods of adding special effects to captured videos that cannot be accomplished using pure vision techniques that rely on camera images.
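    The spatio-temporal encoding described above can be illustrated with a small sketch. This is not the authors' implementation: it assumes, for illustration only, that a transmitter flashes a sequence of binary mask patterns that Gray-code the angular extent of the working volume along one axis, so a photosensor that records one brightness bit per pattern can demultiplex the sequence into the index of the angular cell it occupies. All names and values here are hypothetical.

    ```python
    # Sketch of one-axis spatio-temporal coding: n binary mask patterns
    # resolve 2^n angular cells; a photosensor decodes its received bit
    # sequence (a Gray code) back into its cell index.

    N_BITS = 10  # 10 patterns -> 1024 resolvable cells along one axis

    def gray_encode(cell: int) -> int:
        """Binary-reflected Gray code of a cell index."""
        return cell ^ (cell >> 1)

    def bits_seen_by_sensor(cell: int, n_bits: int = N_BITS) -> list:
        """Bits a sensor in this cell reads, MSB first, as the masks flash."""
        g = gray_encode(cell)
        return [(g >> (n_bits - 1 - i)) & 1 for i in range(n_bits)]

    def gray_decode(bits: list) -> int:
        """Demultiplex: fold the received bit sequence back to a cell index."""
        g = 0
        for b in bits:
            g = (g << 1) | b
        mask = g >> 1
        while mask:          # standard iterative Gray-to-binary conversion
            g ^= mask
            mask >>= 1
        return g

    # A sensor sitting in cell 618 of 1024 reads ten bits and recovers 618.
    assert gray_decode(bits_seen_by_sensor(618)) == 618
    ```

    Gray coding (rather than plain binary) is a natural choice here because adjacent cells differ in exactly one bit, so a sensor near a cell boundary mis-reads at most one pattern; with several transmitters encoding different axes, the intersected cells yield a 3D position.
    
    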


