“3D point of regard and subject motion from a portable video-based monocular eye tracker” by Kolakowski and Pelz

  • ©Susan M. Kolakowski and Jeff Pelz






    The ability to determine the motion of an observer and his/her point of regard (POR) in the world can be very beneficial to the study of human visual behavior and cognition. State-of-the-art eye-tracking systems output an observer’s POR in 2D image or screen coordinates. Here we present a novel technique for determining POR in 3D world coordinates using a monocular video-based eye tracker. Various techniques currently exist to determine an observer’s gaze within an image sequence, but far fewer can determine his/her POR in three dimensions. Existing eye-tracking methods for determining a subject’s 3D POR are limited to virtual-reality applications and/or can only be used with binocular eye trackers. Techniques that provide 3D information in the real world are limited to gaze direction without a precise report of POR.
