Video-based eye tracking for autostereoscopic displays

Yong-Sheng Chen*, Chan Hung Su, Jiun Hung Chen, Chu Song Chen, Yi Ping Hung, Chiou Shann Fuh

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

23 Scopus citations

Abstract

An autostereoscopic display system offers users stereo visualization without the discomfort and inconvenience of wearing stereo glasses or head-mounted displays. To render stereo video with respect to the user's viewpoint and to project it accurately onto the user's eyes, the positions of the left and right eyes must be obtained while the user, who is free to move around, watches the autostereoscopic display. We present real-time tracking techniques that efficiently provide the user's eye positions in images. These techniques comprise: (1) face detection using multiple eigenspaces trained under various lighting conditions, (2) fast block matching to track four motion parameters of the user's face (X and Y translation, scaling, and rotation), and (3) eye locating within the detected face region. In our implementation on a PC with a Pentium III 700 MHz CPU, the eye tracking process runs at 30 Hz.
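The second step of the pipeline, block matching over four motion parameters, can be illustrated with a brute-force sum-of-absolute-differences (SAD) search. This is a minimal NumPy sketch, not the authors' implementation: the function names, search ranges, and nearest-neighbor sampling are illustrative assumptions, and the paper's "fast" variant would prune this exhaustive search.

```python
import numpy as np

def warp_patch(image, cx, cy, size, scale, angle):
    """Sample a size x size patch centered at (cx, cy), scaled and rotated.
    Nearest-neighbor sampling; out-of-bounds coordinates clamp to the edge."""
    h, w = image.shape
    half = size / 2.0
    ys, xs = np.mgrid[0:size, 0:size]
    # Offsets from the patch center, in template coordinates.
    u = (xs - half) * scale
    v = (ys - half) * scale
    c, s = np.cos(angle), np.sin(angle)
    px = np.clip(np.rint(cx + c * u - s * v).astype(int), 0, w - 1)
    py = np.clip(np.rint(cy + s * u + c * v).astype(int), 0, h - 1)
    return image[py, px]

def track_face(frame, template, prev_x, prev_y,
               dxs=range(-4, 5), dys=range(-4, 5),
               scales=(0.9, 1.0, 1.1),
               angles=(-0.1, 0.0, 0.1)):
    """Exhaustively search X/Y translation, scale, and rotation around the
    previous face position; return the parameters minimizing the SAD score."""
    size = template.shape[0]
    best, best_sad = None, np.inf
    for dx in dxs:
        for dy in dys:
            for sc in scales:
                for ang in angles:
                    patch = warp_patch(frame, prev_x + dx, prev_y + dy,
                                       size, sc, ang)
                    sad = np.abs(patch.astype(float) - template).sum()
                    if sad < best_sad:
                        best_sad = sad
                        best = (prev_x + dx, prev_y + dy, sc, ang)
    return best
```

Initializing `template` from the face region found by the eigenspace detector, and re-running `track_face` on each new frame with the previous estimate as the search center, yields the frame-to-frame tracking loop the abstract describes.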

Original language: English
Pages (from-to): 2726-2734
Number of pages: 9
Journal: Optical Engineering
Volume: 40
Issue number: 12
DOIs
State: Published - 1 Dec 2001

Keywords

  • Autostereoscopic displays
  • Eye tracking
  • Face tracking

