Eye Tracking Based Control System for Natural Human-Computer Interaction

Xuebai Zhang, Xiao Long Liu*, Shyan-Ming Yuan, Shu Fan Lin

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

19 Scopus citations


Eye movement can be regarded as a pivotal real-time input medium for human-computer communication, and it is especially important for people with physical disabilities. To improve the reliability, mobility, and usability of eye tracking techniques in user-computer dialogue, this paper proposes a novel eye control system that integrates both mouse and keyboard functions. The proposed system focuses on providing a simple and convenient interaction mode using only the user's eyes. Its usage flow is designed to closely follow natural human habits. Additionally, a magnifier module is proposed to enable accurate operation. In the experiment, two interactive tasks of different difficulty (searching an article and browsing a multimedia web page) were performed to compare the proposed eye control tool with an existing system. Technology Acceptance Model (TAM) measures were used to evaluate the perceived effectiveness of the system. The results demonstrate that the proposed system is highly effective with regard to usability and interface design.

Original language: English
Article number: 5739301
Journal: Computational Intelligence and Neuroscience
State: Published - 1 Jan 2017
