Eye Tracking Based Control System for Natural Human-Computer Interaction

Xuebai Zhang, Xiao Long Liu*, Shyan-Ming Yuan, Shu Fan Lin

*Corresponding author of this work

Research output: Article › Peer-reviewed

19 Citations (Scopus)

Abstract

Eye movement can be regarded as a pivotal real-time input medium for human-computer communication, which is especially important for people with physical disabilities. To improve the reliability, mobility, and usability of eye-tracking techniques in human-computer dialogue, this paper proposes a novel eye-control system that integrates both mouse and keyboard functions. The proposed system focuses on providing a simple and convenient interaction mode that uses only the user's eyes. Its usage flow is designed to follow natural human habits. Additionally, a magnifier module is proposed to enable accurate operation. In the experiment, two interactive tasks of differing difficulty (article searching and multimedia web browsing) were performed to compare the proposed eye-control tool with an existing system. Technology Acceptance Model (TAM) measures were used to evaluate the perceived effectiveness of the system. The results demonstrate that the proposed system is highly effective with regard to usability and interface design.
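The abstract describes selecting targets using only the eyes. A common building block for such systems (not described in this abstract, and not the authors' implementation) is dwell-time selection: if the gaze stays within a small radius for long enough, a click is triggered. The sketch below is a minimal illustration of that idea, assuming a stream of `(x, y)` gaze samples from a tracker; all names and parameter values are illustrative.

```python
import math

def detect_dwell_click(samples, radius=30.0, dwell_samples=60):
    """Return the sample index at which a dwell click fires, or None.

    samples       -- iterable of (x, y) gaze points (e.g. from a 60 Hz tracker)
    radius        -- max pixel distance from the dwell anchor point
    dwell_samples -- consecutive in-radius samples required (1 s at 60 Hz)

    Illustrative sketch only; a real system would also filter tracker noise
    and give the user visual feedback while the dwell timer runs.
    """
    anchor = None
    count = 0
    for i, (x, y) in enumerate(samples):
        if anchor is None or math.hypot(x - anchor[0], y - anchor[1]) > radius:
            anchor = (x, y)   # gaze moved away: restart the dwell timer here
            count = 1
        else:
            count += 1        # gaze still near the anchor: keep counting
        if count >= dwell_samples:
            return i          # enough stable samples: fire the click
    return None
```

A magnifier module such as the one the abstract mentions complements this scheme: enlarging the region around the gaze point relaxes the accuracy that the dwell radius must otherwise enforce.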

Original language: English
Article number: 5739301
Journal: Computational Intelligence and Neuroscience
Volume: 2017
Publication status: Published - 1 Jan 2017
