TY - JOUR
T1 - Intelligent Visual Acuity Estimation System with Hand Motion Recognition
AU - Chiu, Chun-Jie
AU - Tien, Yu-Chieh
AU - Feng, Kai-Ten
AU - Tseng, Po-Hsuan
N1 - Publisher Copyright:
© 2020 Institute of Electrical and Electronics Engineers Inc. All rights reserved.
PY - 2021/12
Y1 - 2021/12
N2 - Visual acuity (VA) measurement is utilized to test a subject's acuteness of vision. Conventional VA measurement requires a physician's assistance to ask a subject to speak out or wave a hand in response to the direction of an optotype. To avoid this repetitive testing procedure, different types of automatic VA tests have been developed in recent years by adopting contact-based responses, such as pushing buttons or keyboards on a device. However, contact-based testing is not as intuitive as speaking or waving hands, and it may distract the subjects from concentrating on the VA test. Moreover, problems related to hygiene may arise if all the subjects operate on the same testing device. To overcome these problems, we propose an intelligent VA estimation (iVAE) system for automatic VA measurements that assists the subject to respond in an intuitive, noncontact manner. VA estimation algorithms using maximum likelihood (VAML) are developed to automatically estimate the subject's vision by compromising between a prespecified logistic function and a machine-learning technique. The neural-network model adapts human learning behavior to consider the accuracy of recognizing the optotype as well as the reaction time of the subject. Furthermore, a velocity-based hand motion recognition algorithm is adopted to classify hand motion data, collected by a sensing device, into one of the four optotype directions. Realistic experiments show that the proposed iVAE system outperforms the conventional line-by-line testing method as it is approximately ten times faster in testing trials while achieving a logarithm of the minimum angle of resolution error of less than 0.2. We believe that our proposed system provides a method for accurate and fast noncontact automatic VA testing.
AB - Visual acuity (VA) measurement is utilized to test a subject's acuteness of vision. Conventional VA measurement requires a physician's assistance to ask a subject to speak out or wave a hand in response to the direction of an optotype. To avoid this repetitive testing procedure, different types of automatic VA tests have been developed in recent years by adopting contact-based responses, such as pushing buttons or keyboards on a device. However, contact-based testing is not as intuitive as speaking or waving hands, and it may distract the subjects from concentrating on the VA test. Moreover, problems related to hygiene may arise if all the subjects operate on the same testing device. To overcome these problems, we propose an intelligent VA estimation (iVAE) system for automatic VA measurements that assists the subject to respond in an intuitive, noncontact manner. VA estimation algorithms using maximum likelihood (VAML) are developed to automatically estimate the subject's vision by compromising between a prespecified logistic function and a machine-learning technique. The neural-network model adapts human learning behavior to consider the accuracy of recognizing the optotype as well as the reaction time of the subject. Furthermore, a velocity-based hand motion recognition algorithm is adopted to classify hand motion data, collected by a sensing device, into one of the four optotype directions. Realistic experiments show that the proposed iVAE system outperforms the conventional line-by-line testing method as it is approximately ten times faster in testing trials while achieving a logarithm of the minimum angle of resolution error of less than 0.2. We believe that our proposed system provides a method for accurate and fast noncontact automatic VA testing.
KW - Hand motion recognition
KW - Machine learning
KW - Maximum likelihood
KW - Neural networks (NNs)
KW - Visual acuity (VA) estimation
UR - http://www.scopus.com/inward/record.url?scp=85108309215&partnerID=8YFLogxK
U2 - 10.1109/TCYB.2020.2969520
DO - 10.1109/TCYB.2020.2969520
M3 - Article
C2 - 32092028
AN - SCOPUS:85108309215
SN - 2168-2267
VL - 51
SP - 6226
EP - 6239
JO - IEEE Transactions on Cybernetics
JF - IEEE Transactions on Cybernetics
IS - 12
ER -