Look at Me! Correcting eye gaze in live video communication

Chih-Fan Hsu, Yu-Shuen Wang, Chin-Laung Lei, Kuan-Ta Chen

Research output: Article · peer-reviewed

2 Citations (Scopus)

Abstract

Although live video communication is widely used, it is generally less engaging than face-to-face communication because of limitations on social, emotional, and haptic feedback. Missing eye contact is one such problem, caused by the physical offset between the screen and the camera on a device. Manipulating video frames to correct eye gaze is a solution to this problem. In this article, we introduce a system that rotates the eyeballs of a local participant before each video frame is sent to the remote side. It adopts a warping-based convolutional neural network to relocate pixels in eye regions. To improve visual quality, we minimize the L2 distance between the ground truths and the warped eyes. We also present several newly designed loss functions to aid network training; they are designed to preserve the shape of eye structures and to minimize color changes around the periphery of eye regions. To evaluate the presented network and loss functions, we objectively and subjectively compared results generated by our system and by the state-of-the-art method, DeepWarp, on two datasets. The experimental results demonstrated the effectiveness of our system. In addition, we showed that our system can perform eye-gaze correction in real time on a consumer-level laptop. Because of the quality and efficiency of the system, gaze correction by postprocessing through this system is a feasible solution to the problem of missing eye contact in video communication.
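The core idea the abstract describes, warping pixels in the eye region with a flow field and training against an L2 distance to the ground truth, can be sketched as follows. This is a minimal, hypothetical NumPy illustration of bilinear warping and the L2 objective, not the authors' implementation; the function names and the grayscale single-patch setup are assumptions for clarity.

```python
import numpy as np

def warp_bilinear(img, flow):
    """Relocate pixels of an eye patch with a per-pixel flow field.

    img:  (H, W) grayscale eye patch
    flow: (H, W, 2) displacements (dy, dx); output pixel (y, x)
          samples the input at (y + dy, x + dx) with bilinear
          interpolation, clipped to the image border.
    """
    H, W = img.shape
    ys, xs = np.meshgrid(np.arange(H), np.arange(W), indexing="ij")
    sy = np.clip(ys + flow[..., 0], 0, H - 1)
    sx = np.clip(xs + flow[..., 1], 0, W - 1)
    y0 = np.floor(sy).astype(int)
    x0 = np.floor(sx).astype(int)
    y1 = np.clip(y0 + 1, 0, H - 1)
    x1 = np.clip(x0 + 1, 0, W - 1)
    wy = sy - y0
    wx = sx - x0
    top = img[y0, x0] * (1 - wx) + img[y0, x1] * wx
    bot = img[y1, x0] * (1 - wx) + img[y1, x1] * wx
    return top * (1 - wy) + bot * wy

def l2_loss(warped, target):
    """Mean squared (L2) distance between the warped eye and ground truth."""
    return float(np.mean((warped - target) ** 2))
```

In the actual system, a convolutional network would predict the flow field from the input frame and the desired gaze direction, and the loss would be backpropagated through the (differentiable) bilinear sampling; the additional shape- and color-preserving losses mentioned in the abstract would be added on top of this L2 term.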

Original language: English
Article number: a38
Pages (from-to): 1-21
Number of pages: 21
Journal: ACM Transactions on Multimedia Computing, Communications and Applications
Volume: 15
Issue number: 2
DOIs
Publication status: Published - May 2019

