TY - GEN
T1 - Cyclops
T2 - 33rd Annual CHI Conference on Human Factors in Computing Systems, CHI 2015
AU - Chan, Li-Wei
AU - Hsieh, Chi-Hao
AU - Chen, Yi-Ling
AU - Yang, Shuo
AU - Huang, Da-Yuan
AU - Liang, Rong-Hao
AU - Chen, Bing-Yu
N1 - Publisher Copyright:
© Copyright 2015 ACM.
PY - 2015/4/18
Y1 - 2015/4/18
N2 - This paper presents Cyclops, a single-piece wearable device that captures its user's whole-body postures through an egocentric view obtained by a fisheye lens at the center of the user's body, allowing it to see only the user's limbs and interpret body postures effectively. Unlike existing body-gesture input systems that depend on external cameras or on motion sensors distributed across the user's body, Cyclops is worn as a single pendant or badge. The main idea proposed in this paper is the observation of the limbs from a central location on the body. Owing to the egocentric view, Cyclops turns posture recognition into a highly controllable computer vision problem. This paper demonstrates a proof-of-concept device and an algorithm for recognizing static and moving bodily gestures based on motion history images (MHI) and a random decision forest (RDF). Four example applications are presented: an interactive bodily workout, a mobile racing game that involves the hands and feet, a full-body virtual reality system, and interaction with a tangible toy. In an experiment on the bodily workout, using a database of 20 body-workout gestures collected from 20 participants, Cyclops achieved a recognition rate of 79% using MHI and simple template matching, which increased to 92% with the more advanced machine learning approach of RDF.
AB - This paper presents Cyclops, a single-piece wearable device that captures its user's whole-body postures through an egocentric view obtained by a fisheye lens at the center of the user's body, allowing it to see only the user's limbs and interpret body postures effectively. Unlike existing body-gesture input systems that depend on external cameras or on motion sensors distributed across the user's body, Cyclops is worn as a single pendant or badge. The main idea proposed in this paper is the observation of the limbs from a central location on the body. Owing to the egocentric view, Cyclops turns posture recognition into a highly controllable computer vision problem. This paper demonstrates a proof-of-concept device and an algorithm for recognizing static and moving bodily gestures based on motion history images (MHI) and a random decision forest (RDF). Four example applications are presented: an interactive bodily workout, a mobile racing game that involves the hands and feet, a full-body virtual reality system, and interaction with a tangible toy. In an experiment on the bodily workout, using a database of 20 body-workout gestures collected from 20 participants, Cyclops achieved a recognition rate of 79% using MHI and simple template matching, which increased to 92% with the more advanced machine learning approach of RDF.
KW - Ego-centric view
KW - Full-body gesture input
KW - Posture recognition
KW - Single-point wearable devices
UR - http://www.scopus.com/inward/record.url?scp=84951121900&partnerID=8YFLogxK
U2 - 10.1145/2702123.2702464
DO - 10.1145/2702123.2702464
M3 - Conference contribution
AN - SCOPUS:84951121900
T3 - Conference on Human Factors in Computing Systems - Proceedings
SP - 3001
EP - 3010
BT - CHI 2015 - Proceedings of the 33rd Annual CHI Conference on Human Factors in Computing Systems
PB - Association for Computing Machinery
Y2 - 18 April 2015 through 23 April 2015
ER -