Abstract
Over the past decades, fitness activities and extreme endurance events have been expanding throughout the world. The number of publicly available skeletal repositories and recognition/evaluation benchmarks has grown rapidly since Microsoft released the Kinect motion-sensing device. Kinect RGB-D data has become a very useful representation of indoor scenes for activity/fitness recognition problems. Another sensor widely used in this area is the wearable inertial measurement unit (IMU). With numerous advanced sensors reaching mass adoption, combining these modalities offers a possible way to surpass current activity recognition and evaluation solutions. Nevertheless, only a limited number of publicly available datasets capture depth camera, inertial sensor, and RGB image data at the same time. In this paper, we introduce NCTU-MFD (National Chiao Tung University Multisensor Fitness Dataset), a comprehensive, diverse multisensor dataset collected using a Kinect RGB-D sensor, wearable inertial sensors, and web cameras. The dataset contains 47131 RGB images, 47131 depth images, and 100 CSV files holding 47131 skeletal frames (25 joints each) collected from the Kinect sensor. In addition, the dataset contains acceleration and gyroscope data from the IMU sensors, and 94262 RGB images (47131 images from each web camera). To demonstrate a possible use of the dataset, we conduct an experiment on the evaluation of depth maps.
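As a rough illustration of how such multisensor recordings might be consumed, the sketch below loads a hypothetical per-session skeleton CSV (25 Kinect joints as x, y, z columns) together with a matching depth frame. All file names, directory layout, and column ordering here are assumptions made for illustration and are not taken from the actual dataset release.

```python
# Minimal sketch (assumptions): reading one skeleton CSV and its matching
# depth frame from a hypothetical NCTU-MFD-style directory layout.
# File names, column layout, and joint ordering are illustrative only.
from pathlib import Path

import numpy as np
import pandas as pd
from PIL import Image

DATA_ROOT = Path("nctu_mfd/session_01")  # hypothetical session folder


def load_skeleton_frames(csv_path: Path) -> np.ndarray:
    """Read a per-session CSV of Kinect skeleton frames.

    Assumed layout: one row per frame, a leading 'frame' index column,
    then 25 joints x (x, y, z) = 75 coordinate columns.
    """
    df = pd.read_csv(csv_path)
    joints = df.drop(columns=["frame"]).to_numpy(dtype=np.float32)
    return joints.reshape(len(df), 25, 3)  # (frames, joints, xyz)


def load_depth_frame(frame_idx: int) -> np.ndarray:
    """Load the 16-bit depth image assumed to match a skeleton frame index."""
    depth_path = DATA_ROOT / "depth" / f"{frame_idx:06d}.png"
    return np.asarray(Image.open(depth_path), dtype=np.uint16)


if __name__ == "__main__":
    skeletons = load_skeleton_frames(DATA_ROOT / "skeleton.csv")
    depth = load_depth_frame(0)
    print(skeletons.shape, depth.shape)
```

With a layout like this, RGB, depth, and IMU streams could be aligned by frame index or timestamp before evaluation; the actual synchronization scheme of NCTU-MFD is described in the paper itself.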
Original language | American English |
---|---|
Pages | 4 |
DOIs | |
Publication status | Published - 18 Sep 2019 |
Event | 20th Asia-Pacific Network Operations and Management Symposium, APNOMS 2019 - Matsue, Japan. Duration: 18 Sep 2019 → 20 Sep 2019 |
Conference
Conference | 20th Asia-Pacific Network Operations and Management Symposium, APNOMS 2019 |
---|---|
Country/Territory | Japan |
City | Matsue |
Period | 18/09/19 → 20/09/19 |