Person Tracking by Fusing Posture Data from UAV Video and Wearable Sensors

Alisher Mukashev*, Lan Da Van, Susanta Sharma, M. Farhan Tandia, Yu Chee Tseng

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review


Abstract

In this article, we propose a novel framework that fuses posture data captured by a drone (unmanned aerial vehicle, UAV) camera with wearable sensor data recorded by smartwatches. The framework continuously tracks persons in the drone's view by analyzing location-independent human posture features and correctly tagging smartwatch identities (IDs) and personal profiles to human objects in the video, thereby overcoming prior work's reliance on ground markers. Person detection, ID assignment, and pose estimation are integrated into the framework to obtain recognized human postures, which are then paired with postures derived from the wearable sensors. By fusing common postures, such as standing, walking, jumping, and falling down, the framework attains a person tracking accuracy of up to 95.36% in our testing scenarios.
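The pairing step described above — matching posture sequences recognized from video against posture sequences recognized from smartwatch sensors to tag IDs onto video tracks — can be illustrated with a minimal sketch. The abstract does not specify the matching algorithm; this example assumes per-time-step posture labels on both sides and a brute-force one-to-one assignment that maximizes label agreement. All names (`video_tracks`, `watches`, `assign_ids`) and the sample sequences are hypothetical.

```python
from itertools import permutations

# Hypothetical posture-label sequences, one label per time step.
# The paper fuses common postures such as standing, walking,
# jumping, and falling down.
video_tracks = {
    "track_0": ["stand", "walk", "walk", "jump", "stand"],
    "track_1": ["walk", "walk", "stand", "stand", "fall"],
}
watches = {
    "watch_A": ["walk", "walk", "stand", "stand", "fall"],
    "watch_B": ["stand", "walk", "walk", "jump", "stand"],
}

def agreement(a, b):
    """Fraction of time steps where two posture sequences agree."""
    return sum(x == y for x, y in zip(a, b)) / min(len(a), len(b))

def assign_ids(tracks, sensors):
    """Brute-force one-to-one assignment of sensor IDs to video tracks
    that maximizes total posture agreement (fine for a handful of
    persons; a real system would use e.g. Hungarian matching)."""
    t_keys, s_keys = list(tracks), list(sensors)
    best, best_score = None, -1.0
    for perm in permutations(s_keys, len(t_keys)):
        score = sum(agreement(tracks[t], sensors[s])
                    for t, s in zip(t_keys, perm))
        if score > best_score:
            best, best_score = dict(zip(t_keys, perm)), score
    return best

print(assign_ids(video_tracks, watches))
# → {'track_0': 'watch_B', 'track_1': 'watch_A'}
```

With perfectly matching sequences the assignment is unambiguous; in practice the two modalities disagree at some time steps, which is why the agreement score is a fraction rather than an exact-match test.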

Original language: English
Pages (from-to): 24150-24160
Number of pages: 11
Journal: IEEE Sensors Journal
Volume: 22
Issue number: 24
DOIs
State: Published - 15 Dec 2022

Keywords

  • Action recognition
  • data fusion
  • drone
  • person identification
  • person tracking
  • wearable devices

