Person Tracking by Fusing Posture Data from UAV Video and Wearable Sensors

Alisher Mukashev, Lan-Da Van, Susanta Sharma, M. Farhan Tandia, Yu-Chee Tseng

Research output: Contribution to journal › Article › peer-review

Abstract

In this paper, we propose a novel framework that fuses posture data captured by a drone (unmanned aerial vehicle, UAV) camera with wearable sensor data recorded by smartwatches. The framework continuously tracks persons in the drone view by analyzing location-independent human posture features and correctly tagging smartwatch identities (IDs) and personal profiles to human objects in the video, thereby removing the ground markers required by prior work. Person detection, ID assignment, and pose estimation are integrated into the framework to obtain recognized human postures, which are then paired with the postures derived from the wearable sensors. By fusing common postures such as standing, walking, jumping, and falling down, a person tracking accuracy of up to 95.36% is attained by the UAV in our testing scenarios.
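As a rough illustration of the ID-tagging step described above, the sketch below matches video-tracked persons to smartwatch IDs by maximizing agreement between their per-frame posture-label sequences. This is a minimal sketch, not the authors' implementation: the function names, the posture label set, and the use of a one-to-one Hungarian assignment are our assumptions.

```python
# Hypothetical sketch (not the paper's code): tag smartwatch IDs to video
# tracks by comparing posture-label sequences, assuming both pipelines emit
# one of {"standing", "walking", "jumping", "falling"} per frame.
import numpy as np
from scipy.optimize import linear_sum_assignment

def agreement(video_seq, watch_seq):
    """Fraction of time-aligned frames where the two posture labels agree."""
    n = min(len(video_seq), len(watch_seq))
    if n == 0:
        return 0.0
    return sum(v == w for v, w in zip(video_seq[:n], watch_seq[:n])) / n

def assign_ids(video_tracks, watch_streams):
    """Map each video track index to the smartwatch ID whose posture
    sequence agrees with it best, via a one-to-one Hungarian assignment.

    video_tracks  : list of posture-label sequences, one per tracked person
    watch_streams : dict {watch_id: posture-label sequence}
    """
    watch_ids = list(watch_streams)
    # Cost = 1 - agreement, so minimizing cost maximizes total agreement.
    cost = np.array([[1.0 - agreement(t, watch_streams[w]) for w in watch_ids]
                     for t in video_tracks])
    rows, cols = linear_sum_assignment(cost)
    return {r: watch_ids[c] for r, c in zip(rows, cols)}

# Example: two tracked persons, two watches.
tracks = [["walking", "walking", "standing"],
          ["standing", "jumping", "jumping"]]
watches = {"watch_A": ["standing", "jumping", "jumping"],
           "watch_B": ["walking", "walking", "standing"]}
print(assign_ids(tracks, watches))  # {0: 'watch_B', 1: 'watch_A'}
```

A one-to-one assignment is used here because each smartwatch is worn by exactly one person; longer observation windows with more varied postures make the agreement matrix more discriminative.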

Original language: English
Pages (from-to): 1
Number of pages: 1
Journal: IEEE Sensors Journal
DOIs
State: Accepted/In press - 2022

Keywords

  • Action recognition
  • Autonomous aerial vehicles
  • Cameras
  • data fusion
  • drone
  • Drones
  • person identification
  • person tracking
  • Sensors
  • Skeleton
  • Target tracking
  • wearable devices
  • Wearable sensors
