TY - GEN
T1 - EagleEYE
T2 - 29th European Conference on Networks and Communications, EuCNC 2020
AU - Ardiansyah, Muhammad Febrian
AU - William, Timothy
AU - Abdullaziz, Osamah Ibrahiem
AU - Wang, Li-Chun
AU - Tien, Po-Lung
AU - Yuang, Maria C.
N1 - Publisher Copyright:
© 2020 IEEE.
PY - 2020/6
Y1 - 2020/6
AB - The fifth-generation (5G) mobile network has paved the way for innovation across vertical industries. Integrating a distributed intelligent edge into the 5G orchestrated architecture brings the benefits of low latency and automation. A successful example of this integration is the 5G-DIVE project, which aims to prove the technical merits and business value proposition of vertical industries such as autonomous drone surveillance and navigation. In this paper, as part of 5G-DIVE, we present EagleEYE, an aerial disaster relief system that uses edge computing and machine learning to detect emergency situations in real time. EagleEYE reduces training time through an object fusion mechanism that enables the reuse of existing datasets, and it parallelizes detection tasks to deliver real-time responses. Finally, EagleEYE is evaluated on a real-world testbed, and the results show that it reduces inference latency by 90% while achieving a high detection accuracy of 87%.
KW - Container
KW - Edge computing
KW - Low-latency computing
KW - Object detection
UR - http://www.scopus.com/inward/record.url?scp=85093823532&partnerID=8YFLogxK
DO - 10.1109/EuCNC48522.2020.9200963
M3 - Conference contribution
AN - SCOPUS:85093823532
T3 - 2020 European Conference on Networks and Communications, EuCNC 2020
SP - 321
EP - 325
BT - 2020 European Conference on Networks and Communications, EuCNC 2020
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 15 June 2020 through 18 June 2020
ER -