MTSAN: Multi-Task Semantic Attention Network for ADAS Applications

Chun-Yu Lai, Bo-Xun Wu, Vinay Malligere Shivanna*, Jiun-In Guo

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

11 Scopus citations

Abstract

This paper presents a lightweight Multi-Task Semantic Attention Network (MTSAN) that jointly handles object detection and semantic segmentation for real-time applications of Advanced Driver Assistance Systems (ADAS). The paper proposes a Semantic Attention Module (SAM) that introduces semantic contextual clues from a segmentation subnet to guide a detection subnet. The SAM significantly boosts detection performance at low computational cost by considerably decreasing the false-alarm rate, and it is independent of any other parameters. The experimental results show the effectiveness of each component of the network and demonstrate that the proposed MTSAN yields a better balance between accuracy and speed. With the proposed post-processing methods, the module is tested and validated for accuracy in the Lane Departure Warning System (LDWS) and the Forward Collision Warning System (FCWS). In addition, the proposed lightweight network is deployable on low-power embedded devices to meet real-time requirements, yielding 10 FPS @ 512×256 on the NVIDIA Jetson Xavier and 15 FPS @ 512×256 on Texas Instruments' TDA2x.
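The abstract's core idea, using semantic context from a segmentation subnet to gate the features of a detection subnet, can be illustrated with a minimal sketch. This is a hypothetical illustration, not the authors' code: the tensor shapes, the background-class convention, and the choice of a per-pixel foreground probability as the attention map are all assumptions. Consistent with the abstract's claim, the gating step itself introduces no additional learnable parameters.

```python
import numpy as np

def softmax(x, axis=0):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def semantic_attention(det_feat, seg_logits):
    """Gate detection features with a semantic attention map.

    det_feat:   (C, H, W) feature map from the detection subnet.
    seg_logits: (K, H, W) per-class logits from the segmentation subnet,
                where class 0 is assumed to be background.

    The attention map is the per-pixel probability of belonging to any
    foreground class; multiplying it into the detection features
    suppresses background responses (lowering false alarms) and adds
    no learnable parameters of its own.
    """
    probs = softmax(seg_logits, axis=0)      # (K, H, W) class probabilities
    attention = 1.0 - probs[0]               # (H, W) foreground probability
    return det_feat * attention[None, :, :]  # broadcast over channels
```

Because the attention values lie in (0, 1), the gated features are never amplified; background regions are attenuated in proportion to the segmentation subnet's confidence.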

Original language: English
Pages (from-to): 50700-50714
Number of pages: 15
Journal: IEEE Access
Volume: 9
DOIs
State: Published - Mar 2021

Keywords

  • Semantics
  • Task analysis
  • Object detection
  • Feature extraction
  • Convolution
  • Image segmentation
  • Proposals
  • Advanced Driver Assistance System (ADAS)
  • detection subnet
  • image segmentation
  • multi-task learning network
  • object detection
  • segmentation subnet
  • semantic attention module (SAM)
