SENSOR FUSION OF CAMERA AND MMW RADAR BASED ON MACHINE LEARNING FOR VEHICLES

Yi Horng Lai, Yu Wen Chen, Jau Woei Perng*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review


Abstract

This study develops a forward collision warning system for vehicles based on sensor fusion of a camera and a millimeter wave radar. The proposed system has a parallel architecture. The algorithm of the millimeter wave radar subsystem includes density-based spatial clustering of applications with noise (DBSCAN), a particle filter, and multi-objective decision-making algorithms. The image subsystem uses the You Only Look Once v3 (YOLOv3) network and a Kalman filter to detect and track four types of objects (i.e., cars, motorcycles, bikes, and pedestrians). All radar objects are projected onto the image coordinates using a radial basis function neural network. Only the objects inside the region of interest of the on-road lane are tracked by the sensor fusion mechanism. The proposed system is evaluated in four weather scenarios: daytime, nighttime, rainy daytime, and rainy nighttime. The experimental results validate that the fusion strategy can effectively compensate for any single-sensor failure. Across the four scenarios, the average detection rate of the sensor fusion reaches 98.7%, which is higher than those of the single-sensor systems.
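The sketch below illustrates, in simplified form, two of the steps described in the abstract: clustering raw millimeter wave radar returns with DBSCAN, and projecting the resulting radar object positions onto image coordinates with a small radial basis function mapping, followed by a region-of-interest gate. It is not the authors' implementation; all parameter values, array shapes, and function names (e.g., `cluster_radar_returns`, `RBFProjector`, `in_lane_roi`) are illustrative assumptions.

```python
# Minimal sketch (not the paper's code) of radar clustering, radar-to-image
# projection with an RBF mapping, and ROI gating. Values are assumptions.
import numpy as np
from sklearn.cluster import DBSCAN


def cluster_radar_returns(points_xy, eps=1.5, min_samples=3):
    """Group raw radar detections (N x 2 array of x, y in metres) into objects."""
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(points_xy)
    centroids = [points_xy[labels == k].mean(axis=0)
                 for k in set(labels) if k != -1]  # label -1 marks noise points
    return np.array(centroids)


class RBFProjector:
    """Map radar-plane coordinates (x, y) to image pixels (u, v) via Gaussian RBFs."""

    def __init__(self, centers, sigma=2.0):
        self.centers = np.asarray(centers, dtype=float)
        self.sigma = sigma
        self.weights = None

    def _features(self, X):
        # Squared distances from every sample to every RBF center
        d2 = ((X[:, None, :] - self.centers[None, :, :]) ** 2).sum(axis=-1)
        return np.exp(-d2 / (2.0 * self.sigma ** 2))

    def fit(self, X, uv):
        """Least-squares fit from calibration pairs (radar xy -> pixel uv)."""
        Phi = self._features(np.asarray(X, dtype=float))
        self.weights, *_ = np.linalg.lstsq(Phi, np.asarray(uv, dtype=float), rcond=None)
        return self

    def predict(self, X):
        return self._features(np.asarray(X, dtype=float)) @ self.weights


def in_lane_roi(uv, roi=(300, 980, 260, 720)):
    """Keep only projected objects inside an on-road-lane ROI (u_min, u_max, v_min, v_max)."""
    u_min, u_max, v_min, v_max = roi
    keep = ((uv[:, 0] >= u_min) & (uv[:, 0] <= u_max) &
            (uv[:, 1] >= v_min) & (uv[:, 1] <= v_max))
    return uv[keep]
```

In practice, the RBF mapping would be fitted once from calibration pairs of radar positions and their corresponding image pixels, and at run time only clustered radar centroids that project inside the lane ROI would be passed on for fusion with the camera tracks.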

Original language: English
Pages (from-to): 271-287
Number of pages: 17
Journal: International Journal of Innovative Computing, Information and Control
Volume: 18
Issue number: 1
DOIs
State: Published - Feb 2022

Keywords

  • MMW radar
  • Particle filter
  • Sensor fusion
  • YOLO network

