TY - GEN
T1 - Microscopic Traffic Information Collection Based on a Lightweight MTMC Tracking Network
AU - Chen, Guan Wen
AU - Su, Zi Jun
AU - Ik, Tsì Uí
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2024
Y1 - 2024
N2 - Vehicle trajectory collection is critical for intelligent transportation systems and tasks such as driving behavior analysis, travel time measurement, and traffic planning. Object tracking through computer vision can be used to obtain vehicle trajectories; however, trajectory information collected from a single camera is limited because of the camera's limited field of view. In multi-target multi-camera (MTMC) tracking, multiple camera views are integrated to associate various trajectories to a single vehicle by matching the vehicle's appearance with the trajectories. These cameras might have overlapping or nonoverlapping fields of view. Trajectory information from MTMC tracking can be used for driving behavior analysis, traffic congestion estimation, and route planning. However, color tones and angles differ between cameras; thus, trajectory association is challenging. In MTMC tracking, appearance, spatial, and temporal information can be integrated to reduce identification failures. This paper proposes a trajectory association framework for travel time and traffic flow estimation. The effects of spatial and temporal information on performance were evaluated for the AI City Challenge dataset and a self-collected dataset. The F1 scores obtained for these two datasets were 0.917 and 0.897, respectively. The inclusion of spatial and temporal information improved the F1 scores by approximately 0.06-0.72. The errors for estimating travel time and vehicle behavior (turning or straight movement) were approximately 3 s and 15%, respectively, for various camera angles.
AB - Vehicle trajectory collection is critical for intelligent transportation systems and tasks such as driving behavior analysis, travel time measurement, and traffic planning. Object tracking through computer vision can be used to obtain vehicle trajectories; however, trajectory information collected from a single camera is limited because of the camera's limited field of view. In multi-target multi-camera (MTMC) tracking, multiple camera views are integrated to associate various trajectories to a single vehicle by matching the vehicle's appearance with the trajectories. These cameras might have overlapping or nonoverlapping fields of view. Trajectory information from MTMC tracking can be used for driving behavior analysis, traffic congestion estimation, and route planning. However, color tones and angles differ between cameras; thus, trajectory association is challenging. In MTMC tracking, appearance, spatial, and temporal information can be integrated to reduce identification failures. This paper proposes a trajectory association framework for travel time and traffic flow estimation. The effects of spatial and temporal information on performance were evaluated for the AI City Challenge dataset and a self-collected dataset. The F1 scores obtained for these two datasets were 0.917 and 0.897, respectively. The inclusion of spatial and temporal information improved the F1 scores by approximately 0.06-0.72. The errors for estimating travel time and vehicle behavior (turning or straight movement) were approximately 3 s and 15%, respectively, for various camera angles.
KW - deep learning network
KW - multi-target multi-camera (MTMC) tracking
KW - traffic flow collection
KW - travel time estimation
KW - Vehicle trajectory collection
UR - http://www.scopus.com/inward/record.url?scp=85198842983&partnerID=8YFLogxK
U2 - 10.1109/WCNC57260.2024.10570539
DO - 10.1109/WCNC57260.2024.10570539
M3 - Conference contribution
AN - SCOPUS:85198842983
T3 - IEEE Wireless Communications and Networking Conference, WCNC
BT - 2024 IEEE Wireless Communications and Networking Conference, WCNC 2024 - Proceedings
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 25th IEEE Wireless Communications and Networking Conference, WCNC 2024
Y2 - 21 April 2024 through 24 April 2024
ER -