TY - GEN
T1 - Periodic Stacked Transformer-based Framework for Travel Time Prediction
AU - Lin, Hui Ting
AU - Dai, Hao
AU - Tseng, Vincent S.
N1 - Publisher Copyright:
© 2024 IEEE.
PY - 2024
AB - Travel time analysis and prediction play critical roles in developing Intelligent Transportation Systems (ITS) and have attracted significant interest from the research community. Deep learning-based methodologies have proven to be powerful tools for leveraging big data to predict travel times. However, while most studies have focused on short-term prediction, predicting travel times over longer horizons is equally important for applications such as traffic management and route planning. Long-term prediction, which often receives less attention due to its complexity, remains a gap in current research. To address this challenge, we propose the Periodic Stacked Transformer (PS-Transformer), a novel Transformer-based framework designed to enhance both short- and long-term traffic prediction. PS-Transformer consists of two primary modules: the Segment Encoding Integration (SEI) and the Periodic Stacked Encoder-Decoder (PSED). The SEI module extracts periodic patterns from traffic data, while PSED effectively captures short-term and long-term dependencies from temporal attributes. Additionally, PSED tackles error accumulation, a common issue over extended prediction horizons, through its non-autoregressive decoder design. PS-Transformer is validated through a series of experiments on a real-world dataset, demonstrating its capability for multi-step prediction over an extended duration. Empirical evaluation results show that PS-Transformer outperforms state-of-the-art methods in both short- and long-term travel time prediction across various metrics, including MAE, RMSE, and SMAPE.
KW - Intelligent Transportation Systems
KW - Long-term prediction
KW - Periodic traffic data
KW - Transformer
KW - Travel time prediction
UR - http://www.scopus.com/inward/record.url?scp=85204957405&partnerID=8YFLogxK
DO - 10.1109/IJCNN60899.2024.10650659
M3 - Conference contribution
AN - SCOPUS:85204957405
T3 - Proceedings of the International Joint Conference on Neural Networks
BT - 2024 International Joint Conference on Neural Networks, IJCNN 2024 - Proceedings
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2024 International Joint Conference on Neural Networks, IJCNN 2024
Y2 - 30 June 2024 through 5 July 2024
ER -