National Digital Library of Theses and Dissertations in Taiwan

Detailed Record

Author: 戴俊祐 (Chun-Yu Tai)
Title (English): Attention-based STGNN with Long-Term Dependencies for Traffic Speed Prediction
Advisor: 孫敏德 (Min-Te Sun)
Degree: Master's
Institution: National Central University
Department: In-service Master Program, Department of Computer Science and Information Engineering
Discipline: Engineering
Field: Electrical and Computer Engineering
Document type: Academic thesis
Year of publication: 2024
Academic year of graduation: 112 (2023–2024)
Language: English
Pages: 48
Keywords (Chinese): 交通車速預測 (traffic speed prediction)
Keywords (English): STGNN; Transformer; Huber Loss
Abstract (translated from the Chinese original): As urban areas develop, total population and population density continue to grow, driving urbanization. Transportation networks consequently expand in scale and grow more complex in structure, which aggravates traffic congestion. Accurate traffic speed prediction is therefore essential for managing and planning urban traffic networks. As real-world roads become more complex, integrating spatial and temporal information to predict traffic speed accurately has become an important and challenging research topic. This study proposes an attention-based STGNN model that effectively captures the complex relationships among real-world roads. We adopt the Huber loss as the training loss function to improve prediction accuracy, and we replace LayerNorm in the Transformer with RMSNorm to reduce computational cost. Finally, we evaluate the proposed model on two real-world traffic speed datasets. The empirical results show that our method outperforms current state-of-the-art systems in traffic speed prediction.
Abstract (English, as submitted): Urbanization, characterized by the continuous growth of population and density in urban areas, has led to the expansion and increasing complexity of transportation networks, exacerbating traffic congestion. Accurate traffic speed prediction is crucial for effective traffic network management and planning. As the complexity of real-world roads increases, integrating spatial and temporal information for accurate traffic speed prediction has become a challenging research task. This study proposes a novel spatial-temporal STGNN-based model to enhance the accuracy of traffic speed prediction. By employing an attention-based STGNN, we effectively capture the complex relationships among road segments in real-world scenarios. We utilize the Huber loss as the training loss function to improve prediction accuracy, and we replace LayerNorm in the Transformer with RMSNorm to reduce computational cost. We evaluate the proposed model on two real-world traffic speed datasets. The experimental results demonstrate that our method achieves superior performance compared to state-of-the-art traffic speed prediction works.
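The two training-side choices named in the abstract — the Huber loss and RMSNorm in place of LayerNorm — can be sketched in a few lines. The following is a minimal NumPy illustration of the general techniques as defined in the cited literature, not the thesis's actual implementation; the function names and the default `delta` value are assumptions for illustration.

```python
import numpy as np

def huber_loss(y_true, y_pred, delta=1.0):
    """Huber loss: quadratic for residuals within delta, linear beyond it,
    so large outliers influence training less than under plain MSE."""
    r = np.abs(y_true - y_pred)
    quadratic = 0.5 * r ** 2
    linear = delta * (r - 0.5 * delta)
    return np.mean(np.where(r <= delta, quadratic, linear))

def rms_norm(x, gain=None, eps=1e-8):
    """RMSNorm (Zhang & Sennrich, 2019): rescales each feature vector by its
    root mean square, skipping LayerNorm's mean-subtraction step and thus
    saving computation."""
    rms = np.sqrt(np.mean(x ** 2, axis=-1, keepdims=True) + eps)
    out = x / rms
    return out if gain is None else out * gain
```

Unlike LayerNorm, RMSNorm does not re-center the activations; it only re-scales them to unit root mean square, which is the source of the computational saving the abstract refers to.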
Contents

1 Introduction
2 Related Work
  2.1 Temporal Models
  2.2 Spatial-Temporal Models
  2.3 Spatial-Temporal Attention Models
3 Preliminary
  3.1 Transformer
    3.1.1 Layer Normalization
    3.1.2 Root Mean Square Layer Normalization
  3.2 Masked Autoencoder
  3.3 Graph for Time Series
  3.4 Spatial-Temporal Attention Wavenet
4 Design
  4.1 Motivation
  4.2 Problem Statement
  4.3 Research Challenges
  4.4 Proposed System Architecture
    4.4.1 Dataset Explanation
    4.4.2 The Long-term Features Stage
    4.4.3 The Traffic Speed Prediction Stage
5 Performance
  5.1 Datasets
  5.2 Evaluation Metrics
  5.3 Experimental Setup
  5.4 Experimental Results and Analysis
    5.4.1 PEMS-BAY Dataset
    5.4.2 METR-LA Dataset
  5.5 Ablation Studies
6 Conclusion
Electronic full text (internet release date: 2025/07/17)