[1] LI W X, LAW K L E. Deep learning models for time series forecasting: a review [J]. IEEE Access, 2024, 12: 92306-92327.<br/>
[2] 宋凌云, 马卓源, 李战怀, 等. 面向金融风险预测的时序图神经网络综述 [J]. 软件学报, 2024, 35(8): 3897-3922.
SONG L Y, MA Z Y, LI Z H, et al. Review on temporal graph neural networks for financial risk prediction [J]. Journal of Software, 2024, 35(8): 3897-3922.<br/>
[3] WU Z H, PAN S R, CHEN F W, et al. A comprehensive survey on graph neural networks [J]. IEEE Transactions on Neural Networks and Learning Systems, 2021, 32(1): 4-24.<br/>
[4] WANG R J, MOU S, WANG X, et al. Graph structure estimation neural networks [C]//Proceedings of the Web Conference 2021. New York, NY: ACM, 2021: 342-353.<br/>
[5] 邹慧琪, 史彬泽, 宋凌云, 等. 基于图神经网络的复杂时空数据挖掘方法综述 [J/OL]. 软件学报, 1-33 (2025-01-16) [2025-02-16]. https://doi.org/10.13328/j.cnki.jos.007275.
ZOU H Q, SHI B Z, SONG L Y, et al. Survey on complex spatio-temporal data mining methods based on graph neural networks [J/OL]. Journal of Software, 1-33 (2025-01-16) [2025-02-16]. https://doi.org/10.13328/j.cnki.jos.007275.<br/>
[6] YU B, YIN H T, ZHU Z X. Spatio-temporal graph convolutional networks: a deep learning framework for traffic forecasting [J]. arXiv:1709.04875, 2017.<br/>
[7] SONG C, LIN Y F, GUO S N, et al. Spatial-temporal synchronous graph convolutional networks: a new framework for spatial-temporal network data forecasting [J]. Proceedings of the AAAI Conference on Artificial Intelligence, 2020, 34(1): 914-921.<br/>
[8] JIN M, KOH H Y, WEN Q S, et al. A survey on graph neural networks for time series: forecasting, classification, imputation, and anomaly detection [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2024, 46(12): 10466-10485.<br/>
[9] WU Z H, PAN S R, LONG G D, et al. Graph WaveNet for deep spatial-temporal graph modeling [C]//Proceedings of the 28th International Joint Conference on Artificial Intelligence. Menlo Park, CA: AAAI, 2019: 1907-1913.<br/>
[10] BAI L, YAO L N, LI C, et al. Adaptive graph convolutional recurrent network for traffic forecasting [C]//Proceedings of the 34th International Conference on Neural Information Processing Systems. New York, NY: Curran Associates Inc., 2020: 17804-17815.<br/>
[11] WU Z H, PAN S R, LONG G D, et al. Connecting the dots: multivariate time series forecasting with graph neural networks [C]//Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. New York, NY: ACM, 2020: 753-763.<br/>
[12] VASWANI A, SHAZEER N, PARMAR N, et al. Attention is all you need [C]//Proceedings of the 31st International Conference on Neural Information Processing Systems. New York, NY: Curran Associates Inc., 2017: 6000-6010.<br/>
[13] 毛远宏, 孙琛琛, 徐鲁豫, 等. 基于深度学习的时间序列预测方法综述 [J]. 微电子学与计算机, 2023, 40(4): 8-17.
MAO Y H, SUN C C, XU L Y, et al. A survey of time series forecasting methods based on deep learning [J]. Microelectronics & Computer, 2023, 40(4): 8-17.<br/>
[14] 卢苡锋, 王霄. 基于二次分解和 IDBO-DABiLSTM 的短期风电功率预测模型 [J]. 计算机工程, 2024, 50(12): 99-109.
LU Y F, WANG X. Short-term wind power prediction model based on secondary decomposition and IDBO-DABiLSTM [J]. Computer Engineering, 2024, 50(12): 99-109.<br/>
[15] QIN Y, SONG D J, CHENG H F, et al. A dual-stage attention-based recurrent neural network for time series prediction [C]//Proceedings of the 26th International Joint Conference on Artificial Intelligence. Menlo Park, CA: AAAI Press, 2017: 2627-2633.<br/>
[16] LIU Z M, WANG Y X, VAIDYA S, et al. KAN: Kolmogorov-Arnold networks [J]. arXiv:2404.19756, 2024.<br/>
[17] HAN X, ZHANG X F, WU Y L, et al. KAN4TSF: are KAN and KAN-based models effective for time series forecasting? [J]. arXiv:2408.11306, 2024.<br/>
[18] DONG C, ZHENG L W, CHEN W T. Kolmogorov-Arnold networks (KAN) for time series classification and robust analysis [C]//Proceedings of the 20th International Conference on Advanced Data Mining and Applications. Berlin, Germany: Springer, 2024: 342-355.<br/>
[19] GENET R, INZIRILLO H. TKAN: temporal Kolmogorov-Arnold networks [J]. arXiv:2405.07344, 2024.<br/>
[20] 刘灿锋, 孙浩, 东辉. 结合 Transformer 与 Kolmogorov-Arnold 网络的分子扩增时序预测研究 [J]. 图学学报, 2024, 45(6): 1256-1265.
LIU C F, SUN H, DONG H. Molecular amplification time series prediction research combining Transformer with Kolmogorov-Arnold network [J]. Journal of Graphics, 2024, 45(6): 1256-1265.<br/>
[21] DOSOVITSKIY A, BEYER L, KOLESNIKOV A, et al. An image is worth 16x16 words: transformers for image recognition at scale [J]. arXiv:2010.11929, 2021.<br/>
[22] MIKOLOV T, CHEN K, CORRADO G S, et al. Efficient estimation of word representations in vector space [J]. arXiv:1301.3781, 2013.<br/>
[23] CAI W L, LIANG Y X, LIU X G, et al. MSGNet: learning multi-scale inter-series correlations for multivariate time series forecasting [C]//Proceedings of the 38th AAAI Conference on Artificial Intelligence. Menlo Park, CA: AAAI, 2024: 11141-11149.<br/>
[24] NIE Y Q, NGUYEN N H, SINTHONG P, et al. A time series is worth 64 words: long-term forecasting with transformers [J]. arXiv:2211.14730, 2023.<br/>
[25] ZENG A L, CHEN M X, ZHANG L, et al. Are transformers effective for time series forecasting? [C]//Proceedings of the 37th AAAI Conference on Artificial Intelligence. Menlo Park, CA: AAAI, 2023: 11121-11128.<br/>
[26] WU H, XU J, WANG J, et al. Autoformer: decomposition transformers with auto-correlation for long-term series forecasting [C]//Proceedings of the 35th International Conference on Neural Information Processing Systems. New York, NY: Curran Associates Inc., 2021: 22419-22430.<br/>
[27] ZHOU H, ZHANG S, PENG J, et al. Informer: beyond efficient transformer for long sequence time-series forecasting [C]//Proceedings of the 35th AAAI Conference on Artificial Intelligence. Menlo Park, CA: AAAI, 2021: 11106-11115.<br/>