[1] LIANG H T, LIU S, DU J W, et al. Review of deep learning applied to time series prediction. Journal of Frontiers of Computer Science and Technology, 2023, 17(6): 1285-1300. (in Chinese)
[2] WANG J C, ZHANG Y, HU Y L, et al. Survey on graph convolutional neural network-based traffic prediction. Journal of Beijing University of Technology, 2021, 47(8): 954-970. (in Chinese)
[3]
[4] TORRES J F, HADJOUT D, SEBAA A, et al. Deep learning for time series forecasting: a survey. Big Data, 2021, 9(1): 3-21. doi: 10.1089/big.2020.0159
[5] SALINAS D, FLUNKERT V, GASTHAUS J, et al. DeepAR: probabilistic forecasting with autoregressive recurrent networks. International Journal of Forecasting, 2020, 36(3): 1181-1191. doi: 10.1016/j.ijforecast.2019.07.001
[6] MAKRIDAKIS S, HIBON M. ARMA models and the Box-Jenkins methodology. Journal of Forecasting, 1997, 16(3): 147-163. doi: 10.1002/(SICI)1099-131X(199705)16:3<147::AID-FOR652>3.0.CO;2-X
[7]
[8] HOCHREITER S, SCHMIDHUBER J. Long short-term memory. Neural Computation, 1997, 9(8): 1735-1780. doi: 10.1162/neco.1997.9.8.1735
[9] DEY R, SALEM F M. Gate-variants of Gated Recurrent Unit (GRU) neural networks[C]//Proceedings of the IEEE 60th International Midwest Symposium on Circuits and Systems (MWSCAS). Washington D.C., USA: IEEE Press, 2017: 1597-1600.
[10] ZHOU H Y, ZHANG S H, PENG J Q, et al. Informer: beyond efficient Transformer for long sequence time-series forecasting[C]//Proceedings of the AAAI Conference on Artificial Intelligence. Palo Alto, USA: AAAI Press, 2021: 11106-11115.
[11] ZENG A L, CHEN M X, ZHANG L, et al. Are Transformers effective for time series forecasting?[C]//Proceedings of the AAAI Conference on Artificial Intelligence. Palo Alto, USA: AAAI Press, 2023: 11121-11128.
[12] LIU M, ZENG A, CHEN M, et al. SCINet: time series modeling and forecasting with sample convolution and interaction. Advances in Neural Information Processing Systems, 2022, 35: 5816-5828.
[13] WU N, GREEN B, BEN X, et al. Deep Transformer models for time series forecasting: the influenza prevalence case[EB/OL]. [2024-03-10]. https://arxiv.org/abs/2001.08317.
[14] LIN Y, KOPRINSKA I, RANA M. SpringNet: Transformer and spring DTW for time series forecasting. Berlin, Germany: Springer, 2020.
[15] CHEN M H, PENG H W, FU J L, et al. AutoFormer: searching Transformers for visual recognition[C]//Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV). Washington D.C., USA: IEEE Press, 2021: 12250-12260.
[16] ZHOU T, MA Z, WEN Q, et al. FEDformer: frequency enhanced decomposed Transformer for long-term series forecasting[C]//Proceedings of the International Conference on Machine Learning. PMLR, 2022: 27268-27286.
[17]
[18] LIU Y, WU H, WANG J, et al. Non-stationary Transformers: exploring the stationarity in time series forecasting[C]//Proceedings of the 36th International Conference on Neural Information Processing Systems. New York, USA: ACM Press, 2022: 9881-9893.
[19] GAO J, HU W, CHEN Y. Client: cross-variable linear integrated enhanced Transformer for multivariate long-term time series forecasting[EB/OL]. [2024-03-10]. https://arxiv.org/abs/2305.18838.
[20] CAO D, WANG Y, DUAN J, et al. Spectral temporal graph neural network for multivariate time-series forecasting[EB/OL]. [2024-03-10]. https://arxiv.org/abs/2103.07719.
[21] YANG K, FAN S D. Long short-term memory network based method and its application in time-series data trend prediction. Journal of Propulsion Technology, 2021, 42(3): 675-682. (in Chinese)
[22] ZHANG F, CHANG H Y. Employing BP neural networks to alleviate the sparsity issue in collaborative filtering recommendation algorithms. Journal of Computer Research and Development, 2006, 43(4): 667-672. (in Chinese)
[23]
[24] ZHANG X, JIN X, GOPALSWAMY K, et al. First de-trend then attend: rethinking attention for time-series forecasting[EB/OL]. [2024-03-10]. https://arxiv.org/abs/2212.08151.
[25] NIE Y, NGUYEN N H, SINTHONG P, et al. A time series is worth 64 words: long-term forecasting with Transformers[EB/OL]. [2024-03-10]. https://arxiv.org/abs/2211.14730.
[26]