[1]SHARADGA H, HAJIMIRZA S, BALOG R S. Time series forecasting of solar power generation for large-scale photovoltaic plants[J]. Renewable Energy, 2020, 150: 797-807.
[2]LAZCANO A, HERRERA P J, MONGE M. A combined model based on recurrent neural networks and graph convolutional networks for financial time series forecasting[J]. Mathematics, 2023, 11(1): 224.
[3]刘颉羲, 陈松灿. 基于混合门单元的非平稳时间序列预测[J]. 计算机研究与发展, 2019, 56(8): 1642-1651.
LIU Jiexi, CHEN Songcan. Non-stationary multivariate time series prediction with mix gated unit[J]. Journal of Computer Research and Development, 2019, 56(8): 1642-1651.
[4]吴宇轩, 虞慧群, 范贵生. 基于误差补偿的多模态协同交通流预测模型[J]. 电子学报, 2024, 52(8): 2878-2890.
WU Yuxuan, YU Huiqun, FAN Guisheng. Multimodal cooperative traffic flow prediction model based on error compensation[J]. Acta Electronica Sinica, 2024, 52(8): 2878-2890.
[5]FANG Tingting, LAHDELMA R. Evaluation of a multiple linear regression model and SARIMA model in forecasting heat demand for district heating system[J]. Applied Energy, 2016, 179: 544-552.
[6]ELMAN J L. Finding structure in time[J]. Cognitive Science, 1990, 14(2): 179-211.
[7]王寅超, 陈博, 俞俊霞, 等. 基于改进CNN-GRU模型的短期电力负荷预测研究[J]. 计算机工程, 2024, 1(1): 1-9.
WANG Yinchao, CHEN Bo, YU Junxia, et al. Research on short-term electricity load forecasting based on improved CNN-GRU model[J]. Computer Engineering, 2024, 1(1): 1-9.
[8]任烈弘, 黄铝文, 田旭, 等. 基于DFT的频率敏感双分支Transformer多变量长时间序列预测方法[J]. 计算机应用, 2024, 44(9): 2739-2746.
REN Liehong, HUANG Lyuwen, TIAN Xu, et al. Multivariate long-term series forecasting method with DFT-based frequency-sensitive dual-branch Transformer[J]. Journal of Computer Applications, 2024, 44(9): 2739-2746.
[9]KITAEV N, KAISER Ł, LEVSKAYA A. Reformer: The efficient transformer[J]. arXiv preprint arXiv:2001.04451, 2020.
[10]ZHOU Haoyi, ZHANG Shanghang, PENG Jieqi, et al. Informer: Beyond efficient transformer for long sequence time-series forecasting[C]//Proceedings of the AAAI Conference on Artificial Intelligence. Menlo Park: AAAI, 2021, 35(12): 11106-11115.
[11]LIU Shizhan, YU Hang, LIAO Cong, et al. Pyraformer: Low-complexity pyramidal attention for long-range time series modeling and forecasting[C/OL]//International Conference on Learning Representations. ICLR, 2022 [2024-06-13]. https://openreview.net/pdf?id=0EXmFzUn5I.
[12]LIU Minhao, ZENG Ailing, CHEN Muxi, et al. SCINet: Time series modeling and forecasting with sample convolution and interaction[J]. Advances in Neural Information Processing Systems, 2022, 35: 5816-5828.
[13]DAI Tao, WU Beiliang, LIU Peiyuan, et al. Periodicity decoupling framework for long-term series forecasting[C]//The Twelfth International Conference on Learning Representations. Vienna: ICLR, 2024: 1-12.
[14]WU Haixu, XU Jiehui, WANG Jianmin, et al. Autoformer: Decomposition transformers with auto-correlation for long-term series forecasting[J]. Advances in Neural Information Processing Systems, 2021, 34: 22419-22430.
[15]ZHANG Yunhao, YAN Junchi. Crossformer: Transformer utilizing cross-dimension dependency for multivariate time series forecasting[C/OL]//The Eleventh International Conference on Learning Representations. Kigali: ICLR, 2023 [2024-06-13]. https://openreview.net/pdf?id=vSVLM2j9eie.
[16]TANG Peiwang, ZHANG Weitai. Unlocking the power of patch: Patch-based MLP for long-term time series forecasting[C]//Proceedings of the AAAI Conference on Artificial Intelligence. Menlo Park: AAAI, 2025: 12640-12648.
[17]LEWANDOWSKY S, MURDOCK B B Jr. Memory for serial order[J]. Psychological Review, 1989, 96(1): 25-57.
[18]HOCHREITER S, SCHMIDHUBER J. Long short-term memory[J]. Neural Computation, 1997, 9(8): 1735-1780.
[19]GRAVES A. Long short-term memory[M]//Supervised Sequence Labelling with Recurrent Neural Networks. Berlin: Springer, 2012: 37-45.
[20]CHO K, VAN MERRIËNBOER B, GULCEHRE C, et al. Learning phrase representations using RNN encoder-decoder for statistical machine translation[C]//Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing. Stroudsburg, PA: ACL, 2014: 1724-1734.
[21]QIN Yao, SONG Dongjin, CHENG Haifeng, et al. A dual-stage attention-based recurrent neural network for time series prediction[C]//Proceedings of the 26th International Joint Conference on Artificial Intelligence. Freiburg: IJCAI, 2017: 2627-2633.
[22]SMYL S. A hybrid method of exponential smoothing and recurrent neural networks for time series forecasting[J]. International Journal of Forecasting, 2020, 36(1): 75-85.
[23]SALINAS D, FLUNKERT V, GASTHAUS J, et al. DeepAR: Probabilistic forecasting with autoregressive recurrent networks[J]. International Journal of Forecasting, 2020, 36(3): 1181-1191.
[24]CHEN Jiansheng, KANG Xiangui, LIU Ye, et al. Median filtering forensics based on convolutional neural networks[J]. IEEE Signal Processing Letters, 2015, 22(11): 1849-1853.
[25]LEA C, FLYNN M D, VIDAL R, et al. Temporal convolutional networks for action segmentation and detection[C]//Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition. Piscataway, NJ: IEEE, 2017: 156-165.
[26]LIU Peiyuan, WU Beiliang, LI Naiqi, et al. WFTNet: exploiting global and local periodicity in long-term time series forecasting[C]//ICASSP 2024-2024 IEEE International Conference on Acoustics, Speech and Signal Processing. Piscataway, NJ: IEEE, 2024: 5960-5964.
[27]WANG Jing, JU Yanbing, DONG Peiwu, et al. Long-term time series forecasting by a frequency-domain enhanced temporal convolutional network with the stationary residual regularization[J]. Applied Soft Computing, 2025: 113779.
[28]HAO Jianhua, LIU Fangai. Improving long-term multivariate time series forecasting with a seasonal-trend decomposition-based 2-dimensional temporal convolution dense network[J]. Scientific Reports, 2024, 14(1): 1689.
[29]OH J, WANG Jiaxuan, WIENS J. Learning to exploit invariances in clinical time-series data using sequence transformer networks[C]//Machine Learning for Healthcare Conference. New York: PMLR, 2018: 332-347.
[30]GRUVER N, FINZI M, QIU Shikai, et al. Large language models are zero-shot time series forecasters[J]. Advances in Neural Information Processing Systems, 2023, 36: 19622-19635.
[31]SONG Ziyang, LU Qincheng, XU Hao, et al. TimelyGPT: Extrapolatable transformer pre-training for long-term time-series forecasting in healthcare[C]//Proceedings of the 15th ACM International Conference on Bioinformatics, Computational Biology and Health Informatics. New York, NY: ACM, 2024: 1-10.
[32]ZHOU Tian, NIU Peisong, SUN Liang, et al. One fits all: Power general time series analysis by pretrained LM[J]. Advances in Neural Information Processing Systems, 2023, 36: 43322-43355.
[33]REN Lei, WANG Haiteng, MO Tingyu, et al. A lightweight group transformer-based time series reduction network for edge intelligence and its application in industrial RUL prediction[J]. IEEE Transactions on Neural Networks and Learning Systems, 2024, 1: 1-10.
[34]BENIDIS K, RANGAPURAM S S, FLUNKERT V, et al. Deep learning for time series forecasting: tutorial and literature survey[J]. ACM Computing Surveys, 2022, 55(6): 1-36.
[35]潘立群, 吴中华, 洪标. 基于核技巧改进的Informer模型的长序列时间序列预测方法[J]. 计算机科学, 2023, 50(S2): 678-683.
PAN Liqun, WU Zhonghua, HONG Biao. Prediction method of long series time series based on improved Informer model with kernel technique[J]. Computer Science, 2023, 50(S2): 678-683.
[36]WILLMOTT C J, MATSUURA K. Advantages of the mean absolute error (MAE) over the root mean square error (RMSE) in assessing average model performance[J]. Climate Research, 2005, 30(1): 79-82.
[37]吴明朗, 庞振江, 洪海敏, 等. 基于残差的分布式光伏发电功率组合预测方法[J]. 深圳大学学报(理工版), 2024, 41(3): 293-302.
WU Minglang, PANG Zhenjiang, HONG Haimin, et al. Residual-based combined prediction method for distributed photovoltaic power generation[J]. Journal of Shenzhen University Science and Engineering, 2024, 41(3): 293-302.
[38]DEMŠAR J. Statistical comparisons of classifiers over multiple data sets[J]. The Journal of Machine Learning Research, 2006, 7: 1-30.