[1]Peng G, Li X, Tan H. Integrating the safety control against cyber-attacks on the global information in coupled map car-following model under connected vehicles platoon environment[J]. IEEE Transactions on Intelligent Transportation Systems, 2024.
[2]Peng G, Luo C, Zhao H, et al. Phase transitions of dual-lane lattice model incorporating cyber-attacks on lane change involving inflow and outflow under connected vehicles environment[J]. Chaos, Solitons & Fractals, 2024, 181: 114697.
[3]Pei Z, Qi X, Zhang Y, et al. Human trajectory prediction in crowded scene using social-affinity long short-term memory[J]. Pattern Recognition, 2019, 93: 273-282.
[4]Wang X, Hu J, Wei C, et al. A novel lane-change decision-making with long-time trajectory prediction for autonomous vehicle[J]. IEEE Access, 2023, 11: 137437-137449.
[5]Qie T, Wang W, Yang C, et al. A self-trajectory prediction approach for autonomous vehicles using distributed decouple LSTM[J]. IEEE Transactions on Industrial Informatics, 2024, 20(4): 6708-6717.
[6]Zyner A, Worrall S, Nebot E. Naturalistic driver intention and path prediction using recurrent neural networks[J]. IEEE Transactions on Intelligent Transportation Systems, 2019, 21(4): 1584-1594.
[7]Yang B, Fan F, Ni R, et al. A multi-task learning network with a collision-aware graph transformer for traffic-agents trajectory prediction[J]. IEEE Transactions on Intelligent Transportation Systems, 2024, 25(7): 6677-6690.
[8]Yang B, Lu Y, Wan R, et al. Meta-IRLSOT++: A meta-inverse reinforcement learning method for fast adaptation of trajectory prediction networks[J]. Expert Systems with Applications, 2024, 240: 122499.
[9]Yang B, Yan K, Hu C, et al. Dynamic subclass-balancing contrastive learning for long-tail pedestrian trajectory prediction with progressive refinement[J]. IEEE Transactions on Automation Science and Engineering, 2024.
[10]Gao K, Li X, Chen B, et al. Dual transformer based prediction for lane change intentions and trajectories in mixed traffic environment[J]. IEEE Transactions on Intelligent Transportation Systems, 2023, 24(6): 6203-6216.
[11]Che C, Luo S, Zong W, et al. Multimodal adversarial informer for highway vehicle lane-changing trajectory prediction[J]. Physica A: Statistical Mechanics and its Applications, 2024, 654: 130158.
[12]陈文强,张严,韩晓宇,等.基于Transformer与图注意力网络的车辆轨迹预测模型[J/OL].吉林大学学报(工学版),1-9[2025-06-24].
Chen Wen-qiang, Zhang Yan, Han Xiao-yu, et al. Vehicle trajectory prediction model based on Transformer and graph attention network[J/OL]. Journal of Jilin University (Engineering and Technology Edition), 1-9[2025-06-24].
[13]王庆荣,郝福乐,朱昌锋,等.基于多特征融合的车辆轨迹预测研究[J/OL].计算机工程,1-14[2025-08-20].
Wang Qing-rong, Hao Fu-le, Zhu Chang-feng, et al. Research on vehicle trajectory prediction based on multi-feature fusion[J/OL]. Computer Engineering, 1-14[2025-08-20].
[14]田彦涛,许富强,王凯歌,等.考虑周车信息的自车期望轨迹预测[J].吉林大学学报(工学版),2023,53(03):674-681.
Tian Yan-tao, Xu Fu-qiang, Wang Kai-ge, et al. Ego-vehicle expected trajectory prediction considering surrounding vehicles' information[J]. Journal of Jilin University (Engineering and Technology Edition), 2023, 53(03): 674-681.
[15]黄玲,崔躜,游峰,等.适用于多车交互场景的车辆轨迹预测模型[J].吉林大学学报(工学版),2024,54(05):1188-1195.
Huang Ling, Cui Zuan, You Feng, et al. Vehicle trajectory prediction model for multi-vehicle interaction scenarios[J]. Journal of Jilin University (Engineering and Technology Edition), 2024, 54(05): 1188-1195.
[16]方华珍,刘立,顾青,等.基于轨迹预测和极限梯度提升的驾驶意图识别[J].吉林大学学报(工学版),2025,55(02):623-630.
Fang Hua-zhen, Liu Li, Gu Qing, et al. Driving intention recognition based on trajectory prediction and extreme gradient boosting[J]. Journal of Jilin University (Engineering and Technology Edition), 2025, 55(02): 623-630.
[17]Che C, Luo S, Zong W, et al. Multimodal adversarial informer for highway vehicle lane-changing trajectory prediction[J]. Physica A: Statistical Mechanics and its Applications, 2024, 654: 130158.
[18]辛嵩,刘晗,王可,等.基于概率融合的车辆意图识别与轨迹预测方法[J].交通运输系统工程与信息,2025,25(02):128-137.
Xin Song, Liu Han, Wang Ke, et al. Vehicle intention recognition and trajectory prediction method based on probability fusion[J]. Journal of Transportation Systems Engineering and Information Technology, 2025, 25(02): 128-137.
[19]Wang J, Liu K, Li H. LSTM-based graph attention network for vehicle trajectory prediction[J]. Computer Networks, 2024, 248: 110477.
[20]Brody S, Alon U, Yahav E. How attentive are graph attention networks?[J]. arXiv preprint arXiv:2105.14491, 2021.
[21]Pan C, Dai Z, Zhang Y, et al. An approach for accurately extracting vehicle trajectory from aerial videos based on computer vision[J]. Measurement, 2025, 242: 116212.
[22]Thiemann C, Treiber M, Kesting A. Estimating acceleration and lane-changing dynamics from next generation simulation trajectory data[J]. Transportation Research Record, 2008, 2088(1): 90-101.
[23]闫建红,刘芝妍,王震.融合时空注意力机制的多尺度卷积车辆轨迹预测[J/OL].计算机工程,1-10[2025-06-19].
Yan Jian-hong, Liu Zhi-yan, Wang Zhen. Multi-scale convolutional vehicle trajectory prediction integrating spatio-temporal attention mechanism[J/OL]. Computer Engineering, 1-10[2025-06-19].
[24]高凯,刘欣宇,胡林,等.基于稀疏注意力的时空交互车辆轨迹预测[J].汽车工程,2025,47(05):809-819.DOI:10.19562/j.chinasae.qcgc.2025.05.002.
Gao Kai, Liu Xin-yu, Hu Lin, et al. Vehicle trajectory prediction based on sparse attention for spatiotemporal interaction[J]. Automotive Engineering, 2025, 47(05): 809-819. DOI:10.19562/j.chinasae.qcgc.2025.05.002.
[25]田彦涛,黄兴,卢辉遒,等.基于注意力与深度交互的周车多模态行为轨迹预测[J].吉林大学学报(工学版),2023,53(05):1474-1480.DOI:10.13229/j.cnki.jdxbgxb.20210904.
Tian Yan-tao, Huang Xing, Lu Hui-qiu, et al. Multimodal behavior trajectory prediction of surrounding vehicles based on attention and deep interaction[J]. Journal of Jilin University (Engineering and Technology Edition), 2023, 53(05): 1474-1480. DOI:10.13229/j.cnki.jdxbgxb.20210904.
[26]宋秀兰,董兆航,单杭冠,等.基于时空融合的多头注意力车辆轨迹预测[J].浙江大学学报(工学版),2023,57(08):1636-1643.DOI:CNKI:SUN:ZDZC.0.2023-08-016.
Song Xiu-lan, Dong Zhao-hang, Shan Hang-guan, et al. Vehicle trajectory prediction based on spatiotemporal fusion and multi-head attention[J]. Journal of Zhejiang University (Engineering Science Edition), 2023, 57(08): 1636-1643. DOI:CNKI:SUN:ZDZC.0.2023-08-016.