[1]Tao F, Cheng J, Qi Q, et al. Digital twin-driven product design, manufacturing and service with big data[J]. The International Journal of Advanced Manufacturing Technology, 2018, 94(9): 3563-3576.
[2]Tao Fei, Liu Weiran, Liu Jianhua, et al. Exploration of digital twin and its applications[J]. Computer Integrated Manufacturing Systems, 2018, 24(1): 1-18. (in Chinese)
[3]Tao Fei, Cheng Ying, Cheng Jiangfeng, et al. Theory of digital twin modeling and its application: cyber-physical fusion theory and technology for digital twin workshop[J]. Computer Integrated Manufacturing Systems, 2017, 23(8): 1603. (in Chinese)
[4]Yang W, Yang Y, Xiang W, et al. Adaptive optimization federated learning enabled digital twins in industrial IoT[J]. Journal of Industrial Information Integration, 2024, 41: 100645.
[5]Mandal S. A privacy preserving federated learning (PPFL) based cognitive digital twin (CDT) framework for smart cities[C]//Proceedings of the AAAI Conference on Artificial Intelligence. 2024, 38(21): 23399-23400.
[6]Yang W, Xiang W, Yang Y, et al. Optimizing federated learning with deep reinforcement learning for digital twin empowered industrial IoT[J]. IEEE Transactions on Industrial Informatics, 2022, 19(2): 1884-1893.
[7]Yiping G, Xinyu L, Gao L. A deep lifelong learning method for digital twin-driven defect recognition with novel classes[J]. Journal of Computing and Information Science in Engineering, 2021, 21(3): 031004.
[8]Li J, Guo S, Liang W, et al. Digital twin-enabled service provisioning in edge computing via continual learning[J]. IEEE Transactions on Mobile Computing, 2023, 23(6): 7335-7350.
[9]Zhang Dongyang, Lu Zixuan, Liu Junmin, et al. A survey on continual learning of deep models: theory, methods, and applications[J]. Journal of Electronics & Information Technology, 2024, 47: 1-31. (in Chinese)
[10]Rodio A, Faticanti F, Marfoq O, et al. Federated learning under heterogeneous and correlated client availability[C]//IEEE INFOCOM 2023-IEEE Conference on Computer Communications. IEEE, 2023: 1-10.
[11]Tang Lun, Shan Zhenzhen, Wen Mingyan, et al. Task offloading algorithm assisted by digital twin in industrial Internet of Things[J]. Journal of Electronics & Information Technology, 2024, 46(4): 1296-1305. (in Chinese)
[12]Wang S, Ji M. A unified analysis of federated learning with arbitrary client participation[J]. Advances in Neural Information Processing Systems, 2022, 35: 19124-19137.
[13]Yoon J, Jeong W, Lee G, et al. Federated continual learning with weighted inter-client transfer[C]//International Conference on Machine Learning. PMLR, 2021: 12073-12086.
[14]Kirkpatrick J, Pascanu R, Rabinowitz N, et al. Overcoming catastrophic forgetting in neural networks[J]. Proceedings of the National Academy of Sciences, 2017, 114(13): 3521-3526.
[15]Li Z, Hoiem D. Learning without forgetting[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2017, 40(12): 2935-2947.
[16]Psaltis A, Chatzikonstantinou C, Patrikakis C Z, et al. FedRCIL: federated knowledge distillation for representation based contrastive incremental learning[C]//Proceedings of the IEEE/CVF International Conference on Computer Vision. 2023: 3463-3472.
[17]Zhang J, Chen C, Zhuang W, et al. TARGET: federated class-continual learning via exemplar-free distillation[C]//Proceedings of the IEEE/CVF International Conference on Computer Vision. 2023: 4782-4793.
[18]Tran M T, Le T, Le X M, et al. Text-enhanced data-free approach for federated class-incremental learning[C]//Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition. 2024: 23870-23880.
[19]Wu F, Tan A Z, Feng S, et al. Federated Class-Incremental Learning via Weighted Aggregation and Distillation[J]. IEEE Internet of Things Journal, 2025.
[20]Wu Z Y, He T L, Sun S, et al. Federated class-incremental learning with new-class augmented self-distillation[J]. Journal of Computer Science and Technology, 2025, 40: 1427-1437.
[21]Yoo M K, Park Y R. Federated class incremental learning: A pseudo feature based approach without exemplars[C]//Proceedings of the Asian Conference on Computer Vision. 2024: 488-498.
[22]Salami R, Buzzega P, Mosconi M, et al. Federated class-incremental learning with hierarchical generative prototypes[J]. arXiv preprint arXiv:2406.02447, 2024.
[23]Garg D, Sanyal D, Lee M, et al. Client Availability in Federated Learning: It Matters![C]//Proceedings of the 5th Workshop on Machine Learning and Systems. 2025: 114-121.
[24]Crawshaw M, Liu M. Federated learning under periodic client participation and heterogeneous data: A new communication-efficient algorithm and analysis[J]. Advances in Neural Information Processing Systems, 2024, 37: 8240-8299.
[25]Ren L, Dong J, Huang D, et al. Digital twin robotic system with continuous learning for grasp detection in variable scenes[J]. IEEE Transactions on Industrial Electronics, 2023, 71(7): 7653-7663.
[26]Jafari M, Kavousi-Fard A, Chen T, et al. A review on digital twin technology in smart grid, transportation system and smart city: challenges and future[J]. IEEE Access, 2023, 11: 17471-17484.
[27]White G, Zink A, Codecá L, et al. A digital twin smart city for citizen feedback[J]. Cities, 2021, 110: 103064.
[28]Sun W, Lei S, Wang L, et al. Adaptive federated learning and digital twin for industrial internet of things[J]. IEEE Transactions on Industrial Informatics, 2020, 17(8): 5605-5614.
[29]Xia Y, Chen Y, Zhao Y, et al. FCLLM-DT: enpowering federated continual learning with large language models for digital twin-based industrial IoT[J]. IEEE Internet of Things Journal, 2024.
[30]Gao Q, Zhao C, Ghanem B, et al. R-DFCIL: relation-guided representation learning for data-free class incremental learning[C]//European Conference on Computer Vision. Cham: Springer Nature Switzerland, 2022: 423-439.
[31]Gao X, Yang X, Yu H, et al. FedProK: trustworthy federated class-incremental learning via prototypical feature knowledge transfer[C]//Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition. 2024: 4205-4214.
[32]Chaudhry A, Khan N, Dokania P, et al. Continual learning in low-rank orthogonal subspaces[J]. Advances in Neural Information Processing Systems, 2020, 33: 9900-9911.
[33]Parisi G I, Kemker R, Part J L, et al. Continual lifelong learning with neural networks: a review[J]. Neural Networks, 2019, 113: 54-71.
[34]Kim G, Kim J, Han B. Communication-efficient federated learning with accelerated client gradient[C]//Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition. 2024: 12385-12394.
[35]Stripelis D, Gupta U, Steeg G V, et al. Federated progressive sparsification (purge, merge, tune)+[J]. arXiv preprint arXiv:2204.12430, 2022.
[36]Krizhevsky A, Hinton G. Learning multiple layers of features from tiny images[R]. Toronto: University of Toronto, 2009.
[37]Deng J, Dong W, Socher R, et al. ImageNet: a large-scale hierarchical image database[C]//2009 IEEE Conference on Computer Vision and Pattern Recognition. IEEE, 2009: 248-255.
[38]Krause J, Stark M, Deng J, et al. 3D object representations for fine-grained categorization[C]//Proceedings of the IEEE International Conference on Computer Vision Workshops. 2013: 554-561.