1. VARGHESE J, VARGHEESE S K, PETER E. A study on artificial intelligence of things: techniques and applications. A Journal of Composition Theory, 2020, 8(3): 888-896.
2.
3. LI T, SAHU A K, TALWALKAR A, et al. Federated learning: challenges, methods, and future directions. IEEE Signal Processing Magazine, 2020, 37(3): 50-60. doi: 10.1109/MSP.2020.2975749
4. KAIROUZ P, MCMAHAN H B, AVENT B, et al. Advances and open problems in federated learning. Foundations and Trends in Machine Learning, 2021, 14(1/2): 1-210.
5.
6.
7.
8.
9.
10.
11. ZHANG X W, HONG M Y, DHOPLE S, et al. FedPD: a federated learning framework with optimal rates and adaptivity to non-IID data[EB/OL]. [2023-01-02]. https://arxiv.org/abs/2005.11418.pdf.
12.
13. BAI C T, CUI X L, LI A. Robust speech recognition technology based on federated learning with local distillation. Computer Engineering, 2022, 48(10): 103-109.
14. WANG S F, ZHANG Z, MA S Y, et al. A robust semi-supervised federated learning system. Computer Engineering, 2022, 48(6): 107-114, 123.
15. SUTSKEVER I, MARTENS J, DAHL G, et al. On the importance of initialization and momentum in deep learning[C]//Proceedings of the 30th International Conference on Machine Learning. New York, USA: ACM Press, 2013: 1-9.
16. KRIZHEVSKY A, SUTSKEVER I, HINTON G E. ImageNet classification with deep convolutional neural networks. Communications of the ACM, 2017, 60(6): 84-90. doi: 10.1145/3065386
17. OZFATURA E, OZFATURA K, GUNDUZ D. FedADC: accelerated federated learning with drift control[C]//Proceedings of 2021 IEEE International Symposium on Information Theory. Washington D. C., USA: IEEE Press, 2021: 467-472.
18. YANG Z J, BAO W, YUAN D, et al. Federated learning with Nesterov accelerated gradient. IEEE Transactions on Parallel and Distributed Systems, 2022, 33(12): 4863-4873. doi: 10.1109/TPDS.2022.3206480
19. LIU W, CHEN L, CHEN Y F, et al. Accelerating federated learning via momentum gradient descent. IEEE Transactions on Parallel and Distributed Systems, 2020, 31(8): 1754-1766. doi: 10.1109/TPDS.2020.2975189
20.
21.
22.
23. KHANDURI P, SHARMA P, YANG H, et al. STEM: a stochastic two-sided momentum algorithm achieving near-optimal sample and communication complexities for federated learning. Advances in Neural Information Processing Systems, 2021, 34(1): 6050-6061.
24.
25. XU A, HUANG H. Coordinating momenta for cross-silo federated learning. Proceedings of the AAAI Conference on Artificial Intelligence, 2022, 36(8): 8735-8743. doi: 10.1609/aaai.v36i8.20853
26. DEVOLDER O, GLINEUR F, NESTEROV Y. First-order methods of smooth convex optimization with inexact oracle. Mathematical Programming, 2014, 146(1): 37-75.
27. LESSARD L, RECHT B, PACKARD A. Analysis and design of optimization algorithms via integral quadratic constraints. SIAM Journal on Optimization, 2016, 26(1): 57-95. doi: 10.1137/15M1009597
28. POLYAK B T. Some methods of speeding up the convergence of iteration methods. USSR Computational Mathematics and Mathematical Physics, 1964, 4(5): 1-17.
29.
30. WANG S Q, TUOR T, SALONIDIS T, et al. Adaptive federated learning in resource constrained edge computing systems. IEEE Journal on Selected Areas in Communications, 2019, 37(6): 1205-1221.
31.
32.
33. HE K M, ZHANG X Y, REN S Q, et al. Deep residual learning for image recognition[C]//Proceedings of IEEE Conference on Computer Vision and Pattern Recognition. Washington D. C., USA: IEEE Press, 2016: 770-778.
34. NESTEROV Y. A method for unconstrained convex minimization problem with the rate of convergence O(1/k^2). Doklady Akademii Nauk SSSR, 1983, 269(3): 543-547.