[1] Dong Yumin, Zhang Jing, Xie Changzuo, et al. A survey of key issues in edge intelligence computing under cloud-edge-end architecture: computation optimization and computation offloading[J]. Journal of Electronics & Information Technology, 2024, 46(3): 765-776.
[2] Chang Xilong, Liang Kun, Li Wentao. A review of advances in deep learning optimizers[J]. Computer Engineering and Applications, 2024, 60(7).
[3] Wang Qichao, Jin Guangshu, Li Qing, et al. Research status and prospects of industrial edge computing[J]. Information and Control, 2021, 50(3): 257-274.
[4] Bischl B, Binder M, Lang M, et al. Hyperparameter optimization: Foundations, algorithms, best practices, and open challenges[J]. Wiley Interdisciplinary Reviews: Data Mining and Knowledge Discovery, 2023, 13(2): e1484.
[5] Chiari M, De Pascalis M, Pradella M. Static analysis of infrastructure as code: a survey[C]//2022 IEEE 19th International Conference on Software Architecture Companion (ICSA-C). IEEE, 2022: 218-225.
[6] Bergstra J, Bengio Y. Random search for hyper-parameter optimization[J]. Journal of Machine Learning Research, 2012, 13(1): 281-305.
[7] Garnett R. Bayesian optimization[M]. Cambridge University Press, 2023.
[8] Katoch S, Chauhan S S, Kumar V. A review on genetic algorithm: past, present, and future[J]. Multimedia Tools and Applications, 2021, 80(5): 8091-8126.
[9] Li X, Zhang G, Zheng W. SmartTuning: selecting hyper-parameters of a ConvNet system for fast training and small working memory[J]. IEEE Transactions on Parallel and Distributed Systems, 2020, 32(7): 1690-1701.
[10] Jamieson K, Talwalkar A. Non-stochastic best arm identification and hyperparameter optimization[C]//Artificial Intelligence and Statistics. PMLR, 2016: 240-248.
[11] Li L, Jamieson K, DeSalvo G, et al. Hyperband: A novel bandit-based approach to hyperparameter optimization[J]. Journal of Machine Learning Research, 2018, 18(185): 1-52.
[12] Falkner S, Klein A, Hutter F. Practical hyperparameter optimization for deep learning[J]. 2018.
[13] Lu Z, Chiang C K, Sha F. Hyper-parameter tuning under a budget constraint[J]. arXiv preprint arXiv:1902.00532, 2019.
[14] Wu Q, Wang C, Huang S. Frugal optimization for cost-related hyperparameters[C]//Proceedings of the AAAI Conference on Artificial Intelligence. 2021, 35(12): 10347-10354.
[15] Wang C, Liu X, Awadallah A H. Cost-effective hyperparameter optimization for large language model generation inference[C]//International Conference on Automated Machine Learning. PMLR, 2023: 21/1-17.
[16] Zhang Q, Chen M, Bukharin A, et al. Adaptive budget allocation for parameter-efficient fine-tuning[C]//The Eleventh International Conference on Learning Representations. 2023.
[17] Zhang X, Wu H, Chang Z, et al. ResTune: Resource oriented tuning boosted by meta-learning for cloud databases[C]//Proceedings of the 2021 International Conference on Management of Data. 2021: 2102-2114.
[18] Zhang H, Zhang M, Liu X, et al. FedTune: Automatic tuning of federated learning hyper-parameters from system perspective[C]//MILCOM 2022-2022 IEEE Military Communications Conference (MILCOM). IEEE, 2022: 478-483.
[19] Zhang X, Fu L, Zhang H, et al. Federated Learning Hyper-Parameter Tuning for Edge Computing[M]//Edge Computing-Technology, Management and Integration. IntechOpen, 2023.
[20] Costa V G, Pedreira C E. Recent advances in decision trees: an updated survey[J]. Artificial Intelligence Review, 2023, 56(5): 4765-4800.
[21] Krizhevsky A, Hinton G. Convolutional deep belief networks on CIFAR-10[J]. Unpublished manuscript, 2010, 40(7): 1-9.
[22] Guo H, Mao Y, Zhang R. Augmenting data with mixup for sentence classification: An empirical study[J]. arXiv preprint arXiv:1905.08941, 2019.
[23] Pei Z, Cen Z, Huang Y, et al. BTTackler: A Diagnosis-based Framework for Efficient Deep Learning Hyperparameter Optimization[C]//Proceedings of the 30th ACM SIGKDD Conference on Knowledge Discovery and Data Mining. 2024: 2340-2351.
[24] Alnaasan N, Ramesh B, Yao J, et al. HyperSack: Distributed Hyperparameter Optimization for Deep Learning using Resource-Aware Scheduling on Heterogeneous GPU Systems[C]//2024 IEEE 31st International Conference on High Performance Computing, Data, and Analytics (HiPC). IEEE, 2024: 100-110.
[25] Candelieri A, Signori E. e²HPO: energy efficient Hyperparameter Optimization via energy-aware multiple information source Bayesian optimization[C]//The 19th Learning and Intelligent Optimization Conference.
[26] Deng S, Zhao H, Fang W, et al. Edge intelligence: The confluence of edge computing and artificial intelligence[J]. IEEE Internet of Things Journal, 2020, 7(8): 7457-7469.