[1] SIMONYAN K, ZISSERMAN A. Very deep convolutional networks for large-scale image recognition[EB/OL]. [2022-02-05]. https://arxiv.org/abs/1409.1556.
[2] SZEGEDY C, IOFFE S, VANHOUCKE V, et al. Inception-v4, Inception-ResNet and the impact of residual connections on learning[J]. Proceedings of the AAAI Conference on Artificial Intelligence, 2017, 31(1): 15-20.
[3] HE K M, ZHANG X Y, REN S Q, et al. Deep residual learning for image recognition[EB/OL]. [2022-02-05]. https://arxiv.org/abs/1512.03385.
[4] KRIZHEVSKY A, SUTSKEVER I, HINTON G E. ImageNet classification with deep convolutional neural networks[J]. Communications of the ACM, 2017, 60(6): 84-90.
[5] LIU N, MA X L, XU Z Y, et al. AutoCompress: an automatic DNN structured pruning framework for ultra-high compression rates[J]. Proceedings of the AAAI Conference on Artificial Intelligence, 2020, 34(4): 4876-4883.
[6] DENIL M, SHAKIBI B, DINH L, et al. Predicting parameters in deep learning[EB/OL]. [2022-02-05]. https://arxiv.org/abs/1306.0543.
[7] ZENG H Q, HU H L, LIN X W, et al. Deep neural network compression and acceleration: an overview[J]. Journal of Signal Processing, 2022, 38(1): 183-194. (in Chinese)
[8] LIN J D, WU X Y, CHAI Y, et al. Structure optimization of convolutional neural networks: a survey[J]. Acta Automatica Sinica, 2020, 46(1): 24-37. (in Chinese)
[9] LI J Y, ZHAO Y K, XUE Z E, et al. A survey of model compression for deep neural networks[J]. Chinese Journal of Engineering, 2019, 41(10): 1229-1239. (in Chinese)
[10] LI S Y, HANSON E, QIAN X H, et al. ESCALATE: boosting the efficiency of sparse CNN accelerator with kernel decomposition[C]//Proceedings of the 54th Annual IEEE/ACM International Symposium on Microarchitecture. Washington D.C., USA: IEEE Press, 2021: 992-1004.
[11] HAN S, MAO H Z, DALLY W J. Deep compression: compressing deep neural networks with pruning, trained quantization and Huffman coding[EB/OL]. [2022-02-05]. https://arxiv.org/abs/1510.00149.
[12] AGHLI N, RIBEIRO E. Combining weight pruning and knowledge distillation for CNN compression[C]//Proceedings of IEEE/CVF Conference on Computer Vision and Pattern Recognition. Washington D.C., USA: IEEE Press, 2021: 3185-3192.
[13] LI H, KADAV A, DURDANOVIC I, et al. Pruning filters for efficient ConvNets[EB/OL]. [2022-02-05]. https://www.cs.umd.edu/~hjs/pubs/nipswemdnn16-header.pdf.
[14] HE Y, LIU P, WANG Z W, et al. Filter pruning via geometric median for deep convolutional neural networks acceleration[C]//Proceedings of IEEE/CVF Conference on Computer Vision and Pattern Recognition. Washington D.C., USA: IEEE Press, 2019: 4335-4344.
[15] HE Y H, ZHANG X Y, SUN J. Channel pruning for accelerating very deep neural networks[C]//Proceedings of IEEE International Conference on Computer Vision. Washington D.C., USA: IEEE Press, 2017: 1398-1406.
[16] ANWAR S, HWANG K, SUNG W. Structured pruning of deep convolutional neural networks[J]. ACM Journal on Emerging Technologies in Computing Systems, 2017, 13(3): 1-18.
[17] LU H W, XIA H F, YUAN X T. Dynamic network pruning via filter attention mechanism and feature scaling factor[J]. Journal of Chinese Computer Systems, 2019, 40(9): 1832-1838. (in Chinese)
[18] LIU Z, LI J G, SHEN Z Q, et al. Learning efficient convolutional networks through network slimming[C]//Proceedings of IEEE International Conference on Computer Vision. Washington D.C., USA: IEEE Press, 2017: 2755-2763.
[19] ALQAHTANI A. Pruning CNN filters via quantifying the importance of deep visual representations[J]. Computer Vision and Image Understanding, 2021, 208/209: 103220.
[20] KUMAR A, SHAIKH A M, LI Y, et al. Pruning filters with L1-norm and capped L1-norm for CNN compression[J]. Applied Intelligence, 2021, 51(2): 1152-1160.
[21] CHANG X P, PAN H H, LIN W Y, et al. A mixed-pruning based framework for embedded convolutional neural network acceleration[J]. IEEE Transactions on Circuits and Systems I: Regular Papers, 2021, 68(4): 1706-1715.
[22] DING X H, DING G G, GUO Y C, et al. Approximated oracle filter pruning for destructive CNN width optimization[EB/OL]. [2022-02-05]. https://arxiv.org/abs/1905.04748.
[23] YEOM S K, SEEGERER P, LAPUSCHKIN S, et al. Pruning by explaining: a novel criterion for deep neural network pruning[J]. Pattern Recognition, 2021, 115: 107899.
[24] KRIZHEVSKY A. Learning multiple layers of features from tiny images[EB/OL]. [2022-02-05]. http://www.cs.toronto.edu/~kriz/learning-features-2009-TR.pdf.
[25] LIN M, CHEN Q, YAN S C. Network in network[EB/OL]. [2022-02-05]. https://arxiv.org/abs/1312.4400.
[26] IOFFE S, SZEGEDY C. Batch normalization: accelerating deep network training by reducing internal covariate shift[EB/OL]. [2022-02-05]. https://arxiv.org/abs/1502.03167.
[27] HE Y, KANG G L, DONG X Y, et al. Soft filter pruning for accelerating deep convolutional neural networks[EB/OL]. [2022-02-05]. https://arxiv.org/abs/1808.06866.
[28] MOLCHANOV P, TYREE S, KARRAS T, et al. Pruning convolutional neural networks for resource efficient inference[EB/OL]. [2022-02-05]. https://arxiv.org/abs/1611.06440.
[29] ZHAO C, NI B, ZHANG J, et al. Variational convolutional neural network pruning[C]//Proceedings of IEEE/CVF Conference on Computer Vision and Pattern Recognition. Washington D.C., USA: IEEE Press, 2019: 2780-2789.