
Computer Engineering, 2023, Vol. 49, Issue (4): 120-124. doi: 10.19678/j.issn.1000-3428.0064345

• Artificial Intelligence and Pattern Recognition •

Filter Competition Training of Neural Networks

AN Zhiguo, PENG Zheng, YI Mancheng, LIU Jianxin, YU Sifan

  1. Guangzhou Power Supply Bureau of Guangdong Power Grid Co., Ltd., Guangzhou 510630, China
  • Received: 2022-03-31  Revised: 2022-05-03  Published: 2022-06-20
  • About the authors: AN Zhiguo (born 1977), male, senior engineer, M.S., main research interest: intelligent management of electric power safety; PENG Zheng, engineer; YI Mancheng, senior engineer, M.S.; LIU Jianxin (corresponding author), B.S.; YU Sifan, engineer.
  • Funding: Science and Technology Project of China Southern Power Grid Co., Ltd. (GZHKJXM20200058).

Abstract: Pruning and reactivating unimportant weight elements can prevent the overparameterization of neural networks. However, reactivation is typically performed at the granularity of entire filters, which limits classification accuracy. To address this problem, a filter weight competition training algorithm is proposed for training neural networks. Low-quality filters are selected and located at both local and global scales, and the corresponding high-quality filters are found through a forward matching strategy. The best and second-best weight elements of each high-quality filter then cross-update the second-worst and worst weight elements of the matched low-quality filter, reactivating, within the network structure, weights trapped in local extrema. Experimental results show that, with filter weight competition training, ordinary neural networks such as ResNet and DenseNet improve their classification accuracy on the CIFAR dataset and their Top-1 accuracy on the ImageNet dataset by 0.79 and 1.13 percentage points on average, respectively, compared with the baseline networks; lightweight neural networks such as MobileNet and ShuffleNet improve by 2.22 and 2.93 percentage points. The resulting image classification performance is better than that of existing filter competition training algorithms.
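The abstract describes the update rule only in prose. Below is a minimal NumPy sketch of one plausible reading of the core cross-update step for a single convolutional layer. The quality ranking by L1 norm, the worst-to-best pairing used here as the "forward matching", and the exact crossed assignment of elements are all illustrative assumptions, not the paper's definitions.

```python
import numpy as np

def competition_update(filters: np.ndarray, frac: float = 0.1) -> np.ndarray:
    """Cross-update the weakest filters of one layer in place.

    filters: shape (num_filters, fan_in) -- flattened conv kernels.
    frac:    fraction of filters treated as low-quality.

    Assumptions (not from the paper): filter quality is ranked by
    L1 norm, and forward matching pairs the i-th worst filter with
    the i-th best.
    """
    n = filters.shape[0]
    k = max(1, int(frac * n))
    quality = np.abs(filters).sum(axis=1)   # L1 norm per filter
    order = np.argsort(quality)             # ascending: worst first
    low_q = order[:k]                       # k low-quality filters
    high_q = order[::-1][:k]                # k high-quality filters

    for lo, hi in zip(low_q, high_q):
        good, bad = filters[hi], filters[lo]   # row views, updated in place
        g = np.argsort(np.abs(good))           # element ranks, ascending
        b = np.argsort(np.abs(bad))
        best, second_best = g[-1], g[-2]
        worst, second_worst = b[0], b[1]
        # Cross update: best element overwrites the second-worst, the
        # second-best overwrites the worst. The crossed pairing is an
        # assumed interpretation of the abstract's wording.
        bad[second_worst] = good[best]
        bad[worst] = good[second_best]
    return filters

# Example: 16 flattened 3x3 kernels.
rng = np.random.default_rng(0)
w = rng.normal(size=(16, 9))
competition_update(w)
```

Applied periodically between gradient steps, an update of this form would leave the network architecture and inference cost untouched, which is consistent with the "plug-in training" keyword below.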

Key words: neural network, weight competition, reactivation, filter pruning, plug-in training

CLC Number: