
Computer Engineering ›› 2008, Vol. 34 ›› Issue (14): 208-209. doi: 10.3969/j.issn.1000-3428.2008.14.074

• Artificial Intelligence and Recognition Technology •


Dynamic Supervised Forward Propagation Network for Pattern Classification

DENG Wei1, SU Mei-juan1, DONG En-qing2   

  (1. School of Computer Science & Technology, Soochow University, Suzhou 215006; 2. School of Electronic Information, Soochow University, Suzhou 215006)
  • Received:1900-01-01 Revised:1900-01-01 Online:2008-07-20 Published:2008-07-20



Abstract: This paper presents a new type of neural network for pattern classification, the Dynamic Supervised Forward Propagation Network (DSFPN). The network is based on the improved forward-only version of the Counterpropagation Network (CPN), but its hidden layer is trained in a supervised manner using a modified second version of the Learning Vector Quantization (LVQ2) algorithm together with an incremental training strategy. During training, new hidden neurons are created according to a suitability measure, so the hidden layer grows dynamically. Experimental results on Cone-Torus planar point classification and speaker-independent isolated digit speech recognition show that DSFPN requires two orders of magnitude less training time than the Multilayer Perceptron (MLP) and trains faster than the improved CPN, while its best testing accuracies reach 92.25% and 98.7%, respectively, higher than those of the other two networks.
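The abstract's core mechanism, a supervisedly trained prototype (hidden) layer that grows a new neuron whenever no existing neuron of the correct class is suitable, can be sketched as follows. This is a minimal illustration only: the class name `DynamicLVQLayer`, the distance-threshold notion of "suitability", and the single-winner LVQ update rule are all assumptions for illustration; the paper itself uses a modified LVQ2 rule and its own suitability measure, neither of which is specified in the abstract.

```python
import math

def euclidean(a, b):
    """Euclidean distance between two equal-length vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

class DynamicLVQLayer:
    """Illustrative DSFPN-style hidden layer: supervised prototype
    training with dynamic growth. Not the authors' exact algorithm."""

    def __init__(self, suitability_radius=1.0, lr=0.1):
        self.protos = []                 # list of (weight_vector, class_label)
        self.radius = suitability_radius # assumed "suitability" threshold
        self.lr = lr                     # learning rate

    def nearest(self, x):
        """Index of the nearest prototype, or None if the layer is empty."""
        if not self.protos:
            return None
        return min(range(len(self.protos)),
                   key=lambda i: euclidean(self.protos[i][0], x))

    def train_one(self, x, label):
        """One supervised update; grows the layer if no suitable
        same-class prototype exists (the dynamic-growth step)."""
        i = self.nearest(x)
        same_class_dist = min((euclidean(w, x) for w, c in self.protos
                               if c == label), default=float("inf"))
        if i is None or (self.protos[i][1] != label
                         and same_class_dist > self.radius):
            # No suitable neuron of the correct class: create one.
            self.protos.append((list(x), label))
            return
        # LVQ-style update: attract the winner if its class matches,
        # otherwise repel it.
        w, c = self.protos[i]
        step = self.lr if c == label else -self.lr
        self.protos[i] = ([wi + step * (xi - wi)
                           for wi, xi in zip(w, x)], c)

    def classify(self, x):
        i = self.nearest(x)
        return None if i is None else self.protos[i][1]
```

Feeding the layer a stream of labeled samples (the incremental strategy mentioned in the abstract) makes it start empty and allocate prototypes on demand, so model size adapts to the data rather than being fixed in advance.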

Key words: pattern classification, neural network, supervised training, dynamic growth
