
Computer Engineering (计算机工程)

• Artificial Intelligence and Recognition Technology •

Improved Piecewise Nonlinear Combinatorial Adaboost Algorithm Based on Noise Self-detection

ZHANG Cai, CHEN Youguang

  1. (Computing Center, East China Normal University, Shanghai 200062, China)
  • Received: 2016-05-18  Online: 2017-05-15  Published: 2017-05-15
  • About the authors: ZHANG Cai (born 1991), male, M.S. candidate; his main research interests are data mining and graphics processing. CHEN Youguang (corresponding author), associate professor, Ph.D.

Abstract: To address the sensitivity of the traditional Adaboost algorithm to noisy samples and the unreasonableness of linearly adding its base classifiers, a piecewise nonlinear combinatorial Adaboost algorithm with noise self-detection, named NDK Adaboost, is proposed. Exploiting the property that the training error rate of traditional Adaboost decreases exponentially with the number of iterations, NDK Adaboost directly constructs a noise-detection model to identify noisy samples. In the prediction stage, each sample to be predicted is mapped to its relative position among the training samples, and the weights of the base classifiers are determined by the distribution of its neighboring samples, so that the algorithm maintains high classification accuracy under different sample distributions. Experimental results show that NDK Adaboost achieves higher classification accuracy than the traditional Adaboost algorithm and related improved Adaboost algorithms.
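
As background for the exponential-decrease property that the noise self-detection relies on, the standard training-error bound for binary Adaboost (a well-known result, not restated in this abstract) can be written as follows, where ε_t = 1/2 − γ_t is the weighted error of the t-th base classifier and H is the final ensemble over m training samples:

    \frac{1}{m}\sum_{i=1}^{m}\mathbf{1}\{H(x_i)\neq y_i\}
      \;\le\; \prod_{t=1}^{T} 2\sqrt{\varepsilon_t(1-\varepsilon_t)}
      \;=\; \prod_{t=1}^{T}\sqrt{1-4\gamma_t^{2}}
      \;\le\; \exp\!\Bigl(-2\sum_{t=1}^{T}\gamma_t^{2}\Bigr)

As long as every base classifier keeps an edge γ_t > 0, the fraction of misclassified training samples therefore shrinks exponentially in the number of rounds T, which is the property the abstract refers to; under that property, samples the ensemble still gets wrong late in training are natural candidates for a noise flag (an interpretation of the idea, not a claim about the authors' exact model).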

Key words: noise detection, traditional Adaboost, piecewise, base classifier, neighbor sample, weight
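
To make the prediction-stage combination concrete, below is a minimal Python sketch assuming discrete Adaboost over decision stumps, Euclidean k-nearest neighbours to place a test sample among the training samples, and each base classifier weighted by its accuracy on that neighbourhood; the helper names, the choice of k, and the margin-based noise flag are illustrative assumptions rather than the authors' implementation.

    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    def train_adaboost(X, y, n_rounds=20):
        """Discrete Adaboost with decision stumps; labels must be in {-1, +1}."""
        n = len(y)
        w = np.full(n, 1.0 / n)
        stumps, alphas = [], []
        for _ in range(n_rounds):
            stump = DecisionTreeClassifier(max_depth=1)
            stump.fit(X, y, sample_weight=w)
            pred = stump.predict(X)
            err = np.clip(np.dot(w, pred != y), 1e-10, 1 - 1e-10)
            alpha = 0.5 * np.log((1.0 - err) / err)   # base classifier weight
            w *= np.exp(-alpha * y * pred)            # re-weight training samples
            w /= w.sum()
            stumps.append(stump)
            alphas.append(alpha)
        return stumps, np.array(alphas)

    def flag_noise(X, y, stumps, alphas):
        """Hypothetical noise check: samples the trained ensemble still
        misclassifies (negative margin) are flagged as likely noise."""
        F = sum(a * s.predict(X) for s, a in zip(stumps, alphas))
        return y * F < 0

    def predict_locally_weighted(x, X_train, y_train, stumps, k=15):
        """Piecewise/nonlinear combination: each stump is weighted by its
        accuracy on the k training samples nearest to the test point."""
        nn = np.argsort(np.linalg.norm(X_train - x, axis=1))[:k]
        score = 0.0
        for stump in stumps:
            local_acc = np.mean(stump.predict(X_train[nn]) == y_train[nn])
            score += local_acc * stump.predict(x.reshape(1, -1))[0]
        return 1 if score >= 0 else -1

Because the per-classifier weights change with the neighbourhood of each test point, the combination is a piecewise, nonlinear function of the input rather than the single global linear combination used by standard Adaboost, which mirrors the motivation stated in the abstract.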
