
Computer Engineering ›› 2011, Vol. 37 ›› Issue (13): 66-67,70. doi: 10.3969/j.issn.1000-3428.2011.13.020

• Software Technology and Database •

  • About the authors: ZHANG Lin (b. 1984), female, Ph.D. candidate; research interests: data mining and decision support. CHEN Yan, professor and doctoral supervisor. LI Tao-ying and MU Xiang-wei, Ph.D. candidates.
  • Fund programs:
    Supported by the National Natural Science Foundation of China (No. 70940008) and the Specialized Research Fund for the Doctoral Program of Higher Education (No. 200801510001)

Research on Decision Tree Classification Algorithms

ZHANG Lin, CHEN Yan, LI Tao-ying, MU Xiang-wei   

  1. (College of Transportation Management, Dalian Maritime University, Dalian 116026, China)
  • Received: 2010-11-15; Online: 2011-07-05; Published: 2011-07-05



Abstract: The ID3 algorithm is biased toward selecting attributes with many distinct values as splitting attributes. To address this problem, this paper introduces two parameters, attribute importance and the number of attribute values, into the information-gain formula of ID3. The modification raises the weight of critical attributes that have few values, so the algorithm better reflects actual decision-making situations. In addition, the properties of convex functions are used to simplify the calculation of information entropy, improving the efficiency of decision-tree construction. A concrete example illustrates the application of the improved algorithm, and the result shows that it outperforms the original algorithm.

Key words: ID3 algorithm, information gain, attribute importance, number of attribute values, information entropy
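To make the idea in the abstract concrete, the following Python sketch computes standard ID3 information gain and adds a hypothetical weighted variant that scales gain by a user-supplied attribute-importance weight and penalizes attributes with many distinct values. The abstract does not reproduce the authors' exact improved formula, so the function names, the `importance` parameter, and the logarithmic penalty below are illustrative assumptions, not the paper's method:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(rows, labels, attr_index):
    """Standard ID3 information gain of one attribute.

    rows: list of tuples of attribute values; labels: class labels.
    """
    n = len(labels)
    # Partition the labels by the attribute's value.
    partitions = {}
    for row, y in zip(rows, labels):
        partitions.setdefault(row[attr_index], []).append(y)
    # Expected entropy after splitting on this attribute.
    remainder = sum(len(part) / n * entropy(part) for part in partitions.values())
    return entropy(labels) - remainder

def weighted_gain(rows, labels, attr_index, importance):
    """Hypothetical weighted variant (NOT the paper's exact formula):
    multiply the gain by an attribute-importance weight and divide by a
    log of the number of distinct values, so a critical attribute with
    few values is no longer dominated by many-valued attributes."""
    gain = information_gain(rows, labels, attr_index)
    num_values = len({row[attr_index] for row in rows})
    return importance * gain / math.log2(num_values + 1)
```

For example, on a tiny dataset where attribute 0 perfectly predicts the class and attribute 1 is uninformative, `information_gain` returns 1.0 and 0.0 respectively; the weighted variant then shrinks the gain of attribute 0 by its value-count penalty while an `importance` weight above 1 would raise it.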

CLC Number: