[1] LI J, LIU H. Challenges of feature selection for big data analytics[J]. IEEE Intelligent Systems, 2017, 32(2): 9-15.
[2] BOLON-CANEDO V, SANCHEZ-MARONO N, ALONSO-BETANZOS A. Feature selection for high dimensional data[J]. Progress in Artificial Intelligence, 2016, 5(2): 65-75.
[3] DESSÌ N, PES B. Similarity of feature selection methods: an empirical study across data intensive classification tasks[J]. Expert Systems with Applications, 2015, 42(10): 4632-4642.
[4] 初蓓, 李占山, 张梦林, 等. 基于森林优化特征选择算法的改进研究[J]. 软件学报, 2018, 29(9): 2547-2558.
CHU B, LI Z S, ZHANG M L, et al. Research on improvements of feature selection using forest optimization algorithm[J]. Journal of Software, 2018, 29(9): 2547-2558. (in Chinese)
[5] 于巧, 姜淑娟, 张艳梅, 等. 分类不平衡对软件缺陷预测模型性能的影响研究[J]. 计算机学报, 2018, 41(4): 809-824.
YU Q, JIANG S J, ZHANG Y M, et al. The impact study of class imbalance on the performance of software defect prediction models[J]. Chinese Journal of Computers, 2018, 41(4): 809-824. (in Chinese)
[6] HUANG X, ZOU Y, WANG Y, et al. Cost-sensitive sparse linear regression for crowd counting with imbalanced training data[C]//Proceedings of International Conference on Multimedia and Expo. Washington D.C., USA: IEEE Press, 2016: 1-6.
[7] FORMAN G. An extensive empirical study of feature selection metrics for text classification[J]. Journal of Machine Learning Research, 2003, 3(2): 1289-1305.
[8] CHEN X, WASIKOWSKI M. FAST: a ROC-based feature selection metric for small samples and imbalanced data classification problems[C]//Proceedings of the 14th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. New York, USA: ACM Press, 2008: 124-132.
[9] SUN Z, BEBIS G, MILLER R H, et al. Object detection using feature subset selection[J]. Pattern Recognition, 2004, 37(11): 2165-2176.
[10] SOMOL P, PUDIL P, KITTLER J, et al. Fast branch & bound algorithms for optimal feature selection[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2004, 26(7): 900-912.
[11] WUTZL B, LEIBNITZ K, RATTAY F, et al. Genetic algorithms for feature selection when classifying severe chronic disorders of consciousness[J]. PLoS ONE, 2019, 14(7): 1265-1278.
[12] 李元香, 项正龙, 张伟艳. 模拟退火算法的弛豫模型与时间复杂性分析[J]. 计算机学报, 2020, 43(5): 796-811.
LI Y X, XIANG Z L, ZHANG W Y. A relaxation model and time complexity analysis for simulated annealing algorithm[J]. Chinese Journal of Computers, 2020, 43(5): 796-811. (in Chinese)
[13] GHIMATGAR H, KAZEMI K, HELFROUSH M S, et al. An improved feature selection algorithm based on graph clustering and ant colony optimization[J]. Knowledge-Based Systems, 2018, 159: 270-285.
[14] IMOLA F. A survey of dimension reduction techniques[J]. Neoplasia, 2002, 7(5): 475-485.
[15] KIRA K, RENDELL L A. The feature selection problem: traditional methods and a new algorithm[C]//Proceedings of the 10th National Conference on Artificial Intelligence. San Jose, USA: AAAI Press, 1992: 129-134.
[16] KONONENKO I. Estimating attributes: analysis and extensions of RELIEF[C]//Proceedings of European Conference on Machine Learning. Berlin, Germany: Springer, 1994: 171-182.
[17] 张振海, 李士宁, 李志刚, 等. 一类基于信息熵的多标签特征选择算法[J]. 计算机研究与发展, 2013, 50(6): 1177-1184.
ZHANG Z H, LI S N, LI Z G, et al. Multi-label feature selection algorithm based on information entropy[J]. Journal of Computer Research and Development, 2013, 50(6): 1177-1184. (in Chinese)
[18] PÁRRAGA-VALLE J, GARCÍA-BERMÚDEZ R, ROJAS F, et al. Evaluating mutual information and chi-square metrics in text features selection process: a study case applied to the text classification in PubMed[C]//Proceedings of International Work-Conference on Bioinformatics and Biomedical Engineering. Berlin, Germany: Springer, 2020: 636-646.
[19] NAGPAL A, SINGH V. Feature selection from high dimensional data based on iterative qualitative mutual information[J]. Journal of Intelligent and Fuzzy Systems, 2019, 36(6): 5845-5856.
[20] CHAWLA N V, BOWYER K W, HALL L O, et al. SMOTE: synthetic minority over-sampling technique[J]. Journal of Artificial Intelligence Research, 2002, 16(1): 321-357.
[21] ARIDAS C K, KARLOS S, KANAS V G, et al. Uncertainty based under-sampling for learning naive Bayes classifiers under imbalanced data sets[J]. IEEE Access, 2020, 7: 2122-2133.
[22] ZHANG S. Decision tree classifiers sensitive to heterogeneous costs[J]. Journal of Systems and Software, 2012, 85(4): 771-779.
[23] LOYOLA-GONZALEZ O, MARTINEZ-TRINIDAD J F, CARRASCO-OCHOA J A, et al. Cost-sensitive pattern-based classification for class imbalance problems[J]. IEEE Access, 2019, 7: 60411-60427.
[24] ABDOH S F, RIZKA M A, MAGHRABY F A, et al. Cervical cancer diagnosis using random forest classifier with SMOTE and feature reduction techniques[J]. IEEE Access, 2018, 5: 59475-59485.
[25] SUN J, LEE Y, LI H, et al. Combining B&B-based hybrid feature selection and the imbalance-oriented multiple-classifier ensemble for imbalanced credit risk assessment[J]. Technological & Economic Development of Economy, 2015, 21(3): 351-378.
[26] BIAN J, PENG X, WANG Y, et al. An efficient cost-sensitive feature selection using chaos genetic algorithm for class imbalance problem[J]. Mathematical Problems in Engineering, 2016(10): 1-9.
[27] MOAYEDIKIA A, ONG K, BOO Y L, et al. Feature selection for high dimensional imbalanced class data using harmony search[J]. Engineering Applications of Artificial Intelligence, 2017, 57: 38-49.
[28] GUYON I, ELISSEEFF A. An introduction to variable and feature selection[J]. Journal of Machine Learning Research, 2003, 3(6): 1157-1182.
[29] JAKULIN A, BRATKO I. Testing the significance of attribute interactions[C]//Proceedings of the 21st IEEE International Conference on Machine Learning. Washington D.C., USA: IEEE Press, 2004: 409-416.
[30] JAKULIN A. Attribute interactions in machine learning[D]. [S.l.]: University of Ljubljana, 2003.