WANG Xiaoming, XU Tao, RAN Biao
The existing Support Vector Guided Dictionary Learning (SVGDL) algorithm is based on the principle of large-margin classification. When establishing the decision hyperplane, the algorithm considers only the boundary samples of each class of coding vectors and ignores data distribution information, which limits the generalization ability of the model. To address this problem, this paper proposes a Minimum Class Variance Support Vector Guided Dictionary Learning (MCVGDL) algorithm. First, MCVGDL adopts the Minimum Class Variance Support Vector Machine (MCVSVM), which combines Fisher linear discriminant analysis with the large-margin classification principle of the Support Vector Machine (SVM), as the discriminant term. Second, during the alternating optimization of the model's classifiers, MCVGDL comprehensively takes the distribution information of the coding vectors into account, which guarantees the overall consistency of the coding vectors of same-class samples, reduces the coupling between corresponding components of the vectors, and modifies the SVM classification vectors. The discriminative information of the coding vectors can thus be fully exploited to better guide dictionary learning and improve classification performance. Experimental results on face, object, and handwritten digit recognition datasets show that, in terms of recognition rate and atom robustness, the proposed algorithm outperforms classical dictionary learning algorithms such as K-Singular Value Decomposition (KSVD) and Locality Constrained and Label Embedding Dictionary Learning (LCLE-DL).
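The core idea of the MCVSVM discriminant term described above — replacing the SVM's isotropic regularizer ||w||² with wᵀS_w w, where S_w is the within-class scatter matrix, so that the hyperplane reflects the class distributions — can be illustrated with a minimal sketch. This is not the paper's optimization procedure; the subgradient solver, hyperparameters, and regularization constant below are illustrative assumptions only.

```python
import numpy as np

def mcvsvm_train(X, y, lam=1.0, lr=0.01, epochs=500):
    """Toy binary MCVSVM (labels in {+1, -1}).

    Minimizes  w^T S_w w + lam * sum(hinge losses)  by subgradient
    descent, where S_w is the within-class scatter matrix. Using
    S_w instead of the identity (plain SVM) is what injects the
    data-distribution information into the hyperplane.
    Illustrative sketch only; all hyperparameters are assumptions.
    """
    d = X.shape[1]
    # Within-class scatter: sum of per-class centered outer products.
    Sw = np.zeros((d, d))
    for c in np.unique(y):
        diff = X[y == c] - X[y == c].mean(axis=0)
        Sw += diff.T @ diff
    Sw += 1e-3 * np.eye(d)  # small ridge so Sw is well-conditioned

    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        margins = y * (X @ w + b)
        mask = margins < 1  # samples violating the margin
        grad_w = Sw @ w - lam * (y[mask, None] * X[mask]).sum(axis=0)
        grad_b = -lam * y[mask].sum()
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b
```

Setting `Sw = np.eye(d)` recovers an ordinary linear SVM, which makes the contrast with the distribution-aware regularizer explicit.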