Computer Engineering

• Artificial Intelligence and Recognition Technology •

A Regularization Mirror Descent Algorithm with Optimal Convergence Rate

WANG Jing-xiao, GAO Qian-kun, WANG Qun-shan

  1. (The 11th Department, Army Officer Academy of PLA, Hefei 230031, China)
  • Received: 2013-03-05  Online: 2014-06-15  Published: 2014-06-13
  • About the authors: WANG Jing-xiao (b. 1988), female, M.S. candidate; research interests: pattern recognition and artificial intelligence. GAO Qian-kun, M.S. candidate. WANG Qun-shan, lecturer.
  • Funding:
    National Natural Science Foundation of China (No. 61273296); Natural Science Foundation of Anhui Province (No. 1308085QF121).



Abstract: The Pegasos algorithm is an efficient method for solving large-scale Support Vector Machine (SVM) problems. Recent work shows that Pegasos achieves the optimal O(1/T) convergence rate by adding epochs to the stochastic gradient descent procedure. The COMID (Composite Objective Mirror Descent) algorithm is a regularized stochastic extension of the mirror descent method that preserves the structure of the regularizer; however, for strongly convex problems, its convergence rate is only O(logT/T). To address this problem, this paper introduces epochs into COMID and presents an optimal regularization mirror descent algorithm for problems with a hybrid L1+L2 regularizer. It proves that the algorithm achieves the optimal O(1/T) convergence rate while retaining the same sparsity as COMID. Experimental results on large-scale datasets confirm the theoretical analysis and the effectiveness of the proposed algorithm.
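As a rough illustration of the scheme the abstract describes, the sketch below runs a composite objective mirror descent update in epochs: with a Euclidean distance-generating function, the composite step for the L1+L2 regularizer has a closed-form soft-threshold-and-shrink solution, and each epoch doubles the iteration budget, halves the step size, and restarts from the averaged iterate. This is a minimal sketch assuming a hinge loss; the function names, step-size schedule, and parameter values are illustrative, not the paper's exact settings.

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1 (coordinate-wise shrinkage).
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def comid_epoch_l1l2(X, y, lam1=1e-3, lam2=1e-2, epochs=5, T1=100, eta1=0.1, seed=0):
    """Epoch-wise composite mirror descent sketch for
    hinge loss + lam1*||w||_1 + (lam2/2)*||w||_2^2.
    Each epoch doubles the iteration budget, halves the step size,
    and restarts from the epoch's averaged iterate (illustrative schedule)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    T, eta = T1, eta1
    for _ in range(epochs):
        w_sum = np.zeros(d)
        for _ in range(T):
            i = rng.integers(n)                                   # sample one example
            margin = y[i] * X[i].dot(w)
            g = -y[i] * X[i] if margin < 1.0 else np.zeros(d)     # hinge subgradient
            # COMID step with Euclidean mirror map: closed-form prox of L1+L2,
            # which keeps exact zeros in w (the sparsity the abstract mentions).
            w = soft_threshold(w - eta * g, eta * lam1) / (1.0 + eta * lam2)
            w_sum += w
        w = w_sum / T                                             # restart from the average
        T, eta = 2 * T, eta / 2
    return w
```

For example, on a synthetic linearly separable problem with a sparse true weight vector, the returned `w` classifies the training points well while the soft-threshold step keeps many coordinates exactly zero.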

Key words: machine learning, online learning, stochastic optimization, sparsity, L1 and L2 regularization, convergence rate
