Computer Engineering ›› 2019, Vol. 45 ›› Issue (8): 190-197. doi: 10.19678/j.issn.1000-3428.0051610

• Artificial Intelligence and Recognition Technology •

  • About the author: ZHANG Wenjin (b. 1972), male, associate professor, M.S.; his main research interests include data mining and image processing.
  • Funding: Science and Technology Program of Guangdong Province (2015A030401005).

An Online Multi-Task Learning Algorithm Based on Weight Matrix Decomposition

ZHANG Wenjin   

  1. Department of Information Engineering, Guangzhou Railway Polytechnic, Guangzhou 510430, China
  • Received: 2018-05-21 Revised: 2018-07-10 Online: 2019-08-15 Published: 2019-08-08


Abstract: Most online Multi-Task Learning (MTL) algorithms constrain task relatedness through a single weight matrix, and this constraint is so strict that it rarely holds in practice. An improved online MTL algorithm is therefore proposed that overcomes the constraint by decomposing the weight matrix into two sub-matrices. A trace-norm regularization is imposed on the first sub-matrix to induce a low-rank correlative structure, while a group Lasso penalty over individual tasks is applied to the second sub-matrix through a regularization term to identify personalized patterns. A projected gradient algorithm is used to learn the two sub-matrices adaptively and obtain the optimal solution. Experimental results show that the proposed algorithm achieves sub-linear regret with respect to the best linear model in hindsight, outperforms TRML, MTFL and other algorithms in prediction accuracy and running speed, and reduces the cumulative error rate on the spam e-mail dataset to 4.97%.
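The update scheme summarized in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the squared loss, step sizes, function names, and the use of standard proximal operators (singular-value soft-thresholding for the trace norm, column-wise shrinkage for the group Lasso over tasks) are all assumptions filled in from the general literature on this decomposition.

```python
import numpy as np

def prox_trace(U, tau):
    # Proximal operator of the trace norm: soft-threshold the singular values,
    # which pushes the shared sub-matrix U toward a low-rank structure.
    P, s, Qt = np.linalg.svd(U, full_matrices=False)
    return P @ np.diag(np.maximum(s - tau, 0.0)) @ Qt

def prox_group_lasso(V, tau):
    # Proximal operator of the group Lasso with one group per task (column):
    # shrink each column's norm, zeroing out tasks with no personalized part.
    norms = np.linalg.norm(V, axis=0, keepdims=True)
    scale = np.maximum(1.0 - tau / np.maximum(norms, 1e-12), 0.0)
    return V * scale

def online_mtl_step(U, V, x, y, task, eta, lam1, lam2):
    # One online round on example (x, y) of the given task, assuming squared
    # loss: gradient step on both sub-matrices, then the two proximal steps.
    w = U[:, task] + V[:, task]          # combined weights for this task
    err = float(w @ x - y)               # prediction error
    U, V = U.copy(), V.copy()
    U[:, task] -= eta * err * x          # update shared (low-rank) part
    V[:, task] -= eta * err * x          # update personalized (sparse) part
    return prox_trace(U, eta * lam1), prox_group_lasso(V, eta * lam2)
```

With a feature dimension of d and T tasks, U and V are d-by-T; the learned model for task t is the sum of the t-th columns of the two sub-matrices.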

Key words: Multi-Task Learning(MTL), weight matrix, correlative structure, personalized patterns, sub-linear regret

CLC number: