[1] HOFMANN T. Probabilistic latent semantic analysis[C]//Proceedings of the 15th Conference on Uncertainty in Artificial Intelligence. [S.l.]: Morgan Kaufmann Publishers Inc., 1999: 289-296.
[2] BLEI D M, NG A Y, JORDAN M I. Latent Dirichlet allocation[J]. Journal of Machine Learning Research, 2003, 3(Jan): 993-1022.
[3] LIN C J. Projected gradient methods for nonnegative matrix factorization[J]. Neural Computation, 2007, 19(10): 2756-2779.
[4] THRUN S, MITCHELL T M. Lifelong robot learning[J]. Robotics and Autonomous Systems, 1995, 15(1/2): 25-46.
[5] CHEN Z Y, LIU B. Lifelong machine learning[J]. Synthesis Lectures on Artificial Intelligence and Machine Learning, 2016, 10(3): 1-145.
[6] CHEN Z Y, LIU B, BRACHMAN R, et al. Lifelong machine learning[M]. 2nd ed. Vermont, USA: Morgan & Claypool, 2018.
[7] CHEN Z Y, LIU B. Topic modeling using topics from many domains, lifelong learning and big data[C]//Proceedings of the 31st International Conference on Machine Learning. [S.l.]: JMLR, 2014: 1-10.
[8] 刘一宁, 申彦明. 基于终身机器学习的主题挖掘与评分预测联合模型[J]. 计算机工程, 2019, 45(6): 237-241, 248.
LIU Y N, SHEN Y M. Topic mining and ratings prediction joint model based on lifelong machine learning[J]. Computer Engineering, 2019, 45(6): 237-241, 248. (in Chinese)
[9] CHEN Z Y, LIU B. Mining topics in documents: standing on the shoulders of big data[C]//Proceedings of the 20th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. New York, USA: ACM Press, 2014: 1116-1125.
[10] HA Q T, PHAM T N, NGUYEN V Q, et al. A new lifelong topic modeling method and its application to Vietnamese text multi-label classification[C]//Proceedings of the Asian Conference on Intelligent Information and Database Systems. Berlin, Germany: Springer, 2018: 200-210.
[11] XU M Y, YANG R X, HARENBERG S, et al. A lifelong learning topic model structured using latent embeddings[C]//Proceedings of the 11th International Conference on Semantic Computing. Washington D.C., USA: IEEE Press, 2017: 260-261.
[12] XU Y S, YIN Y Y, YIN J W. Tackling topic general words in topic modeling[J]. Engineering Applications of Artificial Intelligence, 2017, 62: 124-133.
[13] CHEN Y, WU J J, LIN J Y, et al. Affinity regularized non-negative matrix factorization for lifelong topic modeling[J]. IEEE Transactions on Knowledge and Data Engineering, 2020, 32(7): 1249-1262.
[14] QIN X R, LU Y Y, CHEN Y F, et al. Lifelong learning of topics and domain-specific word embeddings[C]//Proceedings of ACL-IJCNLP 2021. Stroudsburg, USA: Association for Computational Linguistics, 2021: 2294-2309.
[15] QIAN J, WANG H, ELSHERIEF M, et al. Lifelong learning of hate speech classification on social media[C]//Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. Stroudsburg, USA: Association for Computational Linguistics, 2021: 2304-2314.
[16] GUPTA P, CHAUDHARY Y, RUNKLER T, et al. Neural topic modeling with continual lifelong learning[EB/OL]. [2021-08-10]. https://arxiv.org/abs/2006.10909.
[17] RAHUL A. Detection of intruders and flooding in VoIP using IDS, Jacobson fast and Hellinger distance algorithms[J]. IOSR Journal of Computer Engineering, 2012, 2(2): 30-36.
[18] ALER R, VALLS J M, BOSTRÖM H. Study of Hellinger distance as a splitting metric for random forests in balanced and imbalanced classification datasets[J]. Expert Systems with Applications, 2020, 149: 113264.
[19] 董明刚, 姜振龙, 敬超. 基于海林格距离和SMOTE的多类不平衡学习算法[J]. 计算机科学, 2020, 47(1): 102-109.
DONG M G, JIANG Z L, JING C. Multi-class imbalanced learning algorithm based on Hellinger distance and SMOTE algorithm[J]. Computer Science, 2020, 47(1): 102-109. (in Chinese)
[20] DELAMAIRE A, JUGANARU-MATHIEU M, BEIGBEDER M. Correlation between textual similarity and quality of LDA topic model results[C]//Proceedings of the 13th International Conference on Research Challenges in Information Science. Washington D.C., USA: IEEE Press, 2019: 1-6.
[21] WANG Y, HOUGEN C, OSELIO B, et al. A geometry-driven longitudinal topic model[J]. Harvard Data Science Review, 2021, 3(2): 1-10.
[22] MIKOLOV T, CHEN K, CORRADO G, et al. Efficient estimation of word representations in vector space[EB/OL]. [2021-08-10]. https://arxiv.org/pdf/1301.3781.pdf.
[23] 李晚莲, 田俊钦. 国际数字人文领域研究前沿探测与发展趋势分析: 基于词嵌入和主题建模技术[J]. 高校图书馆工作, 2021, 41(3): 22-28.
LI W L, TIAN J Q. Analysis on research frontier detection and development trend of international digital humanities: based on word embedding and topic modelling technology[J]. Library Work in Colleges and Universities, 2021, 41(3): 22-28. (in Chinese)
[24] 朱玉虎. 基于词嵌入模型的短文本主题发现研究[D]. 西安: 西安电子科技大学, 2020.
ZHU Y H. Topic discovery for short texts based on word embeddings[D]. Xi'an: Xidian University, 2020. (in Chinese)
[25] 韩亚楠, 刘建伟, 罗雄麟. 概率主题模型综述[J]. 计算机学报, 2021, 44(6): 1095-1139.
HAN Y N, LIU J W, LUO X L. A survey on probabilistic topic model[J]. Chinese Journal of Computers, 2021, 44(6): 1095-1139. (in Chinese)
[26] 彭俊利, 谷雨, 张震, 等. 融合单词贡献度与Word2Vec词向量的文档表示[J]. 计算机工程, 2021, 47(4): 62-67.
PENG J L, GU Y, ZHANG Z, et al. Document representation fused with term contribution and Word2Vec word vector[J]. Computer Engineering, 2021, 47(4): 62-67. (in Chinese)