References
[1]BLEI D M,NG A Y,JORDAN M I.Latent Dirichlet Allocation[J].Journal of Machine Learning Research,2003,3:993-1022.
[2]BLEI D M.Probabilistic Topic Models[J].Communications of the ACM,2012,55(4):77-84.
[3]WALLACH H M.Topic Modeling:Beyond Bag-of-words[C]//Proceedings of the 23rd International Conference on Machine Learning.New York,USA:ACM Press,2006:977-984.
[4]XIE Pengtao,YANG Diyi,XING E.Incorporating Word Correlation Knowledge into Topic Modeling[C]//Proceedings of 2015 Conference of the North American Chapter of the Association for Computational Linguistics:Human Language Technologies.[S.l.]:Association for Computational Linguistics,2015:725-734.
[5]BENGIO Y,DUCHARME R,VINCENT P,et al.A Neural Probabilistic Language Model[J].Journal of Machine Learning Research,2003,3:1137-1155.
[6]TURIAN J,RATINOV L,BENGIO Y.Word Representations:A Simple and General Method for Semi-supervised Learning[C]//Proceedings of the 48th Annual Meeting of the Association for Computational Linguistics.[S.l.]:Association for Computational Linguistics,2010:384-394.
[7]CHEN Enhong,QIU Siyu,XU Chang,et al.Word Embedding:Continuous Space Representation of Natural Language[J].Journal of Data Acquisition and Processing,2014,29(1):19-29.
[8]COLLOBERT R,WESTON J.A Unified Architecture for Natural Language Processing:Deep Neural Networks with Multitask Learning[C]//Proceedings of the 25th International Conference on Machine Learning.Helsinki,Finland:[s.n.],2008:160-167.
[9]MIKOLOV T,CHEN Kai,CORRADO G,et al.Efficient Estimation of Word Representations in Vector Space[EB/OL].(2013-09-07).https://arxiv.org/pdf/1301.3781v3.pdf.
[10]PENNINGTON J,SOCHER R,MANNING C.GloVe:Global Vectors for Word Representation[C]//Proceedings of Conference on Empirical Methods in Natural Language Processing.[S.l.]:Association for Computational Linguistics,2014:1532-1543.
[11]SALAKHUTDINOV R,HINTON G E.Replicated Softmax:An Undirected Topic Model[M]//BENGIO Y,SCHUURMANS D,LAFFERTY J D.Advances in Neural Information Processing Systems 22.[S.l.]:Neural Information Processing Systems Foundation,Inc.,2009:1607-1614.
[12]SRIVASTAVA N,SALAKHUTDINOV R R,HINTON G E.Modeling Documents with Deep Boltzmann Machines[C]//Proceedings of the 29th Conference on Uncertainty in Artificial Intelligence.Bellevue,USA:Association for Uncertainty in Artificial Intelligence,2013:616-624.
[13]CAO Ziqiang,LI Sujian,LIU Yang,et al.A Novel Neural Topic Model and Its Supervised Extension[C]//Proceedings of the 29th AAAI Conference on Artificial Intelligence.Austin,USA:AAAI,2015:2210-2216.
[14]LIU Yang,LIU Zhiyuan,CHUA T S,et al.Topical Word Embeddings[C]//Proceedings of the 29th AAAI Conference on Artificial Intelligence.Austin,USA:AAAI,2015:2418-2424.
[15]NGUYEN D Q,BILLINGSLEY R,DU Lan,et al.Improving Topic Models with Latent Feature Word Representations[J].Transactions of the Association for Computational Linguistics,2015,3:299-313.
[16]MIKOLOV T,SUTSKEVER I,CHEN K,et al.Distributed Representations of Words and Phrases and Their Compositionality[M]//BURGES C J C,BOTTOU L,WELLING M.Advances in Neural Information Processing Systems 26.[S.l.]:Neural Information Processing Systems Foundation,Inc.,2013:3111-3119.
[17]NIU Liqiang,DAI Xinyu,ZHANG Jianbing.Topic2Vec:Learning Distributed Representations of Topics[EB/OL].[2016-09-10].https://arxiv.org/pdf/1506.08422v1.pdf.
[18]FU Xianghua,YANG Kun,HUANG J Z,et al.Dynamic Non-parametric Joint Sentiment Topic Mixture Model[J].Knowledge-Based Systems,2015,82(C):102-114.
Editor: 金胡考