[1] MIKOLOV T,CHEN K,CORRADO G,et al.Efficient estimation of word representations in vector space[EB/OL].[2018-07-10].https://arxiv.org/pdf/1301.3781.pdf.
[2] PENNINGTON J,SOCHER R,MANNING C.GloVe:global vectors for word representation[C]//Proceedings of Conference on Empirical Methods in Natural Language Processing.Stroudsburg,USA:Association for Computational Linguistics,2014:1532-1543.
[3] JOULIN A,GRAVE E,BOJANOWSKI P,et al.Bag of tricks for efficient text classification[C]//Proceedings of Conference of European Chapter of the Association for Computational Linguistics.Stroudsburg,USA:Association for Computational Linguistics,2017:427-431.
[4] WIETING J,BANSAL M,GIMPEL K,et al.From paraphrase database to compositional paraphrase model and back[J].Transactions of the Association for Computational Linguistics,2015,3:345-358.
[5] PETERS M E,NEUMANN M,IYYER M,et al.Deep contextualized word representations[EB/OL].[2018-07-10].https://arxiv.org/pdf/1802.05365.pdf.
[6] GANITKEVITCH J,VANDURME B,CALLISON-BURCH C.PPDB:the paraphrase database[C]//Proceedings of Conference of the North American Chapter of the Association for Computational Linguistics.Stroudsburg,USA:Association for Computational Linguistics,2013:758-764.
[7] LE Q,MIKOLOV T.Distributed representations of sentences and documents[C]//Proceedings of the 31st International Conference on Machine Learning.Cambridge,USA:MIT Press,2014:1188-1196.
[8] IYYER M,MANJUNATHA V,BOYD-GRABER J,et al.Deep unordered composition rivals syntactic methods for text classification[C]//Proceedings of International Joint Conference on Natural Language Processing.Stroudsburg,USA:Association for Computational Linguistics,2015:1681-1691.
[9] KIROS R,ZHU Yukun,SALAKHUTDINOV R,et al.Skip-thought vectors[C]//Proceedings of International Conference on Neural Information Processing Systems.Cambridge,USA:MIT Press,2015:3294-3302.
[10] TAI Kaisheng,SOCHER R,MANNING C D.Improved semantic representations from tree-structured long short-term memory networks[C]//Proceedings of International Joint Conference on Natural Language Processing.Stroudsburg,USA:Association for Computational Linguistics,2015:1556-1566.
[11] WIETING J,BANSAL M,GIMPEL K,et al.Towards universal paraphrastic sentence embeddings[EB/OL].[2018-07-10].https://arxiv.org/pdf/1511.08198.pdf.
[12] DUAN Xulei,ZHANG Yangsen,SUN Yizhuo.Research on sentence vector representation and similarity calculation of microblog texts[J].Computer Engineering,2017,43(5):143-148.
[13] ARORA S,LIANG Y,MA Tengyu.A simple but tough-to-beat baseline for sentence embeddings[EB/OL].[2018-07-10].https://openreview.net/pdf?id=SyK00v5xx.
[14] RÜCKLÉ A,EGER S,PEYRARD M,et al.Concatenated p-mean word embeddings as universal cross-lingual sentence representations[EB/OL].[2018-07-10].https://arxiv.org/pdf/1803.01400.pdf.
[15] LANG K.NewsWeeder:learning to filter netnews[C]//Proceedings of International Conference on Machine Learning.San Francisco,USA:Morgan Kaufmann Publishers Inc.,1995:331-339.
[16] 20 newsgroups[EB/OL].[2018-07-10].http://www.qwone.com/~jason/20Newsgroups.
[17] MAAS A L,DALY R E,PHAM P T,et al.Learning word vectors for sentiment analysis[C]//Proceedings of Meeting of the Association for Computational Linguistics.Stroudsburg,USA:Association for Computational Linguistics,2011:142-150.
[18] Large movie review dataset[EB/OL].[2018-07-10].http://ai.stanford.edu/~amaas/data/sentiment.
[19] GloVe:global vectors for word representation[EB/OL].[2018-07-10].https://nlp.stanford.edu/projects/glove.
[20] Wikimedia.English Wikipedia dump[EB/OL].[2018-07-10].http://dumps.wikimedia.org/enwiki/latest/enwiki-latest-pages-articles.xml.bz2.