[1] KANG Longbiao,HU Baotian,WU Xiangping,et al.A short texts matching method using shallow features and deep features[C]//Proceedings of Natural Language Processing and Chinese Computing.Berlin,Germany:Springer,2014:150-159.
[2] WANG Hao,LU Zhengdong,LI Hang,et al.A dataset for research on short-text conversations[C]//Proceedings of the Conference on Empirical Methods in Natural Language Processing.Stroudsburg,USA:Association for Computational Linguistics,2013:935-945.
[3] KENTER T,RIJKE M D.Short text similarity with word embeddings[C]//Proceedings of the ACM International Conference on Information and Knowledge Management.New York,USA:ACM Press,2015:1411-1420.
[4] JIJKOUN V,RIJKE M D.Recognizing textual entailment using lexical similarity[EB/OL].[2018-07-01].http://u.cs.biu.ac.il/~nlp/RTE1/Proceedings/jijkoun_and_de_rijke.pdf.
[5] ISLAM A,INKPEN D.Semantic text similarity using corpus-based word similarity and string similarity[J].ACM Transactions on Knowledge Discovery from Data,2008,2(2):1-25.
[6] KUSNER M J,SUN Yu,KOLKIN N I,et al.From word embeddings to document distances[C]//Proceedings of the 32nd International Conference on Machine Learning.New York,USA:ACM Press,2015:957-966.
[7] RUMELHART D E,HINTON G E,WILLIAMS R J.Learning representations by back-propagating errors[J].Nature,1986,323(6088):533-536.
[8] MIKOLOV T,CHEN Kai,CORRADO G,et al.Efficient estimation of word representations in vector space[EB/OL].[2018-07-01].https://arxiv.org/abs/1301.3781.
[9] MIKOLOV T,SUTSKEVER I,CHEN Kai,et al.Distributed representations of words and phrases and their compositionality[C]//Proceedings of the 26th International Conference on Neural Information Processing Systems.New York,USA:ACM Press,2013:3111-3119.
[10] BARONI M,DINU G,KRUSZEWSKI G.Don't count,predict!A systematic comparison of context-counting vs.context-predicting semantic vectors[C]//Proceedings of the Meeting of the Association for Computational Linguistics.Stroudsburg,USA:Association for Computational Linguistics,2014:238-247.
[11] HUANG Jiangping,JI Donghong.Paraphrase identification based on sentence semantic distance[J].Journal of Sichuan University(Engineering Science Edition),2016,48(6):202-207.
[12] GAO Yunlong,ZUO Wanli,WANG Ying,et al.Sentence classification model based on sparse self-taught convolutional neural network[J].Journal of Computer Research and Development,2018,55(1):179-187.
[13] ZHANG Qi,PENG Zhiping.Reader emotion prediction fusing attention mechanism and CNN-GRNN model[J].Computer Engineering and Applications,2018,54(13):168-174.
[14] CHEN Kai,WANG Jiang,CHEN Liangchieh,et al.ABC-CNN:an attention based convolutional neural network for visual question answering[EB/OL].[2018-07-01].https://arxiv.org/pdf/1511.05960.pdf.
[15] XIAO Tianjun,XU Yichong,YANG Kuiyuan,et al.The application of two-level attention models in deep convolutional neural network for fine-grained image classification[C]//Proceedings of 2015 IEEE Conference on Computer Vision and Pattern Recognition.Washington D.C.,USA:IEEE Press,2015:842-850.
[16] XU Jun.Research on vision-based text generation methods[D].Hefei:University of Science and Technology of China,2018.
[17] LI Qingshan.Image segmentation and scene understanding based on attention selection mechanism[D].Shanghai:Shanghai Jiao Tong University,2012.
[18] CAO Chunshui,LIU Xianming,YANG Yi,et al.Look and think twice:capturing top-down visual attention with feedback convolutional neural networks[C]//Proceedings of 2015 IEEE International Conference on Computer Vision.Washington D.C.,USA:IEEE Press,2015:2956-2964.
[19] BAHDANAU D,CHO K,BENGIO Y.Neural machine translation by jointly learning to align and translate[EB/OL].[2018-07-01].https://arxiv.org/pdf/1409.0473v6.pdf.
[20] LIU Yupeng,MA Chunguang,ZHANG Yanan.A hierarchical machine translation model with deep recursion[J].Chinese Journal of Computers,2017,40(4):861-871.
[21] BOWMAN S R,POTTS C,MANNING C D.Recursive neural networks can learn logical semantics[EB/OL].[2018-07-01].https://arxiv.org/pdf/1406.1827.pdf.
[22] ROCKTÄSCHEL T,GREFENSTETTE E,HERMANN K M,et al.Reasoning about entailment with neural attention[EB/OL].[2018-07-01].https://arxiv.org/pdf/1509.06664.pdf.
[23] FENG Xingjie,ZHANG Zhiwei,SHI Jinchuan.Text sentiment analysis based on convolutional neural network and attention model[J].Application Research of Computers,2018,35(5):1434-1436.
[24] YIN Wenpeng,KANN K,YU Mo,et al.Comparative study of CNN and RNN for natural language processing[EB/OL].[2018-07-01].https://arxiv.org/pdf/1702.01923.pdf.
[25] KIM Y.Convolutional neural networks for sentence classification[EB/OL].[2018-07-01].https://arxiv.org/pdf/1408.5882.pdf.