[1] ZHANG Zhichang,YAO Dongren,LIU Xie,et al.Textual entailment recognition fused with syntactic structure transformation and lexical semantic features[J].Computer Engineering,2015,41(9):199-204.(in Chinese)张志昌,姚东任,刘霞,等.融合句法结构变换与词汇语义特征的文本蕴涵识别[J].计算机工程,2015,41(9):199-204.
[2] LAN Wuwei,XU Wei.Neural network models for paraphrase identification,semantic textual similarity,natural language inference,and question answering[C]//Proceedings of the 27th International Conference on Computational Linguistics.Santa Fe,USA:[s.n.],2018:3890-3902.
[3] GUO Maosheng,ZHANG Yu,LIU Ting.Research advances and prospect of recognizing textual entailment and knowledge acquisition[J].Chinese Journal of Computers,2017,40(4):119-140.(in Chinese)郭茂盛,张宇,刘挺.文本蕴含关系识别与知识获取研究进展及展望[J].计算机学报,2017,40(4):119-140.
[4] BOWMAN S R,ANGELI G,POTTS C,et al.A large annotated corpus for learning natural language inference[C]//Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing.Lisbon,Portugal:[s.n.],2015:632-642.
[5] YANG Z,YANG D,DYER C,et al.Hierarchical attention networks for document classification[EB/OL].[2019-04-10].https://arxiv.org/abs/1707.00896v1.
[6] CHEN Qian,ZHU Xiaodan,LING Zhenhua,et al.Enhanced LSTM for natural language inference[C]//Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics.Vancouver,Canada:[s.n.],2017:1657-1668.
[7] NIE Y,BANSAL M.Shortcut-stacked sentence encoders for multi-domain inference[C]//Proceedings of the 2nd Workshop on Evaluating Vector Space Representations for NLP.Copenhagen,Denmark:[s.n.],2017:165-178.
[8] CONNEAU A,KIELA D,SCHWENK H,et al.Supervised learning of universal sentence representations from natural language inference data[EB/OL].[2019-04-10].https://arxiv.org/abs/1705.02364v5.
[9] TALMAN A,YLI-JYRA A,TIEDEMANN J.Natural language inference with hierarchical BiLSTM max pooling architecture[EB/OL].[2019-04-10].https://arxiv.org/abs/1808.08762v1.
[10] IM J,CHO S.Distance-based self-attention network for natural language inference[EB/OL].[2019-04-10].https://arxiv.org/abs/1712.02047v1.
[11] SHEN Tao,ZHOU Tianyi,LONG Guodong,et al.Reinforced self-attention network:a hybrid of hard and soft attention for sequence modeling[EB/OL].[2019-04-10].https://arxiv.org/abs/1801.10296.
[12] CHENG J,DONG L,LAPATA M.Long short-term memory-networks for machine reading[EB/OL].[2019-04-10].https://arxiv.org/abs/1601.06733.
[13] CHUNG J,GULCEHRE C,CHO K,et al.Empirical evaluation of gated recurrent neural networks on sequence modeling[EB/OL].[2019-04-10].https://arxiv.org/abs/1412.3555.
[14] PARIKH A P,TACKSTROM O,DAS D,et al.A decomposable attention model for natural language inference[EB/OL].[2019-04-10].https://arxiv.org/abs/1606.01933v2.
[15] WANG Z,HAMZA W,FLORIAN R.Bilateral multi-perspective matching for natural language sentences[C]//Proceedings of the 26th International Joint Conference on Artificial Intelligence.Melbourne,Australia:[s.n.],2017:4144-4150.
[16] KIM S,KANG I,KWAK N.Semantic sentence matching with densely-connected recurrent and co-attentive information[EB/OL].[2019-04-10].https://arxiv.org/abs/1805.11360.
[17] BAHDANAU D,CHO K,BENGIO Y.Neural machine translation by jointly learning to align and translate[EB/OL].[2019-04-10].https://arxiv.org/abs/1409.0473.
[18] HE H,LIN J J.Pairwise word interaction modeling with deep neural networks for semantic similarity measurement[C]//Proceedings of the 2016 Conference of the North American Chapter of the Association for Computational Linguistics:Human Language Technologies.San Diego,USA:[s.n.],2016:937-948.
[19] WILLIAMS A,NANGIA N,BOWMAN S R.A broad-coverage challenge corpus for sentence understanding through inference[EB/OL].[2019-04-10].https://arxiv.org/abs/1704.05426v2.
[20] PENNINGTON J,SOCHER R,MANNING C D.GloVe:global vectors for word representation[EB/OL].[2019-04-10].https://www.aclweb.org/anthology/D14-1162.