[1]RUSH A M,CHOPRA S,WESTON J,et al.A neural attention model for abstractive sentence summarization[C]//Proceedings of Conference on Empirical Methods in Natural Language Processing.Berlin,Germany:Springer,2015:379-389.
[2]CHOPRA S,AULI M,RUSH A M,et al.Abstractive sentence summarization with attentive recurrent neural networks[C]//Proceedings of Conference on North American Chapter of the Association for Computational Linguistics.New York,USA:ACM Press,2016:93-98.
[3]YIN Wenpeng,PEI Yulong.Optimizing sentence modeling and selection for document summarization[C]//Proceedings of International Joint Conference on Artificial Intelligence.Washington D.C.,USA:IEEE Press,2015:1383-1389.
[4]TAN Ming,SANTOS C N,XIANG Bing,et al.Improved representation learning for question answer matching[C]//Proceedings of Meeting of the Association for Computational Linguistics.Washington D.C.,USA:IEEE Press,2016:464-473.
[5]WANG Di,NYBERG E.A long short-term memory model for answer sentence selection in question answering[C]//Proceedings of Meeting of the Association for Computational Linguistics.Washington D.C.,USA:IEEE Press,2015:707-712.
[6]SUTSKEVER I,VINYALS O,LE Q V,et al.Sequence to sequence learning with neural networks[C]//Proceedings of Advances in Neural Information Processing Systems.Berlin,Germany:Springer,2014:3104-3112.
[7]BAHDANAU D,CHO K,BENGIO Y,et al.Neural machine translation by jointly learning to align and translate[EB/OL].[2017-09-18].http://arxiv.org/abs/1409.0473.
[8]YIH W,CHANG M,MEEK C,et al.Question answering using enhanced lexical semantic models[C]//Proceedings of Meeting of the Association for Computational Linguistics.Berlin,Germany:Springer,2013:1744-1753.
[9]NARASIMHAN K,BARZILAY R.Machine comprehension with discourse relations[C]//Proceedings of the Association for Computational Linguistics.Berlin,Germany:Springer,2015:1253-1262.
[10]SACHAN M,DUBEY A,XING E P,et al.Learning answer-entailing structures for machine comprehension[C]//Proceedings of Meeting of the Association for Computational Linguistics.Berlin,Germany:Springer,2015:239-249.
[11]GUO Shaoru,ZHANG Hu,QIAN Yili,et al.Sentence semantic relevance for college entrance examination reading comprehension[J].Journal of Tsinghua University (Science and Technology),2017,57(6):575-579.
[12]YU L,HERMANN K M,BLUNSOM P,et al.Deep learning for answer sentence selection[EB/OL].[2017-09-18].http://arxiv.org/abs/1412.1632.
[13]RICHARDSON M,BURGES C J C,RENSHAW E.MCTest:a challenge dataset for the open-domain machine comprehension of text[C]//Proceedings of Conference on Empirical Methods in Natural Language Processing.Berlin,Germany:Springer,2013:193-203.
[14]HERMANN K M,KOCISKY T,GREFENSTETTE E,et al.Teaching machines to read and comprehend[C]//Proceedings of Conference on Neural Information Processing Systems.New York,USA:ACM Press,2015:1693-1701.
[15]HILL F,BORDES A,CHOPRA S,et al.The goldilocks principle:reading children's books with explicit memory representations[EB/OL].[2017-09-18].https://arxiv.org/abs/1511.02301.
[16]CUI Yiming,LIU Ting,CHEN Zhipeng,et al.Consensus attention-based neural networks for Chinese reading comprehension[C]//Proceedings of International Conference on Computational Linguistics.Berlin,Germany:Springer,2016:1777-1786.
[17]HU Baotian,CHEN Qingcai,ZHU Fangze,et al.LCSTS:a large scale Chinese short text summarization dataset[EB/OL].[2017-09-18].https://arxiv.org/abs/1506.05865.
[18]MIKOLOV T,CHEN Kai,CORRADO G,et al.Efficient estimation of word representations in vector space[EB/OL].[2017-09-18].https://arxiv.org/abs/1301.3781.
[19]FENG Minwei,XIANG Bing,GLASS M R,et al.Applying deep learning to answer selection:a study and an open task[C]//Proceedings of IEEE Automatic Speech Recognition and Understanding Workshop.Washington D.C.,USA:IEEE Press,2015:813-820.