[1]BURGES C J C.Towards the machine comprehension of text:an essay,MSR-TR-2013-125[R].Microsoft Research,2013.
[2]BORDES A,USUNIER N,CHOPRA S,et al.Large-scale simple question answering with memory networks[EB/OL].[2017-10-20].https://arxiv.org/pdf/1506.02075v1.pdf.
[3]FERRUCCI D A,BROWN E W,CHU-CARROLL J,et al.Building Watson:an overview of the DeepQA project[J].AI Magazine,2010,31(3):59-79.
[4]LI R,MA S H,ZHANG H,et al.Answer prediction for reading comprehension[J].Journal of Shanxi University(Natural Science Edition),2017,40(4):763-770.
[5]BERANT J,CHOU A,FROSTIG R,et al.Semantic parsing on Freebase from question-answer pairs[C]//Proceedings of the 2013 Conference on Empirical Methods in Natural Language Processing.Seattle,USA:ACL,2013:6-17.
[6]LIU F L,HAO W N,CHEN G,et al.Machine reading comprehension based on Bi-LSTM model with bilinear-function attention[J].Computer Science,2017,44(S1):92-96.
[7]CUI Y,CHEN Z,WEI S,et al.Attention-over-attention neural networks for reading comprehension[EB/OL].[2017-10-20].https://arxiv.org/pdf/1607.04423.pdf.
[8]KADLEC R,SCHMID M,BAJGAR O,et al.Text understanding with the attention sum reader network[C]//Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics.[S.l.]:ACL,2016:908-918.
[9]LIU T,CUI Y,YIN Q,et al.Generating and exploiting large-scale pseudo training data for zero pronoun resolution[C]//Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics.[S.l.]:ACL,2017.
[10]SHIN J,JIN K H,KIM Y.Adaptive support vector regression for UAV flight control[J].Neural Networks,2011,24(1):109-120.
[11]LIPTON Z C,BERKOWITZ J,ELKAN C.A critical review of recurrent neural networks for sequence learning[EB/OL].[2017-10-20].https://arxiv.org/pdf/1506.00019.pdf.
[12]MENG K,LIU M C,HU J.Query intent recognition model based on character-level recurrent network[J].Computer Engineering,2017,43(3):181-186.
[13]SUTSKEVER I,VINYALS O,LE Q V.Sequence to sequence learning with neural networks[C]//Proceedings of the Annual Conference on Neural Information Processing Systems.Montreal,Canada:[s.n.],2014:3104-3112.
[14]WANG D,NYBERG E.A long short-term memory model for answer sentence selection in question answering[C]//Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing.[S.l.]:ACL,2015:707-712.
[15]GHOSH S,VINYALS O,STROPE B,et al.Contextual LSTM(CLSTM) models for large scale NLP tasks[EB/OL].[2017-10-20].https://arxiv.org/pdf/1602.06291.pdf.
[16]LIN J C.On the conditions of discourse coherence[J].Journal of Xi'an Foreign Languages University,1987(2):1-7.
[17]LIU Q,LI S J.Word similarity computing based on HowNet[J].International Journal of Computational Linguistics and Chinese Language Processing,2002,7(2):59-76.
[18]WANG T,CHO K.Larger-context language modelling with recurrent neural network[C]//Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics.[S.l.]:ACL,2016:1319-1329.