
Computer Engineering ›› 2020, Vol. 46 ›› Issue (7): 91-97. doi: 10.19678/j.issn.1000-3428.0054953

• Artificial Intelligence and Pattern Recognition •

An Attention-enhanced Natural Language Reasoning Model

LI Guanyu a,b, ZHANG Pengfei a,b, JIA Caiyan a,b

  1. a. School of Computer and Information Technology; b. Beijing Key Lab of Traffic Data Analysis and Mining, Beijing Jiaotong University, Beijing 100044, China
  • Received: 2019-05-20  Revised: 2019-08-09  Published: 2019-08-20

  • About the authors: LI Guanyu (b. 1993), male, M.S. candidate; his research interests include natural language processing and machine learning. ZHANG Pengfei, M.S. candidate. JIA Caiyan, professor, Ph.D.
  • Funding: National Natural Science Foundation of China (61876016); Fundamental Research Funds for the Central Universities (2017JBM023).

Abstract: In natural language processing tasks, the attention mechanism can be used to measure the importance of each word. On this basis, this paper proposes an attention-enhanced natural language reasoning model, aESIM. The model adds a word attention layer and an adaptive direction weight layer to the bidirectional LSTM network of the ESIM model, so as to learn word and sentence representations more effectively and to improve the modeling of local inference between premise and hypothesis texts. Experimental results on the SNLI, MultiNLI, and Quora datasets show that, compared with the ESIM, HBMP, and SSE models, aESIM improves accuracy by 0.5% to 1%.
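The abstract gives no implementation details, but the word attention layer it describes is a standard construction: score each BiLSTM hidden state with a learned vector, normalize the scores with a softmax, and pool the states by the resulting weights. A minimal NumPy sketch under that assumption (the function name `word_attention` and the scoring vector `w` are illustrative, not from the paper):

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def word_attention(H, w):
    """One plausible form of a word attention layer.

    H : (T, d) array of BiLSTM hidden states, one row per word.
    w : (d,) learned scoring vector.
    Returns (alpha, s): per-word weights (T,) summing to 1,
    and the attention-weighted sentence vector s of shape (d,).
    """
    scores = H @ w          # unnormalized importance of each word
    alpha = softmax(scores) # word importance weights
    s = alpha @ H           # weighted sum of hidden states
    return alpha, s

rng = np.random.default_rng(0)
H = rng.standard_normal((5, 8))  # 5 words, hidden size 8
w = rng.standard_normal(8)
alpha, s = word_attention(H, w)
```

In the full model such a pooled or re-weighted representation would feed the local-inference stage between premise and hypothesis; the adaptive direction weight layer, per the abstract, additionally re-weights the forward and backward LSTM directions, a detail not reproduced here.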

Key words: natural language processing, natural language reasoning, ESIM model, attention mechanism, bidirectional LSTM network

