
Computer Engineering ›› 2020, Vol. 46 ›› Issue (6): 296-302. doi: 10.19678/j.issn.1000-3428.0054431

• Development Research and Engineering Application •

  • About the authors: ZHANG Zhichang (born 1976), male, professor, Ph.D.; his main research interests are medical text processing and question answering. ZHOU Tong, ZHANG Ruifang, and ZHANG Minyu are M.S. candidates.
  • Supported by:
    National Natural Science Foundation of China (61762081, 61662067, 61662068); Key Research and Development Program of Gansu Province (2017GS10781).

Medical Entity Relation Recognition Combining Bidirectional GRU and Attention

ZHANG Zhichang, ZHOU Tong, ZHANG Ruifang, ZHANG Minyu   

  1. School of Computer Science and Engineering, Northwest Normal University, Lanzhou 730070, China
  • Received:2019-03-28 Revised:2019-06-03 Published:2019-07-12


Abstract: Most existing methods for entity relation recognition take a single sentence as the processing unit, fail to address mislabeled entity relations in the training corpus, and do not make full use of the mutual reinforcement among the multiple sentences that mention an entity pair when classifying its relation. Therefore, this paper proposes a recognition method for entity relations in Chinese electronic medical records that combines a bidirectional Gated Recurrent Unit (GRU) with a dual attention mechanism. The proposed BiGRU-Dual Attention model uses a bidirectional GRU to learn the context of each character, obtaining finer-grained features. A character-level attention mechanism is then introduced to raise the weights of the characters that are decisive for relation recognition, while a sentence-level attention mechanism captures performance-enhancing features from multiple sentences, reducing the influence of mislabeled sentences on classification. Experimental results show that, compared with the BiLSTM-Attention model, the proposed model improves the F1 score by 3.97% to reach 82.17%.
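The dual attention described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the tanh scoring function and the learned query vectors `w` and `r` are assumptions, and the BiGRU hidden states `H` are taken as given rather than computed by an actual recurrent network.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def char_attention(H, w):
    """Character-level attention over BiGRU outputs of one sentence.

    H: (seq_len, hidden) per-character hidden states from the BiGRU
    w: (hidden,) learned query vector (assumed form)
    Returns a single sentence vector of shape (hidden,).
    """
    alpha = softmax(np.tanh(H) @ w)   # weight per character, sums to 1
    return alpha @ H                  # weighted sum of hidden states

def sentence_attention(S, r):
    """Sentence-level attention over all sentences mentioning an entity pair.

    S: (num_sent, hidden) sentence vectors from char_attention
    r: (hidden,) learned relation query vector (assumed form)
    Returns a bag-level vector of shape (hidden,); mislabeled sentences
    are expected to receive low weights.
    """
    beta = softmax(np.tanh(S) @ r)
    return beta @ S
```

The bag-level vector would then be fed to a softmax classifier over the relation labels; down-weighting rather than discarding sentences is what lets the model tolerate labeling noise in the corpus.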

Key words: Chinese Electronic Medical Records (EMR), medical entity relation extraction, bidirectional Gated Recurrent Unit (GRU), dual attention mechanism, deep learning
