Computer Engineering ›› 2020, Vol. 46 ›› Issue (4): 40-45, 52. doi: 10.19678/j.issn.1000-3428.0054272

• Artificial Intelligence and Pattern Recognition •

Chinese Named Entity Recognition Method Based on BERT Embedding

YANG Piao, DONG Wenyong

  1. School of Computer Science, Wuhan University, Wuhan 430072, China
  • Received: 2019-03-18; Revised: 2019-04-29; Online: 2020-04-15; Published: 2019-05-31
  • About the authors: YANG Piao (born 1995), male, M.S. candidate, whose research interests include natural language processing and deep learning; DONG Wenyong (corresponding author), professor, Ph.D.
  • Funding:
    National Natural Science Foundation of China (61672024); National Key R&D Program of China, key special project "Smart Grid Technology and Equipment" (2018YFB0904200).


Abstract: In neural network-based Chinese Named Entity Recognition (NER), the vectorized representation of characters is a key step, but traditional embedding methods map each character to a single vector and therefore cannot capture its polysemy. To address this problem, the BERT pretrained language model is embedded to build a BERT-BiGRU-CRF model for representing sentence features. BERT, with its bidirectional Transformer structure, enhances the semantic representation of characters by generating their semantic vectors dynamically from context. On this basis, the character vector sequence is fed into the BiGRU-CRF model for training, in one of two modes: training the whole model, or freezing BERT and training only the BiGRU-CRF part. Experimental results on the MSRA corpus show that the F1 scores of the two training modes reach 95.43% and 94.18% respectively, outperforming the BiGRU-CRF, Radical-BiLSTM-CRF and Lattice-LSTM-CRF models.
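The CRF layer described above selects the best tag sequence over the BiGRU's per-character emission scores via Viterbi decoding. A minimal pure-Python sketch of that decoding step follows; the tag set, emission scores, and transition scores are illustrative placeholders, not values from the paper.

```python
# Viterbi decoding for a linear-chain CRF output layer, as used on top
# of the BiGRU encoder. Emission and transition scores here are toy
# placeholders, not values learned by the paper's model.

def viterbi_decode(emissions, transitions, tags):
    """emissions: per-token dict {tag: score};
    transitions: dict {(prev_tag, tag): score};
    returns the highest-scoring tag sequence."""
    # initialize with the first token's emission scores
    scores = {t: emissions[0][t] for t in tags}
    backpointers = []
    for emission in emissions[1:]:
        new_scores, bp = {}, {}
        for t in tags:
            # best previous tag leading into current tag t
            best_prev = max(tags, key=lambda p: scores[p] + transitions[(p, t)])
            new_scores[t] = scores[best_prev] + transitions[(best_prev, t)] + emission[t]
            bp[t] = best_prev
        backpointers.append(bp)
        scores = new_scores
    # backtrack from the best final tag
    best_last = max(tags, key=lambda t: scores[t])
    path = [best_last]
    for bp in reversed(backpointers):
        path.append(bp[path[-1]])
    return list(reversed(path))

# Toy BIO tag set; the transition score penalizes "O" directly before "I-PER".
tags = ["B-PER", "I-PER", "O"]
transitions = {(p, t): 0.0 for p in tags for t in tags}
transitions[("O", "I-PER")] = -10.0
emissions = [
    {"B-PER": 2.0, "I-PER": 0.0, "O": 1.0},
    {"B-PER": 0.0, "I-PER": 1.5, "O": 1.0},
    {"B-PER": 0.0, "I-PER": 0.0, "O": 3.0},
]
print(viterbi_decode(emissions, transitions, tags))  # ['B-PER', 'I-PER', 'O']
```

In the full model the emission scores come from the BiGRU hidden states (projected to tag space) and the transition matrix is learned jointly; only the decoding logic is shown here.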

Key words: Chinese Named Entity Recognition (NER), BERT model, BiGRU model, pretrained language model, Conditional Random Field (CRF)



CLC number: