
Computer Engineering, 2022, Vol. 48, Issue (7): 66-72. doi: 10.19678/j.issn.1000-3428.0061432

• Artificial Intelligence and Pattern Recognition •

Chinese Named Entity Recognition Model Based on Transformer Encoder

SI Yichen, GUAN Youqing

  1. School of Internet of Things, Nanjing University of Posts and Telecommunications, Nanjing 210003, China
  • Received: 2021-04-25  Revised: 2021-08-13  Online: 2022-07-15  Published: 2021-08-19
  • About the authors: SI Yichen (born 1996), male, master's student; his main research interest is natural language processing. GUAN Youqing, associate professor, master's degree.
  • Funding:
    Natural Science Research Project of Jiangsu Higher Education Institutions (05KJD520146).

Abstract: Named Entity Recognition (NER) is an important task in Natural Language Processing (NLP), and Chinese NER is generally more difficult than its English counterpart. Traditional Chinese entity recognition models typically use deep neural networks to assign a label to every character in the text and then identify named entities from the resulting label sequence; however, such character-based sequence labeling makes it difficult to capture word-level information. To address this problem, this paper proposes a Chinese NER model based on the Transformer encoder. In the character embedding layer, a lexicon-enhanced character-vector encoding method is used so that each character vector carries word information. In addition, to solve the problem that the Transformer encoder loses the relative position information of characters during attention computation, the attention calculation of the Transformer encoder is modified and a relative position encoding method is introduced. Finally, a Conditional Random Field (CRF) model is used to obtain the optimal tag sequence. The experimental results show that the model achieves F1 scores of 94.7% on the Resume dataset and 58.2% on the Weibo Chinese NER dataset, improving on NER models based on a Bidirectional Long Short-Term Memory (BiLSTM) network and an Iterated Dilated Convolutional Neural Network (ID-CNN), with better recognition performance and faster convergence.
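The page gives only a high-level description of the modified attention computation. As an illustration, the following is a minimal, self-contained PyTorch sketch of self-attention with learnable relative position embeddings in the Transformer-XL/TENER style, one common way to realize the relative position encoding the abstract names. It is not the paper's verified implementation; the class, parameter, and dimension names are illustrative assumptions.

    # Minimal sketch: self-attention with relative position embeddings
    # (Transformer-XL/TENER style). Illustrative only; names and sizes
    # are assumptions, not this paper's implementation.
    import torch
    import torch.nn as nn

    class RelativeSelfAttention(nn.Module):
        def __init__(self, d_model: int, max_len: int = 512):
            super().__init__()
            self.q_proj = nn.Linear(d_model, d_model)
            self.k_proj = nn.Linear(d_model, d_model)
            self.v_proj = nn.Linear(d_model, d_model)
            # One learnable vector per relative offset in [-(max_len-1), max_len-1].
            self.rel_emb = nn.Embedding(2 * max_len - 1, d_model)
            self.u_bias = nn.Parameter(torch.zeros(d_model))  # global content bias
            self.v_bias = nn.Parameter(torch.zeros(d_model))  # global position bias
            self.max_len = max_len
            self.scale = d_model ** -0.5

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # x: (batch, seq_len, d_model); seq_len must not exceed max_len.
            bsz, seq_len, _ = x.shape
            q, k, v = self.q_proj(x), self.k_proj(x), self.v_proj(x)
            pos = torch.arange(seq_len, device=x.device)
            # Relative offsets i - j, shifted so all embedding indices are non-negative.
            rel = self.rel_emb(pos[:, None] - pos[None, :] + self.max_len - 1)
            # Content-content and content-position score terms, each with a global bias.
            ac = torch.einsum("bid,bjd->bij", q + self.u_bias, k)
            bd = torch.einsum("bid,ijd->bij", q + self.v_bias, rel)
            attn = torch.softmax((ac + bd) * self.scale, dim=-1)
            return attn @ v  # (batch, seq_len, d_model)

    x = torch.randn(2, 10, 64)
    print(RelativeSelfAttention(64)(x).shape)  # torch.Size([2, 10, 64])

Because the score depends on the offset i - j rather than on absolute positions alone, character order information survives the attention step, which is the property the abstract says the unmodified Transformer encoder loses.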
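The final CRF layer is standard: it scores a candidate tag sequence by summing the encoder's per-character emission scores and learned tag-transition scores, and decodes the highest-scoring sequence with the Viterbi algorithm. In the usual formulation (general CRF background, not a detail taken from this page):

    P(y \mid x) = \frac{\exp\!\Big(\sum_{i=1}^{n}\big(E_{i,y_i} + T_{y_{i-1},y_i}\big)\Big)}
                       {\sum_{y'} \exp\!\Big(\sum_{i=1}^{n}\big(E_{i,y'_i} + T_{y'_{i-1},y'_i}\big)\Big)},
    \qquad \hat{y} = \arg\max_{y} P(y \mid x)

where E_{i,t} is the emission score of tag t at position i (here produced by the Transformer encoder) and T is the tag-transition matrix learned by the CRF; \hat{y} is the optimal tag sequence the abstract refers to.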

Key words: Natural Language Processing (NLP), Chinese Named Entity Recognition (NER), Transformer encoder, Conditional Random Field (CRF), relative position encoding

CLC Number: