
Computer Engineering ›› 2022, Vol. 48 ›› Issue (8): 53-61. doi: 10.19678/j.issn.1000-3428.0062369

• Artificial Intelligence and Pattern Recognition •

Entity Relation Linking Based on Feature Joint and Multi-Attention

FU Lin1,2,3, LIU Zhao1,2, QIU Chen1,2,3,4, GAO Feng1,2,3,4   

  1. School of Computer Science and Technology, Wuhan University of Science and Technology, Wuhan 430065, China;
    2. Hubei Province Key Laboratory of Intelligent Information Processing and Real-time Industrial System, Wuhan 430065, China;
    3. Big Data Science and Engineering Research Institute, Wuhan University of Science and Technology, Wuhan 430065, China;
    4. Key Laboratory of Rich Media Digital Publishing Content Organization and Knowledge Service of Press and Publication Administration, Beijing 100083, China
  • Received: 2021-08-16; Revised: 2021-09-27; Published: 2021-10-11

  • About the authors: FU Lin (born 1997), female, M.S. candidate; her research interests include knowledge graphs, intelligent information processing, and the Semantic Web. LIU Zhao (corresponding author), professor; QIU Chen and GAO Feng, Ph.D.
  • Funding:
    National Natural Science Foundation of China, "Research on Key Technologies of Domain-Specific Knowledge Graph Construction and Application" (U1836118); Open Fund of the Key Laboratory of Rich Media Digital Publishing Content Organization and Knowledge Service of Press and Publication Administration (ZD2021-11/01).

Abstract: As core components of question answering over knowledge bases, entity linking and relation linking connect natural language questions to knowledge base information, but they are usually performed as two independent tasks, which ignores the mutual influence between the information generated during the linking process. Moreover, methods that compute the correlation between the question and each candidate entity or candidate relation separately overlook the internal connection between candidate entities and candidate relations. To address these problems, this study proposes a joint entity and relation linking method based on joint features and multi-attention. First, a neural network encodes the question, entity, relation, and entity-relation pair and learns vector representations for each of them. Next, an attention mechanism is added to obtain the weight information of the candidate entity and candidate relation within the question. Finally, the entity-relation pair vector is incorporated when computing the correlation between the entity (relation) vector and the question vector, so that the information contained in the entity-relation pair improves linking accuracy. Experimental results on the LC-QuAD and QALD-7 datasets show that the method improves linking accuracy by at least 1% over the recent Falcon method.
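
The following is a minimal, hypothetical sketch (in PyTorch) of the scoring idea described in the abstract: a question encoder, attention that summarizes the question with respect to each candidate, and an entity-relation pair vector folded into the correlation scores. The class and parameter names, the BiLSTM encoder, and the use of cosine similarity are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class JointLinkScorer(nn.Module):
    """Sketch of feature-joint, multi-attention scoring (illustrative only)."""
    def __init__(self, vocab_size, dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.q_enc = nn.LSTM(dim, dim, batch_first=True, bidirectional=True)
        self.proj = nn.Linear(2 * dim, dim)       # fold BiLSTM states back to dim
        self.pair_proj = nn.Linear(2 * dim, dim)  # entity-relation pair -> shared space

    def attend(self, q_states, key):
        # attention weights of one candidate (entity or relation) over question tokens
        scores = torch.matmul(q_states, key.unsqueeze(-1)).squeeze(-1)  # (B, L)
        alpha = F.softmax(scores, dim=-1)
        return torch.bmm(alpha.unsqueeze(1), q_states).squeeze(1)       # (B, dim)

    def forward(self, question_ids, ent_vec, rel_vec):
        # question_ids: (B, L); ent_vec, rel_vec: (B, dim) candidate embeddings
        q_states, _ = self.q_enc(self.embed(question_ids))
        q_states = self.proj(q_states)                                    # (B, L, dim)
        pair_vec = self.pair_proj(torch.cat([ent_vec, rel_vec], dim=-1))  # joint feature
        q_ent = self.attend(q_states, ent_vec)  # question summary focused on the entity
        q_rel = self.attend(q_states, rel_vec)  # question summary focused on the relation
        # correlation of each candidate with the question, plus the pair term
        ent_score = F.cosine_similarity(q_ent + pair_vec, ent_vec, dim=-1)
        rel_score = F.cosine_similarity(q_rel + pair_vec, rel_vec, dim=-1)
        return ent_score + rel_score  # rank candidate (entity, relation) pairs by this score

In use, one such score would be computed for every candidate entity-relation pair generated for a question, and the highest-scoring pair would be returned as the linking result.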

Key words: Knowledge Base Question Answering (KBQA), joint entity and relation linking, entity-relation pair, attention mechanism, knowledge graph
