Entity Linking via Symmetrical Attention-Based Neural Network and Entity Structural Features

In the process of knowledge graph construction, entity linking is a pivotal step that maps mentions in text to entries in a knowledge base. Existing models utilize only individual information to represent latent features and ignore the correlation between entities and their mentions. Moreover, during entity feature extraction, only partial latent features, i.e., context features, are leveraged, while the pivotal entity structural features are ignored. In this paper, we propose SA-ESF, which leverages a symmetrical Bi-LSTM neural network with a double attention mechanism to calculate the correlation between mentions and entities in two aspects: (1) entity embeddings and mention context features; (2) mention embeddings and entity description features. Furthermore, context features, structural features, and an entity ID feature are jointly integrated to represent entity embeddings. Finally, we combine (1) the similarity score between each mention and its candidate entities and (2) the prior probability to compute the final ranking. Experimental results on nine benchmark datasets validate the performance of SA-ESF, which achieves an average F1 score of 0.866.
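The abstract describes a mirrored ("symmetrical") scoring scheme: one attention-pooled Bi-LSTM encodes the mention context conditioned on the candidate entity embedding, while a mirror-image encoder encodes the entity description conditioned on the mention embedding, and the two resulting vectors are compared. The sketch below (PyTorch) illustrates that idea only; the module names, dimensions, the cosine similarity, and the linear combination with the prior probability are assumptions for illustration, not the paper's exact architecture.

```python
# Minimal sketch of the symmetric double-attention scoring idea from the
# abstract. Module names, dimensions, and the score combination are
# illustrative assumptions, not the authors' exact implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class AttentiveBiLSTM(nn.Module):
    """Bi-LSTM encoder pooled by attention conditioned on a query vector
    from the other side (entity -> context, mention -> description)."""
    def __init__(self, emb_dim: int, hidden_dim: int):
        super().__init__()
        self.lstm = nn.LSTM(emb_dim, hidden_dim,
                            batch_first=True, bidirectional=True)
        self.query_proj = nn.Linear(emb_dim, 2 * hidden_dim)

    def forward(self, seq: torch.Tensor, query: torch.Tensor) -> torch.Tensor:
        # seq: (batch, seq_len, emb_dim); query: (batch, emb_dim)
        h, _ = self.lstm(seq)                    # (batch, seq_len, 2*hidden)
        q = self.query_proj(query).unsqueeze(2)  # (batch, 2*hidden, 1)
        alpha = F.softmax(torch.bmm(h, q).squeeze(2), dim=1)  # attention weights
        return torch.bmm(alpha.unsqueeze(1), h).squeeze(1)    # (batch, 2*hidden)

class SymmetricScorer(nn.Module):
    """Two mirrored attentive encoders, one per 'aspect' in the abstract:
    (1) entity embedding attends over the mention context;
    (2) mention embedding attends over the entity description."""
    def __init__(self, emb_dim: int, hidden_dim: int):
        super().__init__()
        self.context_enc = AttentiveBiLSTM(emb_dim, hidden_dim)
        self.description_enc = AttentiveBiLSTM(emb_dim, hidden_dim)

    def forward(self, mention_emb, context, entity_emb, description):
        m_vec = self.context_enc(context, entity_emb)
        e_vec = self.description_enc(description, mention_emb)
        return F.cosine_similarity(m_vec, e_vec, dim=1)  # per-pair score

def final_score(similarity, prior, lam=0.5):
    # Hypothetical linear interpolation of the neural similarity score with
    # the mention-entity prior probability for candidate ranking.
    return lam * similarity + (1.0 - lam) * prior
```

In the paper, the entity embedding fed to the context encoder would itself integrate context, structural, and entity ID features; here it is taken as a single precomputed vector for brevity.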


Bibliographic Details
Main Authors: Shengze Hu, Zhen Tan, Weixin Zeng, Bin Ge, Weidong Xiao
Format: Article
Language: English
Published: MDPI AG 2019-04-01
Series: Symmetry
Subjects: symmetrical neural network, entity linking, entity structural features, prior probability, information integration
Online Access: https://www.mdpi.com/2073-8994/11/4/453
id doaj-01e4fc54293a4577b81e00efadf53299
record_format Article
spelling Entity Linking via Symmetrical Attention-Based Neural Network and Entity Structural Features / Shengze Hu, Zhen Tan, Weixin Zeng, Bin Ge, Weidong Xiao (Science and Technology on Information Systems Engineering Laboratory, National University of Defense Technology, Changsha 410073, China). Symmetry, vol. 11, no. 4, article 453, 2019-04-01. ISSN 2073-8994. DOI: 10.3390/sym11040453. MDPI AG. https://www.mdpi.com/2073-8994/11/4/453
collection DOAJ
language English
format Article
sources DOAJ
author Shengze Hu
Zhen Tan
Weixin Zeng
Bin Ge
Weidong Xiao
spellingShingle Shengze Hu
Zhen Tan
Weixin Zeng
Bin Ge
Weidong Xiao
Entity Linking via Symmetrical Attention-Based Neural Network and Entity Structural Features
Symmetry
symmetrical neural network
entity linking
entity structural features
prior probability
information integration
author_facet Shengze Hu
Zhen Tan
Weixin Zeng
Bin Ge
Weidong Xiao
author_sort Shengze Hu
title Entity Linking via Symmetrical Attention-Based Neural Network and Entity Structural Features
title_short Entity Linking via Symmetrical Attention-Based Neural Network and Entity Structural Features
title_full Entity Linking via Symmetrical Attention-Based Neural Network and Entity Structural Features
title_fullStr Entity Linking via Symmetrical Attention-Based Neural Network and Entity Structural Features
title_full_unstemmed Entity Linking via Symmetrical Attention-Based Neural Network and Entity Structural Features
title_sort entity linking via symmetrical attention-based neural network and entity structural features
publisher MDPI AG
series Symmetry
issn 2073-8994
publishDate 2019-04-01
description In the process of knowledge graph construction, entity linking is a pivotal step that maps mentions in text to entries in a knowledge base. Existing models utilize only individual information to represent latent features and ignore the correlation between entities and their mentions. Moreover, during entity feature extraction, only partial latent features, i.e., context features, are leveraged, while the pivotal entity structural features are ignored. In this paper, we propose SA-ESF, which leverages a symmetrical Bi-LSTM neural network with a double attention mechanism to calculate the correlation between mentions and entities in two aspects: (1) entity embeddings and mention context features; (2) mention embeddings and entity description features. Furthermore, context features, structural features, and an entity ID feature are jointly integrated to represent entity embeddings. Finally, we combine (1) the similarity score between each mention and its candidate entities and (2) the prior probability to compute the final ranking. Experimental results on nine benchmark datasets validate the performance of SA-ESF, which achieves an average F1 score of 0.866.
topic symmetrical neural network
entity linking
entity structural features
prior probability
information integration
url https://www.mdpi.com/2073-8994/11/4/453
work_keys_str_mv AT shengzehu entitylinkingviasymmetricalattentionbasedneuralnetworkandentitystructuralfeatures
AT zhentan entitylinkingviasymmetricalattentionbasedneuralnetworkandentitystructuralfeatures
AT weixinzeng entitylinkingviasymmetricalattentionbasedneuralnetworkandentitystructuralfeatures
AT binge entitylinkingviasymmetricalattentionbasedneuralnetworkandentitystructuralfeatures
AT weidongxiao entitylinkingviasymmetricalattentionbasedneuralnetworkandentitystructuralfeatures
_version_ 1716793995092295680