Embedding Learning with Triple Trustiness on Noisy Knowledge Graph
Embedding learning on knowledge graphs (KGs) aims to encode all entities and relationships into a continuous vector space, which provides an effective and flexible method for implementing downstream knowledge-driven artificial intelligence (AI) and natural language processing (NLP) tasks. Since KG construction usually relies on automatic mechanisms with little human supervision, it inevitably introduces a considerable amount of noise into KGs. However, most conventional KG embedding approaches assume that all facts in existing KGs are completely correct and ignore the noise issue, which can lead to potentially serious errors. To address this issue, in this paper we propose a novel approach to learn embeddings with triple trustiness on KGs, which takes possible noise into consideration. Specifically, we calculate the trustiness value of triples according to the rich and relatively reliable information from large amounts of entity type instances and entity descriptions in KGs. In addition, we present a cross-entropy-based loss function for model optimization. In experiments, we evaluate our models on KG noise detection, KG completion, and classification. Through extensive experiments on three datasets, we demonstrate that our proposed model learns better embeddings than all baselines on noisy KGs.
| Main Authors: | Yu Zhao, Huali Feng, Patrick Gallinari |
|---|---|
| Author Affiliations: | Yu Zhao and Huali Feng: Financial Intelligence and Financial Engineering Key Laboratory of Sichuan Province, School of Economic Information Engineering, Southwestern University of Finance and Economics, Chengdu 611130, China. Patrick Gallinari: Laboratoire d'Informatique de Paris 6 (LIP6), Université Pierre et Marie Curie, 75252 Paris, France |
| Format: | Article |
| Language: | English |
| Published: | MDPI AG, 2019-11-01 |
| Series: | Entropy, vol. 21, no. 11, article 1083 |
| ISSN: | 1099-4300 |
| DOI: | 10.3390/e21111083 |
| Subjects: | knowledge graph; embedding learning; cross entropy; noise detection; triple trustiness |
| Online Access: | https://www.mdpi.com/1099-4300/21/11/1083 |
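The abstract describes two concrete ingredients: a per-triple trustiness value computed from entity type instances and entity descriptions, and a cross-entropy-based loss for optimization. As a rough illustration of that general idea (not the paper's actual model or released code), the following minimal PyTorch sketch weights a binary cross-entropy loss over TransE-style triple scores by a precomputed trustiness value; the class and variable names (`TrustWeightedTransE`, `trust`, etc.) are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TrustWeightedTransE(nn.Module):
    """TransE-style scorer with a trustiness-weighted binary cross-entropy loss."""
    def __init__(self, n_entities, n_relations, dim=100):
        super().__init__()
        self.ent = nn.Embedding(n_entities, dim)
        self.rel = nn.Embedding(n_relations, dim)
        nn.init.xavier_uniform_(self.ent.weight)
        nn.init.xavier_uniform_(self.rel.weight)

    def score(self, h, r, t):
        # TransE plausibility: a small ||h + r - t|| means "plausible",
        # so the negated distance acts as a logit-like score.
        return -torch.norm(self.ent(h) + self.rel(r) - self.ent(t), p=2, dim=-1)

    def loss(self, h, r, t, label, trust):
        # label: 1.0 for observed triples, 0.0 for sampled negatives.
        # trust: per-triple trustiness weight in [0, 1]; negatives can use 1.0.
        logits = self.score(h, r, t)
        bce = F.binary_cross_entropy_with_logits(logits, label, reduction="none")
        return (trust * bce).mean()


# Toy usage with random ids; real training would sample negatives per positive.
model = TrustWeightedTransE(n_entities=1000, n_relations=50, dim=64)
h = torch.randint(0, 1000, (8,))
r = torch.randint(0, 50, (8,))
t = torch.randint(0, 1000, (8,))
label = torch.ones(8)   # pretend all eight are observed triples
trust = torch.rand(8)   # placeholder trustiness values
print(model.loss(h, r, t, label, trust))
```

In such a setup, observed triples with a low trustiness value contribute less to the gradient, which matches the intuition the abstract gives for trustiness-aware training; how the trustiness values themselves are derived from entity types and descriptions is specific to the paper and is not reproduced here.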