Adversarial Knowledge Representation Learning Without External Model

Bibliographic Details
Main Authors: Jingpei Lei, Dantong Ouyang, Ying Liu
Format: Article
Language: English
Published: IEEE, 2019-01-01
Series: IEEE Access
Subjects: knowledge graph; knowledge representation learning; link prediction; adversarial learning; negative sampling
Online Access: https://ieeexplore.ieee.org/document/8599182/
id doaj-c079657900344aefb24ee3a81f6a185f
record_format Article
spelling doaj-c079657900344aefb24ee3a81f6a185f (record last updated 2021-03-29T22:12:40Z)
Title: Adversarial Knowledge Representation Learning Without External Model
Citation: IEEE Access, vol. 7, pp. 3512-3524, 2019-01-01. DOI: 10.1109/ACCESS.2018.2889481 (IEEE article 8599182)
Authors: Jingpei Lei (ORCID: 0000-0001-7788-2329), Dantong Ouyang, Ying Liu
Affiliation: College of Computer Science and Technology, Jilin University, Changchun, China
collection DOAJ
issn 2169-3536
description Knowledge representation learning, which embeds the entities and relations of a knowledge graph into low-dimensional vectors, is effective for predicting missing facts. Knowledge graph datasets store only positive triplets; nevertheless, negative cases are equally crucial in knowledge representation learning. Conventionally, corrupted triplets are generated uniformly as negative cases, but these corrupted triplets are in fact heterogeneous: the majority are trivial and have limited influence on learning. Given the large number of corrupted-triplet candidates, training the model with uniformly generated corrupted triplets is inefficient. Generative adversarial network (GAN)-inspired approaches have been proposed to avoid easily discriminated negative training examples, enabling faster and better convergence of the embedding models; however, these approaches require pre-trained external sampling models. In this paper, we introduce a simple but strong negative sampling approach for adversarial knowledge representation learning, named the loss-adaptive sampling mechanism, which is efficient without an external sampling model. Furthermore, false negative cases tend to be over-trained when efficient negative sampling approaches are used. We propose a push-up mechanism and verify whether it is feasible to alleviate these over-trained false negative cases. The experimental results show that our adversarial knowledge representation learning approach outperforms the GAN-based sampling method KBGAN.
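The abstract contrasts conventional uniform corruption with a sampler that adapts to the model's current losses. The sketch below is illustrative only and does not reproduce the paper's exact mechanism: the toy graph, the `toy_score` stand-in scorer, the tail-only candidate set, and the softmax weighting over candidate scores are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy knowledge graph: (head, relation, tail) index triplets over 5 entities.
n_entities = 5
positive = (0, 0, 1)  # a known true triplet

def uniform_negative(triplet, n_entities, rng):
    """Conventional negative sampling: corrupt head or tail uniformly at random."""
    h, r, t = triplet
    if rng.random() < 0.5:
        h = int(rng.integers(n_entities))
    else:
        t = int(rng.integers(n_entities))
    return (h, r, t)

def loss_adaptive_negative(triplet, n_entities, score_fn, rng):
    """Adaptive sampling sketch: enumerate tail corruptions, then draw one
    with probability proportional to how hard (high-scoring) the candidate
    currently is for the model, so trivial negatives are rarely chosen."""
    h, r, t = triplet
    candidates = [(h, r, c) for c in range(n_entities) if c != t]
    scores = np.array([score_fn(c) for c in candidates])
    probs = np.exp(scores - scores.max())  # softmax over candidate scores
    probs /= probs.sum()
    idx = rng.choice(len(candidates), p=probs)
    return candidates[idx]

# Hypothetical scorer standing in for the embedding model's current score.
def toy_score(triplet):
    h, r, t = triplet
    return float((h + t) % 3)

neg_u = uniform_negative(positive, n_entities, rng)
neg_a = loss_adaptive_negative(positive, n_entities, toy_score, rng)
print("uniform:", neg_u, "adaptive:", neg_a)
```

With uniform sampling, most draws replace the tail with an entity the model already scores low, contributing little gradient; the adaptive draw concentrates on high-scoring corruptions, which is the behavior the abstract attributes to GAN-inspired samplers but achieved here without a separate generator model.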
topic knowledge graph
knowledge representation learning
link prediction
adversarial learning
negative sampling
url https://ieeexplore.ieee.org/document/8599182/