Knowledge Graph Completion Based on Contrastive Learning and Language Model-Enhanced Embedding

A knowledge graph is a structured knowledge base comprising various types of knowledge or data units obtained through extraction and other processes; it describes and represents information such as entities, concepts, facts, and relationships. The limitations of Natural Language Processing (NLP) technology and the noise present in the texts of these knowledge units affect the accuracy of information extraction. Existing Knowledge Graph Completion (KGC) methods typically consider only structural information or only textual semantic information, disregarding the structural and textual semantics of the knowledge graph as a whole. Hence, a KGC model based on contrastive learning and language model-enhanced embedding is proposed. The input entities and relations are encoded with a pretrained language model to capture their textual semantic information, while the distance scoring function of a translation model captures the structural information of the knowledge graph. Two negative sampling methods are fused into contrastive-learning training to improve the model's ability to distinguish positive from negative samples. Experimental results show that, compared with the Bidirectional Encoder Representations from Transformers for Knowledge Graph completion (KG-BERT) model, the proposed model improves the Hits@10 metric, i.e., the average proportion of triples whose correct entity is ranked within the top 10, by 31% and 23% on the WN18RR and FB15K-237 datasets, respectively, demonstrating its superiority over similar models.
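The "distance scoring function of the translation model" mentioned in the abstract refers to the TransE family of scores, which treat a relation as a translation in embedding space: for a true triple (h, r, t), the head plus the relation should land near the tail. The abstract does not give the exact form used in the paper, so the following is only a minimal sketch of a generic L1 translation distance; the embedding values are made up for illustration.

```python
def translation_score(h, r, t):
    """TransE-style L1 distance ||h + r - t||: a true triple should
    satisfy h + r ~= t in embedding space, so smaller is more plausible."""
    return sum(abs(hv + rv - tv) for hv, rv, tv in zip(h, r, t))

# Toy 3-dimensional embeddings (illustrative values only)
h = [0.2, 0.5, -0.1]                        # head entity
r = [0.3, -0.2, 0.4]                        # relation
t_true = [hv + rv for hv, rv in zip(h, r)]  # tail the relation maps onto
t_false = [1.0, -1.0, 1.0]                  # unrelated tail entity

print(translation_score(h, r, t_true))   # 0.0: perfect translation
print(translation_score(h, r, t_false))  # larger distance, less plausible
```

In the paper's setting the embeddings would come from the pretrained language model rather than a lookup table; this sketch only illustrates the scoring geometry that supplies the structural signal.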

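The contrastive-learning component trains the encoder to pull the representations of a true triple together while pushing negative samples away. The abstract does not spell out the paper's two negative sampling methods, so the sketch below uses a standard InfoNCE-style loss over an explicit list of negatives; all names, values, and the temperature are illustrative assumptions, not the paper's implementation.

```python
import math

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def info_nce(anchor, positive, negatives, temperature=0.05):
    """InfoNCE loss: negative log-softmax of the positive's similarity
    against all negatives' similarities; lower loss = better separation."""
    logits = [cosine(anchor, positive) / temperature] + \
             [cosine(anchor, n) / temperature for n in negatives]
    m = max(logits)                                  # numerical stability
    denom = sum(math.exp(l - m) for l in logits)
    return -math.log(math.exp(logits[0] - m) / denom)

anchor = [1.0, 0.0]
good = info_nce(anchor, [0.9, 0.1], [[-1.0, 0.0], [0.0, 1.0]])
bad  = info_nce(anchor, [-1.0, 0.0], [[0.9, 0.1], [0.0, 1.0]])
print(good < bad)  # True: loss is small when the positive is the closest sample
```

The design choice being illustrated: by making the loss depend on how the positive ranks against the negatives, the encoder is rewarded for representations in which true and corrupted triples are well separated, which is exactly the ability the abstract says the two sampling methods improve.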

Bibliographic Details
Published in: Jisuanji gongcheng (Computer Engineering)
Main Authors: ZHANG Hongchen, LI Linyu, YANG Li, SAN Chenjun, YIN Chunlin, YAN Bing, YU Hong, ZHANG Xuan
Format: Article
Language: English
Published: Editorial Office of Computer Engineering, 2024-04-01
Subjects: knowledge graph completion (KGC); knowledge graph; contrastive learning; pretrained language model; link prediction
Online Access: https://www.ecice06.com/fileup/1000-3428/PDF/20240419.pdf
Collection: DOAJ (Directory of Open Access Journals)
Description: A knowledge graph is a structured knowledge base comprising various types of knowledge or data units obtained through extraction and other processes; it describes and represents information such as entities, concepts, facts, and relationships. The limitations of Natural Language Processing (NLP) technology and the noise present in the texts of these knowledge units affect the accuracy of information extraction. Existing Knowledge Graph Completion (KGC) methods typically consider only structural information or only textual semantic information, disregarding the structural and textual semantics of the knowledge graph as a whole. Hence, a KGC model based on contrastive learning and language model-enhanced embedding is proposed. The input entities and relations are encoded with a pretrained language model to capture their textual semantic information, while the distance scoring function of a translation model captures the structural information of the knowledge graph. Two negative sampling methods are fused into contrastive-learning training to improve the model's ability to distinguish positive from negative samples. Experimental results show that, compared with the Bidirectional Encoder Representations from Transformers for Knowledge Graph completion (KG-BERT) model, the proposed model improves the Hits@10 metric, i.e., the average proportion of triples whose correct entity is ranked within the top 10, by 31% and 23% on the WN18RR and FB15K-237 datasets, respectively, demonstrating its superiority over similar models.
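Hits@10, the metric quoted in the abstract, is the fraction of test triples for which the model ranks the correct entity within the top 10 candidates. A minimal sketch of the computation (ranks are 1-based, and the values below are made up for illustration):

```python
def hits_at_k(ranks, k=10):
    """Fraction of test triples whose true entity is ranked in the top k.
    `ranks` lists the 1-based rank of the correct entity per test triple."""
    return sum(1 for rank in ranks if rank <= k) / len(ranks)

# Illustrative ranks for four test triples
print(hits_at_k([1, 4, 10, 57]))  # 0.75: three of four fall in the top 10
```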
ISSN: 1000-3428
Volume/Issue/Pages: Vol. 50, No. 4 (2024-04-01), pp. 168-176
DOI: 10.19678/j.issn.1000-3428.0067543
Author Affiliations: 1. Policy Research and Enterprise Management Department, Yunnan Power Grid Co., Ltd., Kunming 650032, Yunnan, China; 2. School of Software, Yunnan University, Kunming 650091, Yunnan, China; 3. Electric Power Research Institute, Yunnan Power Grid Co., Ltd., Kunming 650217, Yunnan, China; 4. Key Laboratory of Software Engineering of Yunnan Province, Kunming 650091, Yunnan, China; 5. Engineering Research Center of Cyberspace, Kunming 650091, Yunnan, China
Keywords: knowledge graph completion (KGC); knowledge graph; contrastive learning; pretrained language model; link prediction