Unsupervised Word Embedding Learning by Incorporating Local and Global Contexts

Word embedding has benefited a broad spectrum of text analysis tasks by learning distributed word representations to encode word semantics. Word representations are typically learned by modeling local contexts of words, assuming that words sharing similar surrounding words are semantically close. We...
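As a minimal illustration of the local-context paradigm mentioned in the abstract (not the authors' proposed method), the sketch below trains skip-gram embeddings with gensim; the toy corpus and all parameter values are assumptions chosen only for demonstration.

    # Minimal sketch of local-context (skip-gram) embedding learning.
    # Assumes gensim is installed; corpus and parameters are illustrative only.
    from gensim.models import Word2Vec

    # Toy corpus: each sentence is a list of tokens (hypothetical data).
    sentences = [
        ["word", "embedding", "encodes", "word", "semantics"],
        ["words", "with", "similar", "surrounding", "words", "are", "close"],
    ]

    # sg=1 selects skip-gram; window=5 defines the local context size.
    model = Word2Vec(sentences, vector_size=50, window=5, min_count=1, sg=1, epochs=50)

    # Words that share local contexts end up with nearby vectors.
    print(model.wv.most_similar("word"))

Such local-context models capture only surrounding-word co-occurrence; the article's abstract contrasts this with approaches that also incorporate global context.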


Bibliographic Details
Main Authors: Yu Meng, Jiaxin Huang, Guangyuan Wang, Zihan Wang, Chao Zhang, Jiawei Han
Format: Article
Language: English
Published: Frontiers Media S.A. 2020-03-01
Series: Frontiers in Big Data
Online Access: https://www.frontiersin.org/article/10.3389/fdata.2020.00009/full