Improving Word Embedding Using Variational Dropout


Detailed Description

Bibliographic Details
Published in: Proceedings of the International Florida Artificial Intelligence Research Society Conference
Main Authors: Zainab Albujasim, Diana Inkpen, Xuejun Han, Yuhong Guo
Format: Article
Language: English
Published: LibraryPress@UF, 2023-05-01
Online Access: https://journals.flvc.org/FLAIRS/article/view/133326
Other Bibliographic Details
Summary: Pre-trained word embeddings are essential in natural language processing (NLP). In recent years, many post-processing algorithms have been proposed to improve pre-trained word embeddings. We present a novel method, the Orthogonal Auto Encoder with Variational Dropout (OAEVD), for improving word embeddings based on orthogonal autoencoders and variational dropout. Specifically, the orthogonality constraint encourages more diversity in the latent space and increases semantic similarity between similar words, while variational dropout makes the embeddings more robust to overfitting. Empirical evaluation on a range of downstream NLP tasks, including semantic similarity, text classification, and concept categorization, shows that the proposed method effectively improves the quality of pre-trained word embeddings. Moreover, it successfully reduces the dimensionality of pre-trained word embeddings while maintaining high performance.
ISSN: 2334-0754, 2334-0762
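
The abstract names two mechanisms: an autoencoder whose encoder is pushed toward orthogonality (for a diverse latent space), and variational dropout (for robustness to overfitting). As a rough illustration only, and not the authors' released code, the PyTorch sketch below combines a linear autoencoder, a soft orthogonality penalty ||W W^T - I||_F^2 on the encoder weights, and multiplicative Gaussian ("variational") dropout in the style of Molchanov et al. (2017). All layer sizes, loss weights, and names here are assumptions for illustration.

    # Minimal sketch of an orthogonal autoencoder with variational dropout.
    # Hyperparameters (300 -> 150 dims, penalty weights) are illustrative only.
    import torch
    import torch.nn as nn

    class VariationalDropout(nn.Module):
        """Multiplicative Gaussian noise with a learnable per-unit variance."""

        def __init__(self, dim):
            super().__init__()
            self.log_alpha = nn.Parameter(torch.full((dim,), -3.0))

        def forward(self, x):
            if not self.training:
                return x
            # x * (1 + sqrt(alpha) * eps), with eps ~ N(0, 1)
            alpha = self.log_alpha.exp()
            return x * (1.0 + alpha.sqrt() * torch.randn_like(x))

        def kl(self):
            # Approximate KL term from Molchanov et al. (2017), up to a constant.
            k1, k2, k3 = 0.63576, 1.87320, 1.48695
            la = self.log_alpha
            neg_kl = k1 * torch.sigmoid(k2 + k3 * la) - 0.5 * nn.functional.softplus(-la) - k1
            return -neg_kl.sum()

    class OrthogonalAE(nn.Module):
        def __init__(self, in_dim=300, latent_dim=150):
            super().__init__()
            self.enc = nn.Linear(in_dim, latent_dim, bias=False)
            self.drop = VariationalDropout(latent_dim)
            self.dec = nn.Linear(latent_dim, in_dim, bias=False)

        def forward(self, x):
            z = self.drop(self.enc(x))
            return self.dec(z), z

        def ortho_penalty(self):
            # || W W^T - I ||_F^2 pushes the encoder rows toward orthonormality,
            # i.e. a non-redundant latent space.
            W = self.enc.weight
            I = torch.eye(W.size(0), device=W.device)
            return ((W @ W.t() - I) ** 2).sum()

    # Toy usage on random vectors; real input would be pre-trained embeddings.
    model = OrthogonalAE()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    x = torch.randn(64, 300)
    opt.zero_grad()
    recon, z = model(x)
    loss = ((recon - x) ** 2).mean() + 1e-3 * model.ortho_penalty() + 1e-4 * model.drop.kl()
    loss.backward()
    opt.step()

In a sketch like this, choosing latent_dim smaller than in_dim (150 vs. 300 above) is what would realize the dimensionality reduction the abstract mentions, with the encoder output z serving as the compressed embedding.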