Lexicon-Enhanced LSTM With Attention for General Sentiment Analysis

Long short-term memory networks (LSTMs) have achieved strong performance in sentiment analysis tasks. The common approach is to use LSTMs to combine word embeddings into a text representation. However, word embeddings carry semantic rather than sentiment information, so representing words by word embeddings alone is inaccurate for sentiment analysis. To address this problem, we propose a lexicon-enhanced LSTM model. The model first uses a sentiment lexicon as extra information to pre-train a word sentiment classifier, and then obtains sentiment embeddings for words, including words not in the lexicon. Combining a word's sentiment embedding with its word embedding makes the word representation more accurate. Furthermore, we define a new method to compute the attention vector for general sentiment analysis without a target, which improves the LSTM's ability to capture global sentiment information. Experiments on English and Chinese datasets show that our models achieve comparable or better results than existing models.
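The abstract describes two mechanisms: concatenating each word embedding with a sentiment embedding before feeding it to the LSTM, and computing an attention-weighted sum of the LSTM hidden states as the text representation. A minimal pure-Python sketch of those two ideas follows; it is not the authors' implementation, and all names, dimensions, and values are illustrative.

```python
import math

def softmax(scores):
    """Numerically stable softmax over a list of attention scores."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def enhance(word_emb, sent_emb):
    """Concatenate a word embedding with its sentiment embedding,
    as described in the abstract (list concatenation here)."""
    return word_emb + sent_emb

def attend(hidden_states, attention_vector):
    """Score each hidden state by its dot product with the attention
    vector, softmax the scores, and return the weighted sum."""
    scores = [sum(h_i * a_i for h_i, a_i in zip(h, attention_vector))
              for h in hidden_states]
    weights = softmax(scores)
    dim = len(hidden_states[0])
    return [sum(w * h[i] for w, h in zip(weights, hidden_states))
            for i in range(dim)]

# Illustrative 2-d word embedding and 1-d sentiment embedding for "good".
word_emb = [0.3, -0.1]   # semantic information
sent_emb = [0.9]         # polarity score from a pre-trained word sentiment classifier
x = enhance(word_emb, sent_emb)  # 3-d input vector for the LSTM

# Illustrative LSTM hidden states for a 3-token sentence.
hidden = [[0.2, 0.1], [0.8, 0.4], [0.1, 0.0]]
attention_vector = [1.0, 0.5]    # learned in the paper; fixed here for illustration
rep = attend(hidden, attention_vector)  # text representation for the classifier
```

In the paper the attention vector is learned rather than fixed, and the enhanced vectors pass through the LSTM before attention is applied; the sketch only shows the shapes of the two operations.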


Bibliographic Details
Main Authors: Xianghua Fu, Jingying Yang, Jianqiang Li, Min Fang, Huihui Wang
Format: Article
Language: English
Published: IEEE, 2018-01-01
Series: IEEE Access
Subjects: Sentiment lexicon, sentiment embedding, word embedding, attention vector, sentiment analysis
Online Access: https://ieeexplore.ieee.org/document/8513826/
id: doaj-9016b5090dae45578fe2efb4ba790e6a
record_format: Article
DOI: 10.1109/ACCESS.2018.2878425
ISSN: 2169-3536
Published in: IEEE Access, vol. 6, 2018, pp. 71884-71891 (article no. 8513826)
Author affiliations:
Xianghua Fu (ORCID: 0000-0003-4431-3386), Faculty of Arts and Sciences, Shenzhen Technology University, Shenzhen, China
Jingying Yang, College of Computer Science and Software Engineering, Shenzhen University, Shenzhen, China
Jianqiang Li, College of Computer Science and Software Engineering, Shenzhen University, Shenzhen, China
Min Fang, Experimental and Creative Practice Education Center, Harbin Institute of Technology, Shenzhen, China
Huihui Wang, Department of Engineering, Jacksonville University, Jacksonville, FL, USA