Multiple Interactive Attention Networks for Aspect-Based Sentiment Classification

Aspect-Based (also known as aspect-level) Sentiment Classification (ABSC) aims at determining the sentiment polarity of a particular target in a sentence. With the successful application of attention networks in multiple fields, attention-based ABSC has aroused great interest. However, most previous methods are difficult to parallelize and do not sufficiently obtain and fuse the interactive information between the target and its context. In this paper, we propose a Multiple Interactive Attention Network (MIN). First, we use the Bidirectional Encoder Representations from Transformers (BERT) model to pre-process the data. Then, we use a partial transformer to obtain the hidden states in parallel. Finally, we take the target words and the context words as the core to obtain and fuse the interactive information. Experimental results on different datasets show that our model is more effective than previous methods.
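
To make the pipeline sketched in the abstract concrete, the following is a minimal PyTorch sketch of the described steps: pre-trained embeddings feed a transformer encoder layer (standing in for the "partial transformer", which processes all positions in parallel), and an interactive attention module lets the target and the context attend to each other before the fused representation is classified. The class names (MINSketch, InteractiveAttention), the dimensions, the use of nn.TransformerEncoderLayer, and the random tensors standing in for BERT embeddings are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of the pipeline described in the abstract (not the authors' code).
# BERT embeddings are replaced by random tensors so the example is self-contained.
import torch
import torch.nn as nn
import torch.nn.functional as F


class InteractiveAttention(nn.Module):
    """Attend each sequence over the other and pool the two views."""

    def __init__(self, hidden_dim):
        super().__init__()
        self.scale = hidden_dim ** -0.5

    def forward(self, context, target):
        # context: (batch, n, d), target: (batch, m, d)
        scores = torch.bmm(target, context.transpose(1, 2)) * self.scale        # (batch, m, n)
        t2c = torch.bmm(F.softmax(scores, dim=-1), context)                     # target-aware context
        c2t = torch.bmm(F.softmax(scores.transpose(1, 2), dim=-1), target)      # context-aware target
        # mean-pool each view and fuse by concatenation
        return torch.cat([t2c.mean(dim=1), c2t.mean(dim=1)], dim=-1)            # (batch, 2d)


class MINSketch(nn.Module):
    def __init__(self, hidden_dim=128, num_heads=4, num_classes=3):
        super().__init__()
        # "Partial transformer": a single encoder layer that processes all
        # positions in parallel (no recurrence).
        self.encoder = nn.TransformerEncoderLayer(
            d_model=hidden_dim, nhead=num_heads, batch_first=True)
        self.interact = InteractiveAttention(hidden_dim)
        self.classifier = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, context_emb, target_emb):
        # In the paper these embeddings would come from a pre-trained BERT model.
        h_context = self.encoder(context_emb)
        h_target = self.encoder(target_emb)
        fused = self.interact(h_context, h_target)
        return self.classifier(fused)  # sentiment logits


if __name__ == "__main__":
    model = MINSketch()
    context = torch.randn(2, 20, 128)  # stand-in for BERT context embeddings
    target = torch.randn(2, 3, 128)    # stand-in for BERT aspect-term embeddings
    print(model(context, target).shape)  # torch.Size([2, 3])
```

Fusing the two pooled attention views by concatenation is only one simple choice here; the paper's actual fusion mechanism may differ.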

Bibliographic Details
Main Authors: Dianyuan Zhang, Zhenfang Zhu, Qiang Lu, Hongli Pei, Wenqing Wu, Qiangqiang Guo
Affiliation: School of Information Science and Electrical Engineering, Shandong Jiao Tong University, Jinan 250357, China (all authors)
Format: Article
Language: English
Published: MDPI AG, 2020-03-01
Series: Applied Sciences, Vol. 10, Iss. 6, Article 2052
ISSN: 2076-3417
DOI: 10.3390/app10062052
Subjects: pre-trained BERT; natural language processing; aspect-based sentiment classification; attention mechanism
Online Access: https://www.mdpi.com/2076-3417/10/6/2052