Data Analysis and Knowledge Discovery  2021, Vol. 5 Issue (12): 37-47    DOI: 10.11925/infotech.2096-3467.2021.0554
Aspect-Level Sentiment Analysis Based on BAGCNN
Yu Bengong1,2, Zhang Shuwen1
1School of Management, Hefei University of Technology, Hefei 230009, China
2Key Laboratory of Process Optimization & Intelligent Decision-making (Hefei University of Technology),Ministry of Education, Hefei 230009, China
Abstract

[Objective] This paper proposes a BERT-based Attention Gated Convolutional Neural Network model (BAGCNN), aiming to improve on traditional aspect-level sentiment analysis algorithms. [Methods] First, the pre-trained BERT model generates feature representations for the texts and aspect words. Then, a multi-head self-attention mechanism addresses the long-distance dependencies between aspect words and their context. Finally, a gated convolutional neural network with parallel branches selectively extracts multi-level context features conditioned on the aspect words. [Results] Compared with the benchmark model, the accuracy of the new model improved by 4.24, 4.01 and 3.89 percentage points on the Restaurant, Laptop and Twitter datasets, respectively, while the downstream parallel structure of the model was 1.27 MB smaller. [Limitations] The proposed model did not work well on datasets whose text lengths differ significantly. [Conclusions] The BAGCNN model effectively filters out context information irrelevant to the aspect words.
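The gating step described in the abstract follows the Gated Tanh-ReLU Units popularized by GCAE (Xue & Li, reference [20]): a tanh path produces candidate sentiment features while a ReLU path, conditioned on the aspect embedding, gates out context unrelated to the aspect. The paper's exact equations are not reproduced on this page, so the following numpy sketch is illustrative only; all names, dimensions, and the scalar `aspect_bias` simplification are assumptions.

```python
import numpy as np

def gated_conv1d(x, w_s, w_a, aspect_bias):
    """GTRU-style gated convolution over a token sequence (illustrative).

    x           : (seq_len, d) token representations (e.g. from BERT)
    w_s         : (k, d) filter for the sentiment-feature path
    w_a         : (k, d) filter for the aspect-gate path
    aspect_bias : scalar conditioning term derived from the aspect embedding
    Returns     : (seq_len - k + 1,) gated feature map
    """
    k = w_s.shape[0]
    n = x.shape[0] - k + 1
    out = np.empty(n)
    for i in range(n):
        window = x[i:i + k]
        s = np.tanh(np.sum(window * w_s))                 # candidate feature
        a = max(0.0, np.sum(window * w_a) + aspect_bias)  # ReLU aspect gate
        out[i] = s * a  # gate zeroes features unrelated to the aspect
    return out

# Toy usage with random weights
rng = np.random.default_rng(0)
x = rng.normal(size=(10, 4))   # 10 tokens, 4-dim embeddings
w_s = rng.normal(size=(3, 4))  # kernel size 3
w_a = rng.normal(size=(3, 4))
feats = gated_conv1d(x, w_s, w_a, aspect_bias=0.5)
print(feats.shape)  # (8,)
```

Because the gate is a ReLU, positions where the aspect-conditioned activation is non-positive are suppressed to exactly zero, which is how the model can discard context irrelevant to the aspect word.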

Key words: Aspect-Level Sentiment Analysis; BERT; Multi-Head Self-Attention Mechanism; Gated Convolution Neural Network
Received: 13 June 2021      Published: 20 January 2022
ZTFLH:  TP391  
Fund: National Natural Science Foundation of China (72071061, 71671057)
Corresponding Author: Yu Bengong, ORCID: 0000-0003-4170-2335, E-mail: bgyu@hfut.edu.cn

Cite this article:

Yu Bengong, Zhang Shuwen. Aspect-Level Sentiment Analysis Based on BAGCNN. Data Analysis and Knowledge Discovery, 2021, 5(12): 37-47.

URL:

https://manu44.magtech.com.cn/Jwk_infotech_wk3/EN/10.11925/infotech.2096-3467.2021.0554     OR     https://manu44.magtech.com.cn/Jwk_infotech_wk3/EN/Y2021/V5/I12/37

Aspect-Level Sentiment Analysis Model Based on BAGCNN
Dataset      Type    Positive  Neutral  Negative
Restaurant   train   2,164     637      807
Restaurant   test    728       196      196
Laptop       train   994       464      879
Laptop       test    341       169      128
Twitter      train   1,561     3,127    1,560
Twitter      test    173       346      173
Dataset Statistics
Parameter       Value
dropout rate    0.1
batch size      32
epochs          6
optimizer       Adam
learning rate   2e-5
Hyperparameter Setting
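For readers reproducing the setup, the table above maps directly onto a training configuration. The values are taken from the table; the key names themselves are illustrative, not from the paper.

```python
# Training configuration matching the Hyperparameter Setting table.
config = {
    "dropout_rate": 0.1,
    "batch_size": 32,
    "epochs": 6,
    "optimizer": "Adam",
    "learning_rate": 2e-5,  # a common fine-tuning rate for BERT-based models
}
print(config["learning_rate"])  # 2e-05
```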
Model            Restaurant        Laptop            Twitter
                 Acc      F1       Acc      F1       Acc      F1
GCAE             77.32    63.78    66.04    57.35    72.16    69.87
AGCN             78.84    68.78    72.10    67.52    70.28    68.56
Mul-AT-CNN       79.46*   -        75.39*   -        71.25*   -
AOA              79.46    68.80    73.51    68.18    71.96    69.91
IAN              79.46    69.75    72.47    67.18    69.85    68.27
MAN              80.71*   70.95*   74.13*   71.93*   72.12*   70.13*
ASGCN            81.28    71.74    74.81    70.74    72.40    70.68
BERT-SPC         83.02    74.55    78.06    73.61    73.12    71.83
BERT-AEN         81.98    72.16    77.18    73.83    72.98    71.74
TD-BERT          84.11    76.96    77.43    73.54    75.43    74.38
Our (BAGCNN)     84.95    77.90    78.14    74.14    76.01    74.61
Model Comparison Results (%)
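The abstract reports accuracy gains of 4.24, 4.01 and 3.89 percentage points over "the benchmark model". Those figures match the accuracy columns for MAN in the table above exactly, so MAN appears to be the benchmark referred to; this can be checked with simple arithmetic:

```python
# Accuracy values copied from the Model Comparison Results table (%)
bagcnn = {"Restaurant": 84.95, "Laptop": 78.14, "Twitter": 76.01}
man = {"Restaurant": 80.71, "Laptop": 74.13, "Twitter": 72.12}

gains = {d: round(bagcnn[d] - man[d], 2) for d in bagcnn}
print(gains)  # {'Restaurant': 4.24, 'Laptop': 4.01, 'Twitter': 3.89}
```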
Influence of Convolution Kernel Size on the Model
Influence of Different Word Embedding Methods on the Model
Influence of Different Feature Extraction Methods on the Model
Model                Params (×10^6)   Memory (MB)
ATAE-LSTM            2.53             13.76
AOA                  2.89             15.15
IAN                  2.17             12.40
GloVe-LSTM-ATT       2.07             16.90
AEN-GloVe            1.16             11.04
ASGCN                0.45             5.84
AGCN                 1.36             13.42
GCAE                 0.82             11.38
Our (GloVe-AGCNN)    1.02             11.13
Model Sizes
The Effect of Gated Convolution
Text 1. "Even when the chef is not in the house, the food and service are right on target." Aspects: chef (O), food (P), service (P)
    BERT-SPC: 1 aspect misclassified
    BERT-AEN: 2 aspects misclassified
    TD-BERT: 1 aspect misclassified
    Our (BAGCNN): all aspects correct
Text 2. "Food was average and creme brulee was awful - the sugar was charred, not caramelized and smelled of kerosene." Aspects: Food (O), brulee (N), sugar (N)
    BERT-SPC: 1 aspect misclassified
    BERT-AEN: 1 aspect misclassified
    TD-BERT: 1 aspect misclassified
    Our (BAGCNN): 1 aspect misclassified
Text 3. "The food is uniformly exceptional, with a very capable kitchen which will proudly whip up whatever you feel like eating, whether it's on the menu or not." Aspects: food (P), kitchen (P), menu (O)
    BERT-SPC: 1 aspect misclassified
    BERT-AEN: 1 aspect misclassified
    TD-BERT: all aspects correct
    Our (BAGCNN): all aspects correct
Prediction Results of the Models (P: positive, N: negative, O: neutral)
[1] Zhao Mingqing, Wu Shengqiang. Research on Stock Market Weighted Prediction Method Based on Micro-blog Sentiment Analysis[J]. Data Analysis and Knowledge Discovery, 2019, 3(2): 43-51.
[2] Li Tiejun, Yan Duanwu, Yang Xiongfei. Recommending Microblogs Based on Emotion-Weighted Association Rules[J]. Data Analysis and Knowledge Discovery, 2020, 4(4): 27-33.
[3] Zeng Ziming, Wan Pinyu. Sentiment Analysis of Public Safety Events in Micro-blog Based on Double-layered Attention and Bi-LSTM[J]. Information Science, 2019, 37(6): 23-29.
[4] Kiritchenko S, Zhu X, Cherry C, et al. NRC-Canada-2014: Detecting Aspects and Sentiment in Customer Reviews [C]//Proceedings of the 8th International Workshop on Semantic Evaluation (SemEval 2014). 2014: 437-442.
[5] Mikolov T, Sutskever I, Chen K, et al. Distributed Representations of Words and Phrases and Their Compositionality [C]//Proceedings of the 26th International Conference on Neural Information Processing Systems. 2013: 3111-3119.
[6] Pennington J, Socher R, Manning C D. GloVe: Global Vectors for Word Representation [C]//Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP). 2014: 1532-1543.
[7] Devlin J, Chang M W, Lee K, et al. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding [C]//Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. 2019: 4171-4186.
[8] Peters M E, Neumann M, Iyyer M, et al. Deep Contextualized Word Representations [C]//Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. 2018: 2227-2237.
[9] Hoang M, Bihorac O A, Rouces J. Aspect-Based Sentiment Analysis Using BERT [C]//Proceedings of the 22nd Nordic Conference on Computational Linguistics. 2019: 187-196.
[10] Gao Z, Feng A, Song X, et al. Target-dependent Sentiment Classification with BERT[J]. IEEE Access, 2019, 7: 154290-154299.
[11] Xu Q, Zhu L, Dai T, et al. Aspect-Based Sentiment Classification with Multi-Attention Network[J]. Neurocomputing, 2020, 388: 135-143.
doi: 10.1016/j.neucom.2020.01.024
[12] Hu M, Zhao S, Zhang L, et al. CAN: Constrained Attention Networks for Multi-Aspect Sentiment Analysis [C]//Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP). 2019: 4601-4610.
[13] Song Y, Wang J, Jiang T, et al. Targeted Sentiment Classification with Attentional Encoder Network [C]//Proceedings of the 28th International Conference on Artificial Neural Networks. Springer, Cham, 2019: 93-103.
[14] Zhao F, Wu Z, Dai X. Attention Transfer Network for Aspect-level Sentiment Classification [C]//Proceedings of the 28th International Conference on Computational Linguistics. 2020: 811-821.
[15] Zhang C, Li Q, Song D. Aspect-based Sentiment Classification with Aspect-specific Graph Convolutional Networks [C]//Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP). 2019: 4568-4578.
[16] Chen C, Teng Z, Zhang Y. Inducing Target-Specific Latent Structures for Aspect Sentiment Classification [C]//Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP). 2020: 5596-5607.
[17] Wang K, Shen W, Yang Y, et al. Relational Graph Attention Network for Aspect-based Sentiment Analysis [C]//Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics. 2020: 3229-3238.
[18] Kipf T N, Welling M. Semi-Supervised Classification with Graph Convolutional Networks[OL]. arXiv Preprint, arXiv: 1609.02907.
[19] Veličković P, Cucurull G, Casanova A, et al. Graph Attention Networks[OL]. arXiv Preprint, arXiv: 1710.10903.
[20] Xue W, Li T. Aspect Based Sentiment Analysis with Gated Convolutional Networks [C]//Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics. 2018: 2514-2523.
[21] Phan M H, Ogunbona P O. Modelling Context and Syntactical Features for Aspect-Based Sentiment Analysis [C]//Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics. 2020: 3211-3220.
[22] Zhang S, Xu X, Pang Y, et al. Multi-layer Attention Based CNN for Target-Dependent Sentiment Classification[J]. Neural Processing Letters, 2020, 51(3): 2089-2103.
doi: 10.1007/s11063-019-10017-9
[23] Cao Weidong, Li Jiaqi, Wang Huaichao. Analysis of Targeted Sentiment by the Attention Gated Convolutional Network Model[J]. Journal of Xidian University, 2019, 46(6): 30-36.
[24] Liu N, Shen B. Aspect-based Sentiment Analysis with Gated Alternate Neural Network[J]. Knowledge-Based Systems, 2020, 188: 105010.
doi: 10.1016/j.knosys.2019.105010
[25] Vaswani A, Shazeer N, Parmar N, et al. Attention is All You Need [C]// Proceedings of the 31st International Conference on Neural Information Processing Systems. 2017: 6000-6010.
[26] Dong L, Wei F, Tan C, et al. Adaptive Recursive Neural Network for Target-Dependent Twitter Sentiment Classification [C]//Proceedings of the 52nd Annual Meeting of the Association for Computational Linguistics. 2014: 49-54.
[27] Huang B, Ou Y, Carley K M. Aspect Level Sentiment Classification with Attention-over-Attention Neural Networks [C]//Proceedings of the 11th International Conference on Social Computing, Behavioral-Cultural Modeling and Prediction and Behavior Representation in Modeling and Simulation. Springer, Cham, 2018: 197-206.
[28] Ma D, Li S, Zhang X, et al. Interactive Attention Networks for Aspect-Level Sentiment Classification [C]//Proceedings of the 26th International Joint Conference on Artificial Intelligence. 2017: 4068-4074.
[29] Jiang N, Tian F, Li J, et al. MAN: Mutual Attention Neural Networks Model for Aspect-Level Sentiment Classification in SIoT[J]. IEEE Internet of Things Journal, 2020, 7(4): 2901-2913.
[30] Wang Y, Huang M, Zhu X, et al. Attention-based LSTM for Aspect-Level Sentiment Classification [C]//Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing. 2016: 606-615.