Review of Attention Mechanism in Natural Language Processing

Shi Lei, Wang Yi, Cheng Ying, Wei Ruibin

Table 2 The Performance of Aspect-Level Sentiment Analysis Models
| Author | Model | Restaurant acc. (%) | Laptop acc. (%) | Twitter acc. (%) | Attention mechanism |
|---|---|---|---|---|---|
| Wang et al. [32] | LSTM | 74.3 | 66.5 | 66.5 | None |
| Tang et al. [33] | TD-LSTM | 75.6 | 68.1 | 70.8 | Contextualized attention |
| Wang et al. [32] | ATAE-LSTM | 77.2 | 68.7 | - | Aspect-embedding attention |
| Ma et al. [21] | IAN | 78.6 | 72.1 | - | Coarse-grained interactive attention |
| Liu et al. [34] | BiLSTM-ATT-G | 79.7 | 73.1 | 70.4 | Contextualized attention |
| Huang et al. [35] | AOA-LSTM | 81.2 | 74.5 | - | Fine-grained bidirectional attention |
| Fan et al. [36] | MGAN | 81.2 | 75.4 | 72.5 | Multi-grained bidirectional attention |
| Zheng et al. [37] | LCR-Rot | 81.3 | 75.2 | 72.7 | Contextualized coarse-grained bidirectional attention |
| Li et al. [38] | HAPN | 82.2 | 77.3 | - | Hierarchical attention |
| Song et al. [39] | AEN-BERT | 83.1 | 80.0 | 74.7 | Multi-head self-attention network |

Note: values are sentiment-polarity classification accuracy (%) on the Restaurant, Laptop, and Twitter datasets; "-" indicates that no result was reported.
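Several of the mechanisms in Table 2 share a common core: each hidden state of the sentence encoder is scored against an aspect representation, and the softmax-normalized scores weight the final sentence vector. The following is a minimal NumPy sketch of aspect-embedding attention in the ATAE-LSTM style; the parameter names (W, v), shapes, and toy inputs are illustrative assumptions, not the exact parameterization of any cited model.

```python
import numpy as np

def softmax(x):
    x = x - x.max()                    # subtract max for numerical stability
    e = np.exp(x)
    return e / e.sum()

def aspect_attention(H, a, W, v):
    """H: (n, d) encoder hidden states; a: (k,) aspect embedding;
    W: (m, d + k) projection matrix; v: (m,) scoring vector.
    Returns the attention-weighted sentence vector and the weights."""
    n = H.shape[0]
    A = np.tile(a, (n, 1))             # append the aspect to every time step
    M = np.tanh(np.concatenate([H, A], axis=1) @ W.T)   # (n, m)
    alpha = softmax(M @ v)             # (n,) attention weights over tokens
    return alpha @ H, alpha            # (d,) sentence vector, (n,) weights

# Toy example with random parameters (hypothetical sizes).
rng = np.random.default_rng(0)
n, d, k, m = 6, 8, 8, 10
H = rng.standard_normal((n, d))
a = rng.standard_normal(k)
W = rng.standard_normal((m, d + k))
v = rng.standard_normal(m)
s, alpha = aspect_attention(H, a, W, v)
print(alpha.round(3), alpha.sum())     # weights are non-negative and sum to 1
```

Conditioning the scores on the aspect embedding is what lets the same sentence yield different attention distributions, and hence different polarity predictions, for different aspects; the bidirectional and multi-grained variants in the table extend this idea by also attending from the aspect words back to the context.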