Data Analysis and Knowledge Discovery  2022, Vol. 6 Issue (4): 120-129    DOI: 10.11925/infotech.2096-3467.2021.0884
Identifying Metaphor with Transformer and Graph Convolutional Network
Guo Fanrong, Huang Xiaoxi, Wang Rongbo, Chen Zhiqun, Hu Chuang, Xie Yimin, Si Boyu
Institute of Cognitive and Intelligent Computing, Hangzhou Dianzi University, Hangzhou 310018, China
Abstract  

[Objective] This paper proposes a metaphor identification model based on a graph convolutional neural network and the Transformer, aiming to effectively detect metaphorical expressions consisting of multiple words. [Methods] We used a graph convolutional neural network to extract structural information from the syntactic dependency tree. We then fused this structural information with deep semantic representations via the Transformer. Finally, we computed the probability that each target word is used metaphorically with a Softmax layer. [Results] Compared with existing algorithms, the F1 scores of our model increased by 1.9% and 1.7% on the VUA VERB and VUA ALL POS datasets, by 1.1% and 1.9% on TOEFL VERB and TOEFL ALL POS, and by 1.2% on the Chinese CCL dataset. [Limitations] When a sentence contains ambiguity or unclear referential information, the model cannot effectively identify the metaphorical expressions. [Conclusions] The graph convolutional network and the syntactic dependency tree enrich the semantics of target words, which improves the recognition of both single-word and multi-word metaphors.
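As a reading aid, the pipeline the abstract describes can be sketched in PyTorch. This is a minimal illustration under stated assumptions, not the authors' released implementation: the module names, the choice of roberta-base, and the exact fusion order are guesses consistent with the abstract and the hyperparameter table (Table 1) below.

```python
# A minimal sketch of the abstract's pipeline: RoBERTa encoding, a 2-layer
# GCN over the dependency-tree adjacency matrix, fusion via multi-head
# attention, and a Softmax over the target word. Names are illustrative.
import torch
import torch.nn as nn
from transformers import RobertaModel

class GCNLayer(nn.Module):
    """One graph convolution: H' = ReLU(A_hat @ H @ W)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, h, adj):
        # adj: (batch, seq, seq) normalized adjacency from the dependency tree
        return torch.relu(adj @ self.linear(h))

class MetaphorClassifier(nn.Module):
    def __init__(self, num_labels=2):
        super().__init__()
        self.encoder = RobertaModel.from_pretrained("roberta-base")  # 12 layers, 12 heads
        dim = self.encoder.config.hidden_size  # 768
        self.gcn1 = GCNLayer(dim, 512)   # first GCN hidden dim (Table 1)
        self.gcn2 = GCNLayer(512, 256)   # second GCN hidden dim (Table 1)
        self.fuse = nn.MultiheadAttention(embed_dim=dim + 256, num_heads=8,
                                          batch_first=True)
        self.classifier = nn.Linear(dim + 256, num_labels)

    def forward(self, input_ids, attention_mask, adj, target_index):
        h = self.encoder(input_ids, attention_mask=attention_mask).last_hidden_state
        g = self.gcn2(self.gcn1(h, adj), adj)   # syntactic structure features
        z = torch.cat([h, g], dim=-1)           # semantic + structural features
        z, _ = self.fuse(z, z, z)               # Transformer-style fusion
        target = z[torch.arange(z.size(0)), target_index]
        return torch.softmax(self.classifier(target), dim=-1)
```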

Key words: Metaphor Identification; Graph Convolutional Neural Network; Syntactic Dependency; Transformer
Received: 23 August 2021      Published: 12 May 2022
Chinese Library Classification (ZTFLH): TP391
Fund: Humanities and Social Sciences Research Program Funds from the Ministry of Education of China (18YJA740016); National Social Science Fund of China (18ZDA290)
Corresponding Author: Huang Xiaoxi, ORCID: 0000-0003-4483-3664, E-mail: huangxx@hdu.edu.cn

Cite this article:

Guo Fanrong, Huang Xiaoxi, Wang Rongbo, Chen Zhiqun, Hu Chuang, Xie Yimin, Si Boyu. Identifying Metaphor with Transformer and Graph Convolutional Network. Data Analysis and Knowledge Discovery, 2022, 6(4): 120-129.

URL:

https://manu44.magtech.com.cn/Jwk_infotech_wk3/EN/10.11925/infotech.2096-3467.2021.0884     OR     https://manu44.magtech.com.cn/Jwk_infotech_wk3/EN/Y2022/V6/I4/120

Fig. 1  Metaphor Recognition Model Based on Transformer and Graph Convolutional Neural Network
Fig. 2  Dependency Structure
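Fig. 2 shows the syntactic dependency structure that feeds the GCN. Below is a minimal sketch of turning a dependency parse into the normalized adjacency matrix a GCN consumes; spaCy is used here as a stand-in parser, since the paper does not name its parsing tool.

```python
# Illustrative only: build a symmetrically normalized adjacency matrix
# D^{-1/2} (A + I) D^{-1/2} from a syntactic dependency parse.
import numpy as np
import spacy

nlp = spacy.load("en_core_web_sm")

def dependency_adjacency(sentence):
    doc = nlp(sentence)
    n = len(doc)
    adj = np.eye(n)                      # self-loops: A + I
    for tok in doc:
        if tok.head.i != tok.i:          # undirected edge child <-> head
            adj[tok.i, tok.head.i] = 1.0
            adj[tok.head.i, tok.i] = 1.0
    deg = adj.sum(axis=1)
    d_inv_sqrt = np.diag(deg ** -0.5)    # degree normalization
    return d_inv_sqrt @ adj @ d_inv_sqrt

print(dependency_adjacency("The committee devoured the report.").round(2))
```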
Hyperparameter                               Value
Sentence length                              128
RoBERTa layers                               12
RoBERTa attention heads                      12
Multi-head attention heads                   8
Batch size                                   16
Epochs                                       3
Window range (k)                             2
Number of GCN layers                         2
Hidden dimension of the first GCN layer      512
Hidden dimension of the second GCN layer     256
Table 1  Hyperparameter Settings
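For convenience, the settings in Table 1 can be gathered into a single configuration object; the field names below are illustrative, not the paper's.

```python
# Table 1's settings as one configuration object; field names are mine.
from dataclasses import dataclass

@dataclass
class Config:
    max_seq_len: int = 128        # sentence length
    roberta_layers: int = 12
    roberta_attn_heads: int = 12
    fusion_attn_heads: int = 8    # multi-head attention heads
    batch_size: int = 16
    epochs: int = 3
    window_k: int = 2             # local window range (k)
    gcn_layers: int = 2
    gcn_hidden_1: int = 512
    gcn_hidden_2: int = 256
```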
Method    Dataset       P      R      F1
—         VUA ALLPOS    0.608  0.700  0.651
          VUA VERB      0.600  0.763  0.672
—         VUA ALLPOS    0.716  0.736  0.726
          VUA VERB      0.682  0.713  0.697
—         VUA ALLPOS    0.730  0.757  0.743
          VUA VERB      0.693  0.723  0.708
—         VUA ALLPOS    0.746  0.715  0.730
          VUA VERB      0.761  0.781  0.771
—         VUA ALLPOS    0.756  0.783  0.769
          VUA VERB      0.789  0.819  0.804
Ours      VUA ALLPOS    0.777  0.795  0.786
          VUA VERB      0.812  0.834  0.823
Table 2  Performance of Different Models on VUA Dataset
Method    Dataset         P      R      F1
—         TOEFL ALLPOS    0.709  0.697  0.703
          TOEFL VERB      0.731  0.707  0.719
—         TOEFL ALLPOS    0.695  0.735  0.715
          TOEFL VERB      0.733  0.766  0.749
Ours      TOEFL ALLPOS    0.725  0.742  0.734
          TOEFL VERB      0.760  0.764  0.762
Table 3  Performance of Different Models on TOEFL Dataset
Method      P      R      F1
hqu         —      —      0.833
faun        —      —      0.831
YNU-HPCC    —      —      0.831
MITLAB      —      —      0.827
prism       —      —      0.821
—           0.881  0.896  0.888
Ours        0.889  0.912  0.900
Table 4  Performance of Different Models on CCL Dataset
Model     Dataset       P      R      F1
Ours      VUA ALLPOS    0.777  0.795  0.786
          VUA VERB      0.812  0.834  0.823
-LOCAL    VUA ALLPOS    0.761  0.787  0.773
          VUA VERB      0.805  0.812  0.808
-GCN      VUA ALLPOS    0.740  0.767  0.753
          VUA VERB      0.764  0.813  0.785
Table 5  Results of Ablation Experiments
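The -LOCAL and -GCN rows each remove one component from the full model. A hedged sketch of how such variants might be toggled at feature-fusion time is shown below; the flag and variable names are mine, not the paper's.

```python
# Illustrative ablation switches: "-GCN" drops the graph branch,
# "-LOCAL" drops the k-window restriction on attention.
import torch

def fuse_features(h, g, local_mask=None, use_gcn=True, use_local=True):
    """h: RoBERTa features (B,S,D); g: GCN features (B,S,D')."""
    z = torch.cat([h, g], dim=-1) if use_gcn else h   # -GCN: semantics only
    mask = local_mask if use_local else None          # -LOCAL: full-sentence attention
    return z, mask
```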
Fig. 3  Influence of the Graph Convolutional Network Hidden Layer Dimension