[Objective] This paper proposes a metaphor identification model based on graph convolutional networks and the Transformer, aiming to effectively identify multi-word metaphorical expressions. [Methods] We used a graph convolutional network to extract structural information from the syntactic dependency tree. Then, we combined this structural information with deep semantic representations via the Transformer. Finally, we computed the probability that each target word is used metaphorically with a Softmax layer. [Results] Compared with existing algorithms, the F1 scores of our model increased by 1.9% and 1.7% on the VUA VERB and VUA ALL POS datasets, by 1.1% and 1.9% on TOEFL VERB and TOEFL ALL POS, and by 1.2% on the Chinese CCL dataset. [Limitations] When a sentence contains ambiguity or unclear referential information, our model cannot effectively identify its metaphorical expressions. [Conclusions] Graph convolutional networks over syntactic dependency trees enrich the semantics of target words, which improves the recognition of both single-word and multi-word metaphors.
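The pipeline described in [Methods] — a graph convolutional layer over the sentence's dependency tree, followed by a per-token Softmax over literal/metaphorical labels — can be sketched minimally as follows. This is an illustrative NumPy sketch, not the authors' implementation: all names, dimensions, and the toy dependency arcs are assumptions, the contextual embeddings are random stand-ins for encoder output, and the Transformer fusion step is omitted.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the last axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def gcn_layer(X, A, W):
    # One GCN layer with self-loops and symmetric normalization:
    # H = ReLU(D^{-1/2} (A + I) D^{-1/2} X W)
    A_hat = A + np.eye(A.shape[0])
    d_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
    return np.maximum(d_inv_sqrt @ A_hat @ d_inv_sqrt @ X @ W, 0.0)

rng = np.random.default_rng(0)
n, d_in, d_hid = 4, 8, 6                     # 4-token toy sentence

# Token embeddings (stand-in for contextual encoder output)
X = rng.standard_normal((n, d_in))

# Undirected adjacency built from hypothetical dependency arcs (head, dependent)
edges = [(1, 0), (1, 2), (2, 3)]
A = np.zeros((n, n))
for h, m in edges:
    A[h, m] = A[m, h] = 1.0

# Structure-aware token states, then per-token label probabilities
W = rng.standard_normal((d_in, d_hid))
H = gcn_layer(X, A, W)
W_out = rng.standard_normal((d_hid, 2))
probs = softmax(H @ W_out)                   # row i: P(literal), P(metaphorical)
```

Each row of `probs` is a distribution over the two labels for one token, so multi-word metaphors emerge as runs of adjacent tokens labeled metaphorical.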
[1] Aggarwal S, Singh R. Metaphor Detection Using Deep Contextualized Word Embeddings[OL]. arXiv Preprint, arXiv:2009.12565.
[2] Wu C H, Wu F Z, Chen Y B, et al. Neural Metaphor Detecting with CNN-LSTM Model[C]//Proceedings of the 2018 Workshop on Figurative Language Processing. Stroudsburg, PA, USA: Association for Computational Linguistics, 2018: 110-114.
[3] Dankers V, Malhotra K, Kudva G, et al. Being Neighbourly: Neural Metaphor Identification in Discourse[C]//Proceedings of the 2nd Workshop on Figurative Language Processing. Stroudsburg, PA, USA: Association for Computational Linguistics, 2020: 227-234.
[4] Gong H, Bhat S P, Viswanath P. Geometry of Compositionality[OL]. arXiv Preprint, arXiv:1611.09799.
[5] Leong C W B, Klebanov B B, Shutova E. A Report on the 2018 VUA Metaphor Detection Shared Task[C]//Proceedings of the 2018 Workshop on Figurative Language Processing. Stroudsburg, PA, USA: Association for Computational Linguistics, 2018: 56-66.
[6] Neuman Y, Assaf D, Cohen Y, et al. Metaphor Identification in Large Texts Corpora[J]. PLoS One, 2013, 8(4): e62343. doi: 10.1371/journal.pone.0062343
[7] Shutova E, Lin S, Korhonen A. Metaphor Identification Using Verb and Noun Clustering[C]//Proceedings of the 23rd International Conference on Computational Linguistics. Stroudsburg, PA, USA: Association for Computational Linguistics, 2010: 1002-1010.
[8] Klebanov B B, Leong C W, Flor M. Supervised Word-Level Metaphor Detection: Experiments with Concreteness and Reweighting of Examples[C]//Proceedings of the 3rd Workshop on Metaphor in NLP. Stroudsburg, PA, USA: Association for Computational Linguistics, 2015: 11-20.
[9] Gao G, Choi E, Choi Y, et al. Neural Metaphor Detection in Context[C]//Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing. Stroudsburg, PA, USA: Association for Computational Linguistics, 2018: 607-613.
[10] Mao R, Lin C H, Guerin F. End-to-End Sequential Metaphor Identification Inspired by Linguistic Theories[C]//Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics. Stroudsburg, PA, USA: Association for Computational Linguistics, 2019: 3888-3898.
[11] Pragglejaz Group. MIP: A Method for Identifying Metaphorically Used Words in Discourse[J]. Metaphor and Symbol, 2007, 22(1): 1-39. doi: 10.1080/10926480709336752
[12] Wilks Y, Fass D. The Preference Semantics Family[J]. Computers & Mathematics with Applications, 1992, 23(2-5): 205-221. doi: 10.1016/0898-1221(92)90141-4
[13] Pennington J, Socher R, Manning C. GloVe: Global Vectors for Word Representation[C]//Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP). Stroudsburg, PA, USA: Association for Computational Linguistics, 2014: 1532-1543.
[14] Peters M E, Neumann M, Iyyer M, et al. Deep Contextualized Word Representations[OL]. arXiv Preprint, arXiv:1802.05365.
[15] Su Chuandong, Huang Xiaoxi, Wang Rongbo, et al. Identifying Chinese/English Metaphors with Word Embedding and Recurrent Neural Network[J]. Data Analysis and Knowledge Discovery, 2020, 4(4): 91-99.
[16] Devlin J, Chang M, Lee K, et al. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding[OL]. arXiv Preprint, arXiv:1810.04805.
[17] Zhang Dongyu, Cui Zijuan, Li Yingxia, et al. Identifying Noun Metaphors with Transformer and BERT[J]. Data Analysis and Knowledge Discovery, 2020, 4(4): 100-108.
[18] Li S Q, Zeng J J, Zhang J H, et al. ALBERT-BiLSTM for Sequential Metaphor Detection[C]//Proceedings of the 2nd Workshop on Figurative Language Processing. Stroudsburg, PA, USA: Association for Computational Linguistics, 2020: 110-115.
[19] Gong H Y, Gupta K, Jain A, et al. IlliniMet: Illinois System for Metaphor Detection with Contextual and Linguistic Information[C]//Proceedings of the 2nd Workshop on Figurative Language Processing. Stroudsburg, PA, USA: Association for Computational Linguistics, 2020: 146-153.
[20] Liu Y H, Ott M, Goyal N, et al. RoBERTa: A Robustly Optimized BERT Pretraining Approach[OL]. arXiv Preprint, arXiv:1907.11692.
[21] Liu X, Luo Z C, Huang H Y. Jointly Multiple Events Extraction via Attention-based Graph Information Aggregation[C]//Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing. Stroudsburg, PA, USA: Association for Computational Linguistics, 2018: 1247-1256.
[22] Nguyen T, Grishman R. Graph Convolutional Networks with Argument-Aware Pooling for Event Detection[C]//Proceedings of the 2018 AAAI Conference on Artificial Intelligence. 2018: 7370-7377.
[23] Song L, Wang Z, Yu M, et al. Exploring Graph-Structured Passage Representation for Multi-Hop Reading Comprehension with Graph Neural Networks[OL]. arXiv Preprint, arXiv:1809.02040.
[24] Su C D, Fukumoto F, Huang X X, et al. DeepMet: A Reading Comprehension Paradigm for Token-Level Metaphor Detection[C]//Proceedings of the 2nd Workshop on Figurative Language Processing. Stroudsburg, PA, USA: Association for Computational Linguistics, 2020: 30-39.
[25] Vaswani A, Shazeer N, Parmar N, et al. Attention is All You Need[C]//Proceedings of the 31st Conference on Neural Information Processing Systems. 2017: 5998-6008.
[26] Leong C W B, Klebanov B B, Hamill C, et al. A Report on the 2020 VUA and TOEFL Metaphor Detection Shared Task[C]//Proceedings of the 2nd Workshop on Figurative Language Processing. Stroudsburg, PA, USA: Association for Computational Linguistics, 2020: 18-29.
[27] Zhu Jiaying, Wang Rongbo, Huang Xiaoxi, et al. Multi-Level Metaphor Detection Method Based on Bi-LSTM[J]. Journal of Dalian University of Technology, 2020, 60(2): 209-215.