Data Analysis and Knowledge Discovery  2020, Vol. 4 Issue (4): 91-99    DOI: 10.11925/infotech.2096-3467.2019.0828
Identifying Chinese / English Metaphors with Word Embedding and Recurrent Neural Network
Su Chuandong,Huang Xiaoxi(),Wang Rongbo,Chen Zhiqun,Mao Junyu,Zhu Jiaying,Pan Yuhao
Institute of Cognitive and Intelligent Computing, Hangzhou Dianzi University, Hangzhou 310018, China

[Objective] This paper proposes a method to recognize Chinese and English metaphors with word-vector combination and a recurrent neural network (RNN), aiming to identify metaphors, which are ubiquitous in natural language. [Methods] First, we mapped texts to word vectors as inputs to the neural network using a word-embedding combination algorithm. Then, we used the RNN as the encoder, with an attention mechanism and pooling as feature extractors. Finally, we utilized Softmax to calculate the probability that a text is metaphorical. [Results] On English texts, the accuracy and F1 of the proposed method improved by 11.8% and 6.3%, compared with the traditional method based on vanilla word embeddings. On the Chinese task, accuracy and F1 also improved, by 8.9% and 7.8%. [Limitations] Due to the long-distance dependency problem, our method could not effectively recognize metaphors in long texts with complex sentences. [Conclusions] The proposed model significantly improves the neural network's ability to recognize metaphors.
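The pipeline summarized in the abstract (embedding combination → RNN encoding → attention pooling → Softmax) can be sketched end to end. This is a minimal illustrative sketch, not the paper's implementation: all dimensions, weights, and the use of a single tanh layer as a stand-in for the RNN's per-step hidden states are assumptions for demonstration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    """Numerically stable softmax along the last axis."""
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# Toy sentence of 5 tokens with two source embedding spaces of equal
# dimension d (e.g. one GloVe-like and one fastText-like space).
T, d = 5, 8
emb_a = rng.standard_normal((T, d))
emb_b = rng.standard_normal((T, d))

# Embedding combination: element-wise average of the source embeddings
# (one plausible reading of the combination step; concatenation is another).
x = (emb_a + emb_b) / 2.0                  # (T, d)

# Stand-in for the RNN encoder: any sequence of hidden states, one per token.
W_enc = rng.standard_normal((d, d))
h = np.tanh(x @ W_enc)                     # (T, d)

# Additive attention pooling over the time steps.
w_att = rng.standard_normal(d)
alpha = softmax(h @ w_att)                 # (T,) attention weights, sum to 1
context = alpha @ h                        # (d,) attention-weighted sentence vector

# Softmax classifier: metaphorical vs. literal.
W_out = rng.standard_normal((d, 2))
probs = softmax(context @ W_out)           # (2,) class probabilities
```

In the full model the stand-in encoder would be a (bidirectional) LSTM, and the weights would be learned rather than random; the sketch only fixes the data flow between the four stages.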

Key words: Metaphor Recognition; Deep Learning; Word Embedding; Recurrent Neural Network
Received: 12 July 2019      Published: 01 June 2020
CLC Number: TP391
Corresponding Author: Huang Xiaoxi


Figure captions:
- Schematic Diagram of Word Embedding Vector Operation
- Metaphor Interpretation with Word Embeddings
- Schematic Diagram of Word Embedding Combination
- Architecture of Metaphor Recognizer Based on RNN
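The result tables below compare two embedding-combination modes, labelled M1 and M2, over source embeddings such as G (GloVe), F (fastText), and P (PPDB-based vectors). The exact definitions of M1 and M2 are not reproduced on this page; a common pair of choices, shown here purely as an assumption, is element-wise averaging versus concatenation of the source vectors:

```python
import numpy as np

# Toy vectors for one word in two source embedding spaces.
glove = np.array([0.2, -0.1, 0.4])   # "G" vector (illustrative values)
ppdb  = np.array([0.0,  0.3, 0.1])   # "P" vector (illustrative values)

# One plausible M1: element-wise average; the combined vector
# keeps the original dimensionality.
m1 = (glove + ppdb) / 2.0            # shape (3,)

# One plausible M2: concatenation; the combined vector's
# dimensionality is the sum of the source dimensions.
m2 = np.concatenate([glove, ppdb])   # shape (6,)
```

Averaging keeps the downstream network small, while concatenation preserves all source information at the cost of a wider input layer; the tables suggest neither mode dominates across datasets.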
Model    Embedding    Dataset   Accuracy  Precision  Recall  F1
MR       G            TroFi     65.3      56.1       88.5    67.7
MR       F            TroFi     62.0      58.5       83.6    67.3
MR       P            TroFi     63.6      54.0       85.9    66.3
EC-MR    M1(G,P)      TroFi     73.8      66.3       75.6    70.7
EC-MR    M1(G,F,P)    TroFi     71.1      63.3       73.1    67.9
EC-MR    M2(G,P)      TroFi     66.8      56.9       84.6    68.0
EC-MR    M2(G,F,P)    TroFi     64.2      54.3       88.6    67.3
SEQ      -            TroFi     73.7      68.7       76.4    72.0
MR       G            AN        84.2      76.0       95.0    84.3
MR       F            AN        80.9      79.5       77.5    78.5
MR       P            AN        83.1      76.6       90.0    82.8
EC-RNN   M1(G,P)      AN        83.1      77.8       87.5    82.4
EC-RNN   M1(G,F,P)    AN        82.0      74.0       92.5    82.2
EC-RNN   M2(G,P)      AN        84.3      75.0       97.5    84.8
EC-RNN   M2(G,F,P)    AN        86.5      86.8       82.5    84.6
SSN      -            AN        82.9      90.3       73.8    81.1
Performance of Metaphor Recognizer in English Task (%)
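As a sanity check on the tables, F1 is the harmonic mean of precision and recall. For instance, the EC-RNN M2(G,F,P) row on the AN dataset reports a precision of 86.8 and a recall of 82.5, which yields F1 ≈ 84.6 as listed:

```python
def f1(precision, recall):
    """Harmonic mean of precision and recall (both in percent)."""
    return 2 * precision * recall / (precision + recall)

# EC-RNN M2(G,F,P) on AN: P = 86.8, R = 82.5 -> F1 = 84.6 (matches the table).
print(round(f1(86.8, 82.5), 1))
```

The same check applies to any row; small discrepancies would indicate rounding in the reported figures.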
Model    Embedding    Dataset    Accuracy  Precision  Recall  F1
MR       B            TroFi_CN   58.3      50.0       96.1    65.8
MR       S            TroFi_CN   59.9      51.6       85.9    64.1
MR       W            TroFi_CN   58.8      50.3       91.4    66.4
EC-MR    M1(B,S)      TroFi_CN   61.5      52.1       94.9    67.3
EC-MR    M1(B,S,W)    TroFi_CN   61.0      51.7       96.2    67.2
EC-MR    M2(B,S)      TroFi_CN   59.9      50.1       93.6    66.1
EC-MR    M2(B,S,W)    TroFi_CN   59.4      50.7       92.3    65.5
MR       B            AN_CN      84.3      82.5       82.4    82.5
MR       S            AN_CN      77.5      71.7       82.4    76.7
MR       W            AN_CN      84.3      82.6       84.5    83.3
EC-MR    M1(B,W)      AN_CN      85.4      84.6       82.5    83.5
EC-MR    M1(B,S,W)    AN_CN      85.4      86.5       80.0    83.1
EC-MR    M2(B,W)      AN_CN      85.4      82.9       85.0    84.0
EC-MR    M2(B,S,W)    AN_CN      86.4      86.8       82.6    84.5
Performance of Metaphor Recognizer in Chinese Task (%)