Data Analysis and Knowledge Discovery  2020, Vol. 4 Issue (4): 100-108    DOI: 10.11925/infotech.2096-3467.2019.0896
Identifying Noun Metaphors with Transformer and BERT
Zhang Dongyu1, Cui Zijuan2, Li Yingxia1, Zhang Wei1, Lin Hongfei3
1 School of Software, Dalian University of Technology, Dalian 116620, China
2 International Office, Dalian University of Technology, Dalian 116024, China
3 School of Computer Science and Technology, Dalian University of Technology, Dalian 116023, China

[Objective] This paper proposes a new method to address the issues of semantic information and relationship representation, aiming to improve the recognition of noun metaphors. [Methods] First, we used the BERT model to generate word vectors, adding positional relationships among words to the semantic representation. Then, we utilized the Transformer model to extract features. Finally, we identified noun metaphors with a neural network classifier. [Results] The proposed model achieved the highest scores in accuracy (0.9000), precision (0.8964), recall (0.8858), and F1 (0.8910), improving the classification of noun metaphors on multiple key measures. [Limitations] The proposed method cannot process ancient Chinese idioms, or rare and dummy words. [Conclusions] The proposed model identifies noun metaphors more effectively than existing models based on hand-crafted features or deep learning.
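The pipeline in [Methods] — contextual word vectors enriched with positional information, Transformer self-attention for feature extraction, then a classifier — can be sketched end to end. The following is a minimal NumPy illustration only, with toy dimensions and random weights standing in for pretrained BERT; the names `positional_encoding`, `attention`, and `classify` are illustrative, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
vocab, d_model, n_classes, seq_len = 100, 16, 2, 8

# Random token embeddings stand in for BERT's contextual vectors.
E = rng.normal(size=(vocab, d_model))

def positional_encoding(n, d):
    # Sinusoidal position encoding, as in "Attention is All You Need".
    pos = np.arange(n)[:, None]
    i = np.arange(d)[None, :]
    angle = pos / np.power(10000, (2 * (i // 2)) / d)
    return np.where(i % 2 == 0, np.sin(angle), np.cos(angle))

def attention(Q, K, V):
    # softmax(Q K^T / sqrt(d_k)) V  -- single-head self-attention.
    scores = Q @ K.T / np.sqrt(Q.shape[-1])
    w = np.exp(scores - scores.max(-1, keepdims=True))
    w /= w.sum(-1, keepdims=True)
    return w @ V

def classify(token_ids, W_out):
    # Embeddings + position information, then Transformer-style features.
    x = E[token_ids] + positional_encoding(len(token_ids), d_model)
    h = attention(x, x, x)           # feature extraction
    pooled = h.mean(axis=0)          # sentence representation
    return int(np.argmax(pooled @ W_out))  # 0 = literal, 1 = metaphor

W_out = rng.normal(size=(d_model, n_classes))
pred = classify(rng.integers(0, vocab, seq_len), W_out)
```

In the actual model, the embedding table and attention layers are pretrained BERT weights fine-tuned on the metaphor corpus; the sketch only shows how the three stages compose.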

Key words: Metaphor Recognition; Noun Metaphor; Semantic Comprehension; Transformer Model; BERT
Received: 30 July 2019      Published: 01 June 2020
CLC Number: TP391
Corresponding Author: Lin Hongfei

Cite this article:

Zhang Dongyu,Cui Zijuan,Li Yingxia,Zhang Wei,Lin Hongfei. Identifying Noun Metaphors with Transformer and BERT. Data Analysis and Knowledge Discovery, 2020, 4(4): 100-108.


Figure: Noun Metaphor Identification Process of the BERT+Transformer Model
Figure: Training Process of the BERT Model
Figure: Structure of the Transformer Model
Category        Count   Proportion
Verb metaphor   2,040   46.43%
Noun metaphor   2,035   46.31%
Non-metaphor      319    7.26%
Total           4,394  100%
Table: Data Set Composition
Category        Example
Verb metaphor   知了在树上唱歌 (The cicada sings in the tree)
Noun metaphor   他像孔雀一样高傲 (He is as proud as a peacock)
Non-metaphor    对任何不屈服于美国的国家实行制裁 (Impose sanctions on any country that does not yield to the United States)
Table: Sample Data Set

              Predicted True   Predicted False
Actual True   TP               FN
Actual False  FP               TN
Table: Meaning of Symbols in the Confusion Matrix
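The evaluation metrics reported below follow directly from these four counts. A small sketch with toy counts (not taken from the paper's experiments):

```python
def metrics(tp, fn, fp, tn):
    # Standard definitions from the confusion matrix above.
    acc = (tp + tn) / (tp + fn + fp + tn)  # accuracy
    p = tp / (tp + fp)                     # precision
    r = tp / (tp + fn)                     # recall
    f1 = 2 * p * r / (p + r)               # harmonic mean of P and R
    return acc, p, r, f1

acc, p, r, f1 = metrics(tp=8, fn=2, fp=1, tn=9)
# acc = 0.85, r = 0.8, f1 = 16/19 ≈ 0.8421
```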
Model              Acc      P        R        F1
CNN                0.8709   0.8796   0.8346   0.8565
LSTM               0.8436   0.8500   0.8031   0.8259
NN                 0.7467   0.7428   0.7431   0.7478
LSTM+ATT           0.8509   0.8706   0.7952   0.8312
DBi-LSTM           0.7448   0.7430   0.7438   0.7445
CNN+SVM            0.7840   0.7812   0.7802   0.7846
Capsule            0.8781   0.8755   0.8582   0.8667
Transformer        0.8563   0.8959   0.7795   0.8336
BERT               0.8836   0.8740   0.8740   0.8740
BERT+Transformer   0.9000   0.8964   0.8858   0.8910
Table: Results of Noun Metaphor Identification