Data Analysis and Knowledge Discovery  2020, Vol. 4 Issue (4): 100-108    DOI: 10.11925/infotech.2096-3467.2019.0896
Identifying Noun Metaphors with Transformer and BERT
Zhang Dongyu1, Cui Zijuan2, Li Yingxia1, Zhang Wei1, Lin Hongfei3
1 School of Software, Dalian University of Technology, Dalian 116620, China
2 International Office, Dalian University of Technology, Dalian 116024, China
3 School of Computer Science and Technology, Dalian University of Technology, Dalian 116023, China
Abstract  

[Objective] This paper proposes a new method for representing semantic information and the relationships among words, aiming to improve the recognition of noun metaphors. [Methods] First, we used the BERT model in place of static word vectors, adding the positional relationships among words to the semantic representation. Then, we used the Transformer model to extract features. Finally, we identified noun metaphors with a neural network classifier. [Results] The proposed model achieved the highest scores for accuracy (0.9000), precision (0.8964), recall (0.8858), and F1 (0.8910), improving the classification of noun metaphors on all key metrics. [Limitations] The proposed method cannot handle ancient Chinese idioms, rare words, or function words. [Conclusions] The proposed model identifies noun metaphors more effectively than existing models based on hand-crafted features or deep learning.
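For illustration only, the following sketch shows how a pipeline like the one summarized above (BERT word representations, a Transformer encoder for feature extraction, and a neural network classifier) could be assembled. It is not the authors' code: the HuggingFace transformers and PyTorch libraries, the bert-base-chinese checkpoint, and all hyperparameters are assumptions.

```python
# A minimal sketch of a BERT+Transformer noun-metaphor classifier.
# Assumptions (not from the paper): HuggingFace `transformers`, PyTorch,
# the `bert-base-chinese` checkpoint, and the hyperparameters below.
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer

class BertTransformerClassifier(nn.Module):
    def __init__(self, num_labels=2, num_layers=2, num_heads=8):
        super().__init__()
        # BERT replaces static word vectors; its position embeddings carry
        # the positional relationships among words.
        self.bert = BertModel.from_pretrained("bert-base-chinese")
        hidden = self.bert.config.hidden_size  # 768 for bert-base
        # An additional Transformer encoder extracts task-specific features.
        layer = nn.TransformerEncoderLayer(d_model=hidden, nhead=num_heads,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)
        # A simple feed-forward layer classifies metaphor vs. non-metaphor.
        self.classifier = nn.Linear(hidden, num_labels)

    def forward(self, input_ids, attention_mask):
        states = self.bert(input_ids=input_ids,
                           attention_mask=attention_mask).last_hidden_state
        # True entries in the padding mask are ignored by self-attention.
        feats = self.encoder(states,
                             src_key_padding_mask=~attention_mask.bool())
        return self.classifier(feats[:, 0])  # classify from the [CLS] token

tokenizer = BertTokenizer.from_pretrained("bert-base-chinese")
batch = tokenizer(["他像孔雀一样高傲"], return_tensors="pt", padding=True)
logits = BertTransformerClassifier()(batch["input_ids"],
                                     batch["attention_mask"])
```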

Key words: Metaphor Recognition; Noun Metaphor; Semantic Comprehension; Transformer Model; BERT
Received: 30 July 2019      Published: 01 June 2020
Chinese Library Classification (ZTFLH): TP391
Corresponding Authors: Lin Hongfei     E-mail: hflin@dlut.edu.cn

Cite this article:

Zhang Dongyu, Cui Zijuan, Li Yingxia, Zhang Wei, Lin Hongfei. Identifying Noun Metaphors with Transformer and BERT. Data Analysis and Knowledge Discovery, 2020, 4(4): 100-108.

URL:

https://manu44.magtech.com.cn/Jwk_infotech_wk3/EN/10.11925/infotech.2096-3467.2019.0896     OR     https://manu44.magtech.com.cn/Jwk_infotech_wk3/EN/Y2020/V4/I4/100

Figure: Noun Metaphor Identification Process of the BERT+Transformer Model
Figure: Training Process of the BERT Model
Figure: Structure of the Transformer Model
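As background to the Transformer structure shown above, the core operation of the model is scaled dot-product self-attention (Vaswani et al., 2017), where Q, K, and V are the query, key, and value matrices and d_k is the key dimension:

$$
\mathrm{Attention}(Q, K, V) = \mathrm{softmax}\!\left(\frac{QK^{\top}}{\sqrt{d_k}}\right)V
$$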
Category        Count   Proportion
Verb metaphor   2,040   46.43%
Noun metaphor   2,035   46.31%
Non-metaphor      319    7.26%
Total           4,394  100.00%
Data Set Composition
Category        Example
Verb metaphor   知了在树上唱歌 (The cicada is singing in the tree)
Noun metaphor   他像孔雀一样高傲 (He is as proud as a peacock)
Non-metaphor    对任何不屈服于美国的国家实行制裁 (Sanctions are imposed on any country that does not yield to the United States)
Sample Data Set
Actual \ Predicted   True   False
True                 TP     FN
False                FP     TN
Meaning of the Symbols in the Confusion Matrix
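The four evaluation metrics reported below are computed from these entries in the standard way:

$$
\mathrm{Acc} = \frac{TP + TN}{TP + TN + FP + FN},\qquad
P = \frac{TP}{TP + FP},\qquad
R = \frac{TP}{TP + FN},\qquad
F1 = \frac{2PR}{P + R}
$$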
Model              Acc      P        R        F1
CNN                0.8709   0.8796   0.8346   0.8565
LSTM               0.8436   0.8500   0.8031   0.8259
NN                 0.7467   0.7428   0.7431   0.7478
LSTM+ATT           0.8509   0.8706   0.7952   0.8312
DBi-LSTM           0.7448   0.7430   0.7438   0.7445
CNN+SVM            0.7840   0.7812   0.7802   0.7846
Capsule            0.8781   0.8755   0.8582   0.8667
Transformer        0.8563   0.8959   0.7795   0.8336
BERT               0.8836   0.8740   0.8740   0.8740
BERT+Transformer   0.9000   0.8964   0.8858   0.8910
Results of Noun Metaphor Identification
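As a usage note, the metrics in such a comparison table can be computed from model predictions with standard tooling. A minimal sketch, assuming scikit-learn; the label arrays are hypothetical:

```python
# Scoring one model row of the table above; `y_true` and `y_pred`
# are hypothetical gold labels and model predictions.
from sklearn.metrics import (accuracy_score, precision_score,
                             recall_score, f1_score)

y_true = [1, 0, 1, 1, 0]   # 1 = noun metaphor, 0 = not
y_pred = [1, 0, 1, 0, 0]

print(f"Acc={accuracy_score(y_true, y_pred):.4f}",
      f"P={precision_score(y_true, y_pred):.4f}",
      f"R={recall_score(y_true, y_pred):.4f}",
      f"F1={f1_score(y_true, y_pred):.4f}")
```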