Identifying Noun Metaphors with Transformer and BERT

Zhang Dongyu1, Cui Zijuan2, Li Yingxia1, Zhang Wei1, Lin Hongfei3

1 School of Software, Dalian University of Technology, Dalian 116620, China
2 International Office, Dalian University of Technology, Dalian 116024, China
3 School of Computer Science and Technology, Dalian University of Technology, Dalian 116023, China
Abstract [Objective] This paper proposes a new method to improve the recognition of noun metaphors by addressing problems in representing semantic information and word relationships. [Methods] First, we used the BERT model to generate word vectors, adding positional relationships among words to the semantic representation. Then, we used the Transformer model to extract features. Finally, we identified noun metaphors with a neural network classifier. [Results] The proposed model achieved the highest scores in accuracy (0.9000), precision (0.8964), recall (0.8858), and F1 (0.8910), improving the classification of noun metaphors on multiple key metrics. [Limitations] The proposed method cannot handle ancient Chinese idioms or rare and dummy words. [Conclusions] The proposed model identifies noun metaphors more effectively than existing models based on hand-crafted features or deep learning.
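The abstract describes a three-stage pipeline: BERT-style contextual word vectors (which already encode positional relationships among words), a Transformer encoder for feature extraction, and a neural network classifier. The paper itself provides no code; the following is a minimal PyTorch sketch of that pipeline under stated assumptions. The class name, layer sizes, and mean-pooling step are all hypothetical choices for illustration, and the BERT output is stood in for by a placeholder random tensor rather than a real pretrained model.

```python
import torch
import torch.nn as nn

class MetaphorClassifier(nn.Module):
    """Sketch of the pipeline in the abstract: contextual (BERT-style) token
    embeddings -> Transformer encoder for feature extraction -> a linear
    classifier for the metaphorical-vs-literal decision."""

    def __init__(self, hidden=768, heads=8, layers=2, num_classes=2):
        super().__init__()
        enc_layer = nn.TransformerEncoderLayer(
            d_model=hidden, nhead=heads, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers=layers)
        self.classifier = nn.Linear(hidden, num_classes)

    def forward(self, bert_embeddings):
        # bert_embeddings: (batch, seq_len, hidden), e.g. the last hidden
        # states of a pretrained BERT model (position information included).
        features = self.encoder(bert_embeddings)
        # Pool over the token dimension and classify the whole sentence.
        pooled = features.mean(dim=1)
        return self.classifier(pooled)

# Placeholder standing in for real BERT output: a batch of 4 sentences,
# 16 tokens each, with BERT-base hidden size 768.
fake_bert_out = torch.randn(4, 16, 768)
logits = MetaphorClassifier()(fake_bert_out)
print(logits.shape)  # torch.Size([4, 2])
```

In practice the placeholder tensor would be replaced by the hidden states of a pretrained Chinese BERT model, and the classifier trained on labeled metaphorical and literal sentences.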
Received: 30 July 2019
Published: 01 June 2020
Corresponding Author:
Lin Hongfei
E-mail: hflin@dlut.edu.cn