1 School of Software, Dalian University of Technology, Dalian 116620, China
2 International Office, Dalian University of Technology, Dalian 116024, China
3 School of Computer Science and Technology, Dalian University of Technology, Dalian 116023, China
[Objective] This paper proposes a new method for representing semantic information and the relationships among words, aiming to improve the recognition of noun metaphors. [Methods] First, we used the BERT model to generate word representations in place of static word vectors, and added the positional relationships among words to the semantic representation. Then, we used a Transformer model to extract features. Finally, we identified noun metaphors with a neural network classifier. [Results] The proposed model achieved the highest scores in accuracy (0.900 0), precision (0.896 4), recall (0.885 8), and F1 (0.891 0), and it covered multiple key points that improve the classification of noun metaphors. [Limitations] The proposed method could not process ancient Chinese idioms, rare words, or function words. [Conclusions] The proposed model identifies noun metaphors more effectively than existing models based on hand-crafted features or deep learning.
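To make the pipeline in the abstract concrete, the following is a minimal sketch (not the authors' released code) of the described architecture: BERT embeddings replace static word vectors and already carry position information, a Transformer encoder extracts features, and a feed-forward layer classifies metaphor vs. literal. The bert-base-chinese checkpoint, layer counts, head numbers, and label encoding are illustrative assumptions, not the paper's reported settings.

import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer

class MetaphorClassifier(nn.Module):
    def __init__(self, bert_name="bert-base-chinese", num_labels=2):
        super().__init__()
        # BERT provides contextual word representations in place of
        # static word vectors; its embeddings include learned positions.
        self.bert = BertModel.from_pretrained(bert_name)
        hidden = self.bert.config.hidden_size
        # Extra Transformer encoder layers extract task-specific features
        # (depth and head count here are assumed, not the paper's values).
        layer = nn.TransformerEncoderLayer(d_model=hidden, nhead=8,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        # A simple feed-forward classifier scores metaphor vs. literal.
        self.classifier = nn.Linear(hidden, num_labels)

    def forward(self, input_ids, attention_mask):
        out = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        # Mask padded positions when re-encoding the sequence.
        feats = self.encoder(out.last_hidden_state,
                             src_key_padding_mask=~attention_mask.bool())
        # Classify from the [CLS] position's feature vector.
        return self.classifier(feats[:, 0, :])

tokenizer = BertTokenizer.from_pretrained("bert-base-chinese")
model = MetaphorClassifier()
batch = tokenizer(["这个例句可能含有隐喻。"], return_tensors="pt", padding=True)
logits = model(batch["input_ids"], batch["attention_mask"])
pred = logits.argmax(dim=-1)  # 0 = literal, 1 = metaphor (assumed labeling)

In training, such a model would typically be optimized end to end with cross-entropy loss and the Adam optimizer cited in reference [27].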
[4] Tian Jia, Su Chang, Chen Yijiang. Computational Metaphor Processing[J]. Journal of Software, 2015, 26(1): 40-51. (in Chinese)
[5] Brunner G, Liu Y, Pascual D, et al. On the Validity of Self-Attention as Explanation in Transformer Models[OL]. arXiv Preprint, arXiv:1908.04211.
[6] Khandelwal U, Clark K, Jurafsky D, et al. Sample Efficient Text Summarization Using a Single Pre-Trained Transformer[OL]. arXiv Preprint, arXiv:1905.08836.
[7] Liu J, Cohen S B, Lapata M. Discourse Representation Structure Parsing with Recurrent Neural Networks and the Transformer Model[C]// Proceedings of the 2019 IWCS Shared Task on Semantic Parsing. 2019.
[8] Yang W, Xie Y, Lin A, et al. End-to-End Open-Domain Question Answering with BERTserini[OL]. arXiv Preprint, arXiv:1902.01718.
[9] Alberti C, Lee K, Collins M. A BERT Baseline for the Natural Questions[OL]. arXiv Preprint, arXiv:1901.08634.
[10] Xu P, Ma X, Nallapati R, et al. Passage Ranking with Weak Supervision[OL]. arXiv Preprint, arXiv:1905.05910.
[11] Fass D. met*: A Method for Discriminating Metonymy and Metaphor by Computer[J]. Computational Linguistics, 1991, 17(1): 49-90.
[12] Wilks Y, Dalton A, Allen J, et al. Automatic Metaphor Detection Using Large-Scale Lexical Resources and Conventional Metaphor Extraction[C]// Proceedings of the 1st Workshop on Metaphor in NLP, Atlanta, Georgia, USA. 2013: 36-44.
[13] Jang H, Moon S, Jo Y, et al. Metaphor Detection in Discourse[C]// Proceedings of the 16th Annual Meeting of the Special Interest Group on Discourse and Dialogue. 2015: 384-392.
[14] Rai S, Chakraverty S, Tayal D K, et al. A Study on Impact of Context on Metaphor Detection[J]. The Computer Journal, 2018, 61(11): 1667-1682.
[15] Do Dinh E L, Gurevych I. Token-Level Metaphor Detection Using Neural Networks[C]// Proceedings of the 4th Workshop on Metaphor in NLP, San Diego, California, USA. 2016: 28-33.
[16] Sun S, Xie Z. BiLSTM-Based Models for Metaphor Detection[C]// Proceedings of the 2017 National CCF Conference on Natural Language Processing and Chinese Computing (NLPCC 2017), Dalian, China. 2017: 431-442.
[17] Wang Mengxiang, Rao Qi, Gu Cheng, et al. Metaphorical Knowledge Expression and Acquisition for Chinese Nouns[J]. Journal of Chinese Information Processing, 2017, 31(6): 1-9. (in Chinese)
[18] Bizzoni Y, Ghanimifard M. Bigrams and BiLSTMs: Two Neural Networks for Sequential Metaphor Detection[C]// Proceedings of the 2018 Workshop on Figurative Language Processing, Louisiana, USA. 2018: 91-101.
[19] Gao G, Choi E, Choi Y, et al. Neural Metaphor Detection in Context[OL]. arXiv Preprint, arXiv:1808.09653.
[20] Devlin J, Chang M W, Lee K, et al. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding[OL]. arXiv Preprint, arXiv:1810.04805.
[21] Vaswani A, Shazeer N, Parmar N, et al. Attention is All You Need[C]// Proceedings of the 31st International Conference on Neural Information Processing Systems. 2017: 5998-6008.
[22] Moriya S, Shibata C. Transfer Learning Method for Very Deep CNN for Text Classification and Methods for Its Evaluation[C]// Proceedings of the IEEE 42nd Annual Computer Software and Applications Conference. 2018: 153-158.
[23] Mikolov T, Chen K, Corrado G, et al. Efficient Estimation of Word Representations in Vector Space[OL]. arXiv Preprint, arXiv:1301.3781.
[24] Li C, Zhan G, Li Z. News Text Classification Based on Improved Bi-LSTM-CNN[C]// Proceedings of the 9th International Conference on Information Technology in Medicine and Education (ITME). IEEE, 2018: 890-893.
Hinton G E, Krizhevsky A, Wang S D. Transforming Auto-encoders[C]// Proceedings of the 21st International Conference on Artificial Neural Networks. Springer, 2011: 44-51.
[27] Kingma D P, Ba J. Adam: A Method for Stochastic Optimization[OL]. arXiv Preprint, arXiv:1412.6980.