Position-Aware Stepwise Tagging Method for Triples Extraction of Entity-Relationship
Wang Yuan1, Shi Kaize1,2, Niu Zhendong1,3
1. School of Computer Science & Technology, Beijing Institute of Technology, Beijing 100081, China
2. Australian Artificial Intelligence Institute, University of Technology Sydney, Sydney 2007, Australia
3. Beijing Institute of Technology Library, Beijing 100081, China
[Objective] This paper proposes a joint model for overlapping triple scenarios, aiming to extract entity-relation triples from unstructured text effectively. [Methods] We designed a position-aware stepwise tagging method. First, subject entities were identified by tagging their start and end positions. Then, the corresponding object entities were tagged under each predefined relation. Multiple kinds of position-aware information were injected into the tagging procedures. Finally, the encoded sequence was shared across the steps, and the pre-order results were fused through an attention mechanism. [Results] We evaluated the model on DuIE, a Chinese public dataset. Our method outperformed the baseline models, achieving an F1 score of 0.886, and ablation studies verified the effectiveness of each component. [Limitations] Further research is needed on the occasionally nested entities. [Conclusions] The proposed method effectively addresses triple extraction in overlapping scenarios and provides a reference for future studies.
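The stepwise decoding described in the abstract — first tagging subject start/end positions, then tagging objects under each predefined relation — can be illustrated with a minimal sketch. All function names, the span-pairing heuristic, and the toy tag sequences below are illustrative assumptions, not the paper's actual implementation (which conditions object tagging on the encoded subject representation):

```python
# Minimal sketch of stepwise start/end tagging decoding.
# Hypothetical names and toy inputs; not the paper's implementation.

def decode_spans(start_tags, end_tags):
    """Pair each start position (tag 1) with the nearest end position
    at or after it, yielding (start, end) entity spans."""
    spans = []
    for i, s in enumerate(start_tags):
        if s == 1:
            for j in range(i, len(end_tags)):
                if end_tags[j] == 1:
                    spans.append((i, j))
                    break
    return spans

def extract_triples(tokens, subj_start, subj_end, object_tags):
    """Step 1: decode subject spans from their start/end tag sequences.
    Step 2: for each subject, decode object spans under every
    predefined relation, emitting (subject, relation, object) triples."""
    triples = []
    for ss, se in decode_spans(subj_start, subj_end):
        subject = "".join(tokens[ss:se + 1])
        for relation, (obj_start, obj_end) in object_tags.items():
            for os_, oe in decode_spans(obj_start, obj_end):
                triples.append((subject, relation, "".join(tokens[os_:oe + 1])))
    return triples

tokens = ["LiBai", "wrote", "JingYeSi"]
result = extract_triples(
    tokens,
    subj_start=[1, 0, 0],
    subj_end=[1, 0, 0],
    object_tags={"author_of": ([0, 0, 1], [0, 0, 1])},
)
# result == [("LiBai", "author_of", "JingYeSi")]
```

Because each predefined relation has its own object tag sequence, one subject can yield objects under several relations, which is how this family of tagging schemes handles overlapping triples.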