Position-Aware Stepwise Tagging Method for Triples Extraction of Entity-Relationship
Wang Yuan1, Shi Kaize1,2, Niu Zhendong1,3
1 School of Computer Science & Technology, Beijing Institute of Technology, Beijing 100081, China
2 Australian Artificial Intelligence Institute, University of Technology Sydney, Sydney 2007, Australia
3 Beijing Institute of Technology Library, Beijing 100081, China
Abstract [Objective] This paper designs a joint model for overlapping-triple scenarios, aiming to effectively extract triples from unstructured text. [Methods] We designed a position-aware stepwise tagging method. First, subject entities are identified by tagging their start and end positions. Then, the corresponding objects are tagged under each predefined relation. Multiple kinds of position-aware information are added to the tagging procedures, and the encoded sequence is shared across steps, with the pre-order (subject-tagging) results fused through an attention mechanism. [Results] We evaluated the model on DuIE, a Chinese public dataset. Our method outperforms the baseline models, achieving an F1 score of 0.886, and ablation studies verify the effectiveness of the model's components. [Limitations] More research is needed on occasionally nested entities. [Conclusions] The proposed method effectively addresses triple extraction in overlapping scenarios and provides a reference for future studies.
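The stepwise scheme summarized in [Methods] can be illustrated with a short sketch. This is not the authors' released implementation: the encoder choice, module and parameter names (hidden_size, num_relations, max_len), and the additive fusion used in place of the paper's attention-based fusion are all illustrative assumptions. A minimal PyTorch sketch of the two tagging steps:

```python
import torch
import torch.nn as nn

class StepwiseTagger(nn.Module):
    """Two-step pointer tagging over a shared sentence encoding (sketch only).

    Step 1 tags subject start/end positions; step 2, conditioned on one tagged
    subject plus relative-position features, tags object start/end positions
    under every predefined relation, so overlapping triples sharing a subject
    can be recovered.
    """

    def __init__(self, hidden_size: int, num_relations: int, max_len: int = 512):
        super().__init__()
        # Step 1: binary pointers for subject start / end positions.
        self.subj_start = nn.Linear(hidden_size, 1)
        self.subj_end = nn.Linear(hidden_size, 1)
        # Position-aware feature: embedding of each token's signed distance
        # to the subject start (shifted to be non-negative).
        self.rel_pos_emb = nn.Embedding(2 * max_len + 1, hidden_size)
        # Step 2: per-relation pointers for object start / end positions.
        self.obj_start = nn.Linear(hidden_size, num_relations)
        self.obj_end = nn.Linear(hidden_size, num_relations)
        self.max_len = max_len

    def forward(self, enc: torch.Tensor, subj_span: torch.Tensor):
        # enc: (batch, seq_len, hidden) output of a shared encoder such as BERT.
        # subj_span: (batch, 2) start/end token indices of one tagged subject.
        p_subj_start = torch.sigmoid(self.subj_start(enc)).squeeze(-1)
        p_subj_end = torch.sigmoid(self.subj_end(enc)).squeeze(-1)

        batch, seq_len, _ = enc.size()
        idx = torch.arange(seq_len, device=enc.device).unsqueeze(0).expand(batch, -1)
        rel_pos = (idx - subj_span[:, :1]).clamp(-self.max_len, self.max_len) + self.max_len

        # Fuse the subject representation (pre-order result) and the
        # relative-position embedding into the shared encoding before step 2.
        subj_vec = enc[torch.arange(batch, device=enc.device), subj_span[:, 0]].unsqueeze(1)
        fused = enc + subj_vec + self.rel_pos_emb(rel_pos)

        # One start/end score per predefined relation for every token.
        p_obj_start = torch.sigmoid(self.obj_start(fused))
        p_obj_end = torch.sigmoid(self.obj_end(fused))
        return p_subj_start, p_subj_end, p_obj_start, p_obj_end
```

At inference, subject spans whose start/end probabilities exceed a threshold are paired, and step 2 is run once per candidate subject; because every relation has its own object pointers, one subject can participate in several overlapping triples, which is the overlapping-scenario case the abstract targets.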
Received: 26 March 2021
Published: 23 November 2021
Fund: National Key R&D Program of China (2019YFB1406302); National Key R&D Program of China (2019YFB1406303)
Corresponding Author: Niu Zhendong, ORCID: 0000-0002-0576-7572, E-mail: zniu@bit.edu.cn