Data Analysis and Knowledge Discovery  2023, Vol. 7 Issue (9): 114-124    DOI: 10.11925/infotech.2096-3467.2022.0918
Detecting Events with SPO Semantic and Syntactic Information
He Li, Yang Meihua, Liu Luyao
College of Science and Technology, Tianjin University of Finance and Economics, Tianjin 300222, China
Abstract  

[Objective] This paper uses SPO triples and dependency syntax to improve the performance of event detection. [Methods] We constructed EDMC3S, an event detection model that combines the semantic information of SPO triples with the type information of dependency syntactic relations. First, the model generates SPO triples and a dependency-relation-type weight matrix from the dependency syntax tree of each sentence. Then, a multi-head attention mechanism strengthens the semantic features of the SPO triples, and a self-attention mechanism distributes weights across the different dependency relation types. Third, a multi-order graph attention aggregation network extracts global syntactic and semantic features. Finally, a connection layer fuses the SPO-triple semantic features with the sentence-level global features. [Results] We evaluated the model on the ACE 2005 dataset, where it achieved better performance on the two sub-tasks of trigger word identification and event type classification. On the P, R, and F1 metrics, trigger word identification reached 80.6%, 82.4%, and 81.5%, and event type classification reached 78.7%, 80.1%, and 79.4%, respectively. [Limitations] The model still needs to be evaluated on more datasets. [Conclusions] The proposed model improves event detection on both trigger word identification and event type classification.
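To make the four steps of the abstract concrete, the following is a minimal PyTorch sketch of the described pipeline. Everything here is our own illustration, not the authors' released code: module and variable names, the stand-in neighbourhood aggregation, and the assumption of 33 ACE event subtypes plus a None class are all assumptions; only the embedding sizes, head count, relation-type count, and dropout follow the parameter table below.

```python
# Hypothetical sketch of the EDMC3S pipeline described in the abstract (PyTorch).
# All module/variable names and the aggregation details are illustrative assumptions.
import torch
import torch.nn as nn


class EDMC3SSketch(nn.Module):
    def __init__(self, vocab_size, n_entity_types, n_pos_tags, n_dep_types=95,
                 max_len=80, n_event_types=34, gat_orders=3, n_heads=3):
        super().__init__()
        # Word / NER-type / POS / position embeddings (300 + 50 + 50 + 50 = 450 dims),
        # mirroring the parameter table below.
        self.word_emb = nn.Embedding(vocab_size, 300)
        self.ent_emb = nn.Embedding(n_entity_types, 50)
        self.pos_emb = nn.Embedding(n_pos_tags, 50)
        self.position_emb = nn.Embedding(max_len, 50)
        d = 300 + 50 + 50 + 50

        # (1) Multi-head attention to strengthen SPO-triple semantics.
        self.spo_attn = nn.MultiheadAttention(d, num_heads=n_heads, batch_first=True)
        # (2) Self-attention stand-in that scores the 95 dependency relation types.
        self.dep_type_emb = nn.Embedding(n_dep_types, d)
        self.dep_scorer = nn.Linear(d, 1)
        # (3) Multi-order graph aggregation over the dependency tree
        #     (simplified stand-in: one linear map per adjacency order, attention-pooled).
        self.order_maps = nn.ModuleList([nn.Linear(d, d) for _ in range(gat_orders)])
        self.order_attn = nn.Linear(d, 1)
        # (4) Connection (concatenation) layer + trigger / event-type classifier.
        self.classifier = nn.Linear(2 * d, n_event_types)
        self.dropout = nn.Dropout(0.2)

    def forward(self, words, ents, pos_tags, positions, spo_mask, adj_orders, dep_type_ids):
        # words/ents/pos_tags/positions: (B, T); spo_mask: (B, T) bool, True on SPO tokens;
        # adj_orders: list of K adjacency matrices (B, T, T); dep_type_ids: (B, T).
        x = torch.cat([self.word_emb(words), self.ent_emb(ents),
                       self.pos_emb(pos_tags), self.position_emb(positions)], dim=-1)

        # (1) SPO semantic features: every token attends to the SPO-triple tokens only.
        spo_feat, _ = self.spo_attn(x, x, x, key_padding_mask=~spo_mask)

        # (2) Weight each word's incoming dependency edge by a learned score of its relation type.
        type_score = torch.sigmoid(self.dep_scorer(self.dep_type_emb(dep_type_ids)))  # (B, T, 1)

        # (3) Aggregate syntactic neighbours over 1st..Kth-order adjacency, then attend over orders.
        order_outs = []
        for k, adj in enumerate(adj_orders):
            order_outs.append(torch.relu(self.order_maps[k](torch.bmm(adj, x * type_score))))
        stacked = torch.stack(order_outs, dim=2)                  # (B, T, K, d)
        alpha = torch.softmax(self.order_attn(stacked), dim=2)    # (B, T, K, 1)
        global_feat = (alpha * stacked).sum(dim=2)                # (B, T, d)

        # (4) Fuse SPO semantics with global syntactic features and classify each token.
        fused = self.dropout(torch.cat([spo_feat, global_feat], dim=-1))
        return self.classifier(fused)                             # (B, T, n_event_types)
```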

Keywords: Event Detection; SPO Semantic Information; Syntactic Information; Attention Mechanism; Multi-Order Graph Attention Aggregation Networks
Received: 30 August 2022      Published: 21 March 2023
ZTFLH: TP391; G350
Fund: The National Social Science Fund of China (19CGL025)
Corresponding Author: Yang Meihua, ORCID: 0000-0001-6789-3593, E-mail: MeiHuaCandice@163.com

Cite this article:

He Li, Yang Meihua, Liu Luyao. Detecting Events with SPO Semantic and Syntactic Information. Data Analysis and Knowledge Discovery, 2023, 7(9): 114-124.

URL:

https://manu44.magtech.com.cn/Jwk_infotech_wk3/EN/10.11925/infotech.2096-3467.2022.0918     OR     https://manu44.magtech.com.cn/Jwk_infotech_wk3/EN/Y2023/V7/I9/114

Figure: EDMC3S Model Structure
Figure: SPO Triples Generation Example
Figure: Attention Aggregation of Three-Order Graph Attention Networks Module
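Complementing the pipeline sketch above, the following sketches the graph attention layer and the order-level attention aggregation that the module named in this figure appears to combine, in the spirit of the GAT and MOGANED work cited in the references. It is an illustrative assumption under our own naming, not the authors' implementation.

```python
# Minimal graph attention layer plus attention aggregation across K adjacency orders.
# Illustrative assumption only; shapes: x (B, T, d), each adjacency (B, T, T).
import torch
import torch.nn as nn
import torch.nn.functional as F


class GATLayer(nn.Module):
    """Single-order graph attention: each token attends to its k-hop neighbours."""
    def __init__(self, dim):
        super().__init__()
        self.w = nn.Linear(dim, dim, bias=False)
        self.a = nn.Linear(2 * dim, 1, bias=False)

    def forward(self, x, adj):
        h = self.w(x)
        B, T, d = h.shape
        # Pairwise attention logits e_ij = a([h_i ; h_j]).
        hi = h.unsqueeze(2).expand(B, T, T, d)
        hj = h.unsqueeze(1).expand(B, T, T, d)
        e = F.leaky_relu(self.a(torch.cat([hi, hj], dim=-1)).squeeze(-1), 0.2)  # (B, T, T)
        e = e.masked_fill(adj == 0, float("-inf"))
        alpha = torch.softmax(e, dim=-1).nan_to_num(0.0)   # rows with no edges become zeros
        return torch.relu(torch.bmm(alpha, h))


class MultiOrderAggregation(nn.Module):
    """Run one GAT layer per adjacency order and attention-pool the K outputs."""
    def __init__(self, dim, orders=3):
        super().__init__()
        self.layers = nn.ModuleList([GATLayer(dim) for _ in range(orders)])
        self.score = nn.Linear(dim, 1)

    def forward(self, x, adj_orders):
        outs = torch.stack([layer(x, adj) for layer, adj in zip(self.layers, adj_orders)],
                           dim=2)                          # (B, T, K, d)
        weights = torch.softmax(self.score(outs), dim=2)   # (B, T, K, 1)
        return (weights * outs).sum(dim=2)                 # (B, T, d)
```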
Parameter                   Value      Parameter                      Value
Word vector                 300 dims   Multi-head attention heads     3
Named entity type vector    50 dims    Maximum sentence length        80
Part-of-speech vector       50 dims    Dependency relation types      95
Position vector             50 dims    Dropout                        0.2
GAT order                   3          Learning rate                  2E-5
Model Parameter Settings
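For convenience, the settings from the table above collected into a single Python dictionary; the key names are our own.

```python
# Hyperparameters restated from the "Model Parameter Settings" table; key names are ours.
EDMC3S_HYPERPARAMS = {
    "word_embedding_dim": 300,
    "ner_type_embedding_dim": 50,
    "pos_embedding_dim": 50,
    "position_embedding_dim": 50,
    "gat_orders": 3,
    "attention_heads": 3,
    "max_sentence_length": 80,
    "dependency_relation_types": 95,
    "dropout": 0.2,
    "learning_rate": 2e-5,
}
```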
Model       Trigger Word Identification    Event Type Classification
            P      R      F1               P      R      F1
DMCNN       80.4   67.7   73.5             75.6   63.6   69.1
Bi-RNN      68.5   75.7   71.9             66.0   73.0   69.3
HNN         80.8   71.5   75.9             84.6   64.9   73.4
GCN-ED      74.9   75.6   75.2             77.9   68.8   73.1
MOGANED     78.8   76.7   77.7             79.5   72.3   75.7
GatedGCN    78.7   79.5   79.1             78.8   76.3   77.6
EDMC3S      80.6   82.4   81.5             78.7   80.1   79.4
Performance Comparison Among Different Models (%)
Model                                                Trigger Word Identification    Event Type Classification
                                                     P      R      F1               P      R      F1
MOGANED (with relation labels)                       78.1   76.5   77.3             77.1   76.3   76.7
w/o SPO triples                                      78.3   79.5   78.9             76.7   78.1   77.4
w/o SPO triples + multi-head attention module        79.6   80.8   80.2             78.4   78.6   78.5
w/o dependency relation type self-attention module   78.4   80.2   79.3             76.9   78.9   77.9
EDMC3S                                               80.6   82.4   81.5             78.7   80.1   79.4
Performance Comparison in Ablation Study (%)
Figure: Visualization of Self-Attention Results
[1] Liang Z Z, Noriega-Atala E, Morrison C, et al. Low Resource Causal Event Detection from Biomedical Literature[C]// Proceedings of the 21st Workshop on Biomedical Language Processing. PA, USA: ACL, 2022: 252-263.
[2] Wang Y, Xia N, Luo X F, et al. Global Semantics with Boundary Constraint Knowledge Graph for Chinese Financial Event Detection[C]// Proceedings of the IEEE International Conference on Big Knowledge. IEEE, 2022: 281-289.
[3] Alfalqi K, Bellaiche M. An Emergency Event Detection Ensemble Model Based on Big Data[J]. Big Data and Cognitive Computing, 2022, 6(2): Article No.42.
[4] Liu X, Luo Z C, Huang H Y. Jointly Multiple Events Extraction via Attention-Based Graph Information Aggregation[OL]. arXiv Preprint, arXiv: 1809.09078.
[5] Kipf T N, Welling M. Semi-Supervised Classification with Graph Convolutional Networks[OL]. arXiv Preprint, arXiv: 1609.02907.
[6] Yan H R, Jin X L, Meng X B, et al. Event Detection with Multi-Order Graph Convolution and Aggregated Attention[C]// Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing. PA, USA: ACL, 2019: 5766-5770.
[7] Tong M H, Xu B, Hou L, et al. Leveraging Multi-Head Attention Mechanism to Improve Event Detection[A]//Sun M, Huang X, Ji H, et al. China National Conference on Chinese Computational Linguistics[M]. Cham: Springer, 2019: 268-280.
[8] Zhang Z C, Zhang R F. Combined Self-Attention Mechanism for Biomedical Event Trigger Identification[C]// Proceedings of the IEEE International Conference on Bioinformatics and Biomedicine. IEEE, 2020: 1009-1012.
[9] Grishman R, Westbrook D, Meyers A. NYU’s English ACE 2005 System Description[J]. Journal on Satisfiability, Boolean Modeling and Computation, 2005, 51(11): 1927-1928.
[10] Ahn D. The Stages of Event Extraction[C]// Proceedings of the 2006 Workshop on Annotating and Reasoning About Time and Events. NJ, USA: ACL, 2006: 1-8.
[11] Ji H, Grishman R. Refining Event Extraction Through Cross-Document Inference[C]// Proceedings of the 46th Annual Meeting of the Association for Computational Linguistics. 2008: 254-262.
[12] Liao S S, Grishman R. Using Document Level Cross-Event Inference to Improve Event Extraction[C]// Proceedings of the 48th Annual Meeting of the Association for Computational Linguistics. New York: ACM, 2010: 789-797.
[13] Hong Y, Zhang J F, Ma B, et al. Using Cross-Entity Inference to Improve Event Extraction[C]// Proceedings of the 49th Annual Meeting of the Association for Computational Linguistics:Human Language Technologies. 2011: 1127-1136.
[14] Li Q, Ji H, Huang L. Joint Event Extraction via Structured Prediction with Global Features[C]// Proceedings of the 51st Annual Meeting of the Association for Computational Linguistics. 2013: 73-82.
[15] Nguyen T H, Grishman R. Event Detection and Domain Adaptation with Convolutional Neural Networks[C]// Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing. PA, USA: ACL, 2015: 365-371.
[16] Chen Y B, Xu L H, Liu K, et al. Event Extraction via Dynamic Multi-Pooling Convolutional Neural Networks[C]// Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing. PA, USA: ACL, 2015: 167-176.
[17] Nguyen T H, Cho K, Grishman R. Joint Event Extraction via Recurrent Neural Networks[C]// Proceedings of the 2016 Conference of the North American Chapter of the Association for Computational Linguistics:Human Language Technologies. PA, USA: ACL, 2016: 300-309.
[18] Rahul P V S S, Sahu S K, Anand A. Biomedical Event Trigger Identification Using Bidirectional Recurrent Neural Network Based Models[OL]. arXiv Preprint, arXiv: 1705.09516.
[19] Hochreiter S, Schmidhuber J. Long Short-Term Memory[J]. Neural Computation, 1997, 9(8): 1735-1780. DOI: 10.1162/neco.1997.9.8.1735.
[20] Chung J, Gulcehre C, Cho K, et al. Empirical Evaluation of Gated Recurrent Neural Networks on Sequence Modeling[OL]. arXiv Preprint, arXiv: 1412.3555.
[21] Feng X C, Qin B, Liu T. A Language-Independent Neural Network for Event Detection[J]. Science China Information Sciences, 2018, 61(9): Article No.092106.
[22] Bahdanau D, Cho K, Bengio Y. Neural Machine Translation by Jointly Learning to Align and Translate[OL]. arXiv Preprint, arXiv: 1409.0473.
[23] Vaswani A, Shazeer N, Parmar N, et al. Attention is All You Need[C]// Proceedings of the 31st International Conference on Neural Information Processing Systems. New York: ACM, 2017: 6000-6010.
[24] Li L S, Liu Y. Exploiting Argument Information to Improve Biomedical Event Trigger Identification via Recurrent Neural Networks and Supervised Attention Mechanisms[C]// Proceedings of the 2017 IEEE International Conference on Bioinformatics and Biomedicine. IEEE, 2017: 565-568.
[25] Mu X F, Xu A P. A Character-Level BiLSTM-CRF Model with Multi-Representations for Chinese Event Detection[J]. IEEE Access, 2019, 7: 146524-146532.
[26] Yu Chuanming, Lin Hongjun, Zhang Zhengang. Joint Extraction Model for Entities and Events with Multi-Task Deep Learning[J]. Data Analysis and Knowledge Discovery, 2022, 6(2/3): 117-128.
[27] Nguyen T, Grishman R. Graph Convolutional Networks with Argument-Aware Pooling for Event Detection[C]// Proceedings of the 32nd AAAI Conference on Artificial Intelligence and 30th Innovative Applications of Artificial Intelligence Conference and 8th AAAI Symposium on Educational Advances in Artificial Intelligence. 2018: 5900-5907.
[28] Guo Z J, Zhang Y, Lu W. Attention Guided Graph Convolutional Networks for Relation Extraction[OL]. arXiv Preprint, arXiv: 1906.07510.
[29] Veličković P, Cucurull G, Casanova A, et al. Graph Attention Networks[OL]. arXiv Preprint, arXiv: 1710.10903.
[30] Cui S Y, Yu B W, Liu T W, et al. Edge-Enhanced Graph Convolution Networks for Event Detection with Syntactic Relation[OL]. arXiv Preprint, arXiv: 2002.10757.
[31] Ouyang Chunping, Zou Kang, Liu Yongbin, et al. Event Detection Model Based on Integrating of Multi-Hop Relation Labels and Dependency Syntactic Structure[J]. Application Research of Computers, 2022, 39(1): 43-47.
[32] Lai V D, Nguyen T N, Nguyen T H. Event Detection: Gate Diversity and Syntactic Importance Scores for Graph Convolution Neural Networks[C]// Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing. 2020: 5405-5411.
[33] Li Q, Ji H, Huang L. Joint Event Extraction via Structured Prediction with Global Features[C]// Proceedings of the 51st Annual Meeting of the Association for Computational Linguistics. 2013: 73-82.
[34] Chen Jiali, Hong Yu, Wang Jie, et al. Combination of Dependency and Semantic Information via Gated Mechanism for Event Detection[J]. Journal of Chinese Information Processing, 2020, 34(8): 51-60.
[35] Mikolov T, Chen K, Corrado G, et al. Efficient Estimation of Word Representations in Vector Space[OL]. arXiv Preprint, arXiv: 1301.3781.
[36] Qi P, Dozat T, Zhang Y H, et al. Universal Dependency Parsing from Scratch[OL]. arXiv Preprint, arXiv: 1901.10457.
[37] Finkel J R, Grenager T, Manning C. Incorporating Non-Local Information into Information Extraction Systems by Gibbs Sampling[C]// Proceedings of the 43rd Annual Meeting on Association for Computational Linguistics. NJ, USA: ACL, 2005: 363-370.