Data Analysis and Knowledge Discovery  2021, Vol. 5 Issue (11): 80-88    DOI: 10.11925/infotech.2096-3467.2021.0347
Analyzing Implicit Discourse Relation with Single Classifier and Multi-Task Network
Wang Hong, Shu Zhan, Gao Yinquan, Tian Wenhong
School of Information and Software Engineering, University of Electronic Science and Technology of China, Chengdu 610054, China
Yangtze Delta Region Institute of University of Electronic Science and Technology of China, Huzhou 313001, China
Abstract

[Objective] This paper proposes a new method to identify implicit discourse relations based on a single classifier and a multi-task learning model. [Methods] First, we modeled implicit and explicit discourse relations jointly with the multi-task learning method. Then, we converted the four-class classification problem into two classification problems and trained a single classifier. [Results] We evaluated the new method on the HIT-CDTB data set. For the expansion and parallel relations, the F1 values reached 0.94 and 0.81 respectively, which were significant improvements among the four inter-sentence relations. [Limitations] The performance of our model could be improved with larger and more evenly distributed datasets. [Conclusions] The proposed method yields the best results on the HIT-CDTB data set. Deleting connectives adds noise to the training set and negatively affects the model's performance.
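The multi-task setup described in the abstract can be illustrated with a minimal sketch: a shared representation is trained jointly on implicit and explicit examples, while each task keeps its own classification head. This is only an illustration under assumed dimensions; the variable names, sizes, and the two-way (binary) heads reflecting the four-to-two reformulation are hypothetical, not the authors' actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: argument-pair embedding -> shared hidden -> task heads.
EMB, HID, N_REL = 64, 32, 2   # two-way heads, per the four-to-two reformulation

# Shared encoder parameters, learned jointly from implicit and explicit examples.
W_shared = rng.normal(scale=0.1, size=(EMB, HID))
# Task-specific heads: one for implicit relations, one for explicit relations.
W_implicit = rng.normal(scale=0.1, size=(HID, N_REL))
W_explicit = rng.normal(scale=0.1, size=(HID, N_REL))

def forward(x, task):
    """Shared representation, then the head for the requested task."""
    h = np.tanh(x @ W_shared)                   # shared layer
    W_head = W_implicit if task == "implicit" else W_explicit
    logits = h @ W_head
    e = np.exp(logits - logits.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)    # softmax probabilities

x = rng.normal(size=(4, EMB))                   # a batch of 4 argument-pair embeddings
p_imp = forward(x, "implicit")                  # probabilities from the implicit head
p_exp = forward(x, "explicit")                  # probabilities from the explicit head
```

During training, gradients from both tasks would update `W_shared`, which is how the explicit examples regularize the harder implicit task; only the head weights stay task-specific.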

Received: 08 April 2021      Published: 26 August 2021
 ZTFLH: TP391
Fund:Key Research and Development Program of Ministry of Science and Technology(2018AAA0103203)
Corresponding Author: Tian Wenhong, ORCID: 0000-0002-5551-9796     E-mail: tian_wenhong@uestc.edu.cn
[Figures and tables: model loss curves for $f(x) = w_1 x^2 + w_2 x + b$ and $f(x) = w_1 x + b$; Single Classifier Learning Structure; Multi-Task Learning Structure; Corpus Statistics; Recognition Results of Implicit Inter-Sentence Relations; Comparative Experiment of the RNN-Att Model; Multi-Task Network Structure; Single Classifier Comparison Experiments 1 and 2]