%A Xu Tongtong, Sun Huazhi, Ma Chunmei, Jiang Lifen, Liu Yichen %T Classification Model for Few-shot Texts Based on Bi-directional Long-term Attention Features %0 Journal Article %D 2020 %J Data Analysis and Knowledge Discovery %R 10.11925/infotech.2096-3467.2020.0206 %P 113-123 %V 4 %N 10 %U https://manu44.magtech.com.cn/Jwk_infotech_wk3/CN/abstract/article_4931.shtml %8 2020-10-25 %X

[Objective] This paper proposes a classification model for few-shot texts, aiming to address the issues of data scarcity and low generalization performance. [Methods] First, we divided the text classification task into multiple subtasks based on the episode training mechanism in meta-learning. Then, we proposed a Bi-directional Temporal Convolutional Network (Bi-TCN) to capture the long-term contextual information of the text in each subtask. Third, we developed a Bi-directional Long-term Attention Network (BLAN) to capture more discriminative features by combining Bi-TCN with a multi-head attention mechanism. Finally, we used a Neural Tensor Network to measure the correlation between the query samples and the support set of each subtask to complete few-shot text classification. [Results] We examined our model on the ARSC dataset. The classification accuracy of the model reached 86.80% in the few-shot learning setting, which was 3.68% and 1.17% higher than that of the ROBUSTTC-FSL and Induction-Network-Routing models, respectively. [Limitations] The performance of BLAN on long texts is not satisfactory. [Conclusions] BLAN overcomes the issue of data scarcity and captures comprehensive text features, which effectively improves the performance of few-shot text classification.
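
For readers who want a concrete picture of the pipeline the abstract describes, the following is a minimal PyTorch sketch: a bidirectional temporal convolutional encoder, multi-head self-attention pooling, and a Neural Tensor Network relation scorer that matches each query against a class vector pooled from the support set. Layer names, dimensions, and the toy episode below are illustrative assumptions for exposition, not the authors' released implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F


class BiTCNEncoder(nn.Module):
    """Toy bidirectional temporal convolution: causal dilated convs are run over the
    token sequence left-to-right and right-to-left, so each position sees both sides."""

    def __init__(self, dim, kernel=3, dilation=2):
        super().__init__()
        self.pad = (kernel - 1) * dilation           # left padding keeps the conv causal
        self.fwd = nn.Conv1d(dim, dim, kernel, dilation=dilation)
        self.bwd = nn.Conv1d(dim, dim, kernel, dilation=dilation)

    def _causal(self, conv, x):
        return conv(F.pad(x, (self.pad, 0)))         # pad only on the left

    def forward(self, x):
        # x: (batch, seq, dim) token embeddings
        x = x.transpose(1, 2)                        # (batch, dim, seq) for Conv1d
        fwd = self._causal(self.fwd, x)              # left-to-right context
        bwd = self._causal(self.bwd, x.flip([-1])).flip([-1])  # right-to-left context
        return torch.cat([fwd, bwd], dim=1).transpose(1, 2)    # (batch, seq, 2*dim)


class BLANFeatures(nn.Module):
    """Bi-TCN followed by multi-head self-attention, mean-pooled into one text vector."""

    def __init__(self, dim, heads=4):
        super().__init__()
        self.encoder = BiTCNEncoder(dim)
        self.attn = nn.MultiheadAttention(2 * dim, heads, batch_first=True)

    def forward(self, x):
        h = self.encoder(x)                          # (batch, seq, 2*dim)
        h, _ = self.attn(h, h, h)                    # attend over sequence positions
        return h.mean(dim=1)                         # (batch, 2*dim) text feature


class NeuralTensorScorer(nn.Module):
    """Neural Tensor Network relation score between a query vector and a class vector."""

    def __init__(self, dim, k=16):
        super().__init__()
        self.bilinear = nn.Bilinear(dim, dim, k, bias=False)   # k bilinear slices
        self.linear = nn.Linear(2 * dim, k)                    # feed-forward part
        self.out = nn.Linear(k, 1)

    def forward(self, query, class_vec):
        hidden = torch.tanh(self.bilinear(query, class_vec)
                            + self.linear(torch.cat([query, class_vec], dim=-1)))
        return self.out(hidden).squeeze(-1)          # (batch,) relation scores


if __name__ == "__main__":
    dim, n_way = 64, 2
    model, scorer = BLANFeatures(dim), NeuralTensorScorer(2 * dim)
    support = torch.randn(n_way, 5, 20, dim)         # 2 classes x 5 shots x 20 tokens
    queries = torch.randn(3, 20, dim)                # 3 query texts
    class_vecs = torch.stack([model(s).mean(0) for s in support])  # one vector per class
    q = model(queries)
    scores = torch.stack([scorer(q, c.expand_as(q)) for c in class_vecs], dim=-1)
    print(scores.argmax(dim=-1))                     # predicted class index per query

Random tensors stand in for pretrained word embeddings here; in the paper's setting each episode would be sampled from the ARSC sentiment-classification tasks and the relation scores trained against the true query labels.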