%A Yu Bengong
%A Zhu Mengdi
%T Question Classification Based on Bidirectional GRU with Hierarchical Attention and Multi-channel Convolution
%0 Journal Article
%D 2020
%J Data Analysis and Knowledge Discovery
%R 10.11925/infotech.2096-3467.2019.1292
%P 50-62
%V 4
%N 8
%U https://manu44.magtech.com.cn/Jwk_infotech_wk3/CN/abstract/article_4804.shtml
%8 2020-08-25
%X

[Objective] This paper proposes a method to extract multi-level features from question texts, aiming to better capture their semantics and address the issues facing text classification. [Methods] First, we constructed multi-channel attention feature matrices with a multi-feature attention mechanism at the word level, which enriched the semantic representation of the texts and fully utilized the interrogative-word, property and position features of the questions. Then, we convolved the new matrices to obtain phrase-level feature representations. Next, we rearranged the vector representations and fed them to a bidirectional GRU (Gated Recurrent Unit) to capture forward and backward semantic features respectively. Finally, we applied latent topic attention to strengthen the topic information in the bidirectional contextual features and generated the final text vector for classification. [Results] The accuracy rates of the proposed model on three Chinese question datasets were 93.89%, 94.47% and 94.23% respectively, which were 5.82% and 4.50% higher than those of the LSTM and CNN baselines. [Limitations] We only examined the new model on three Chinese question corpora. [Conclusions] The proposed model fully captures the semantic features of question texts and improves the performance of question classification.
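
The overall pipeline sketched in the abstract (multi-channel convolution over word-level feature matrices, a bidirectional GRU for contextual encoding, and attention pooling before the classifier) can be illustrated roughly as below. This is a minimal sketch under assumed settings, not the authors' implementation: the layer sizes, kernel widths, number of classes, and the single attention layer standing in for the paper's hierarchical and latent topic attention are all illustrative assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F

class ConvBiGRUAttention(nn.Module):
    """Illustrative sketch: multi-channel conv -> BiGRU -> attention pooling."""
    def __init__(self, vocab_size, embed_dim=128, num_channels=100,
                 kernel_sizes=(2, 3, 4), hidden_dim=128, num_classes=6):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        # Multi-channel convolutions extract phrase-level features
        # with different n-gram widths over the word-level matrix.
        self.convs = nn.ModuleList(
            [nn.Conv1d(embed_dim, num_channels, k, padding='same')
             for k in kernel_sizes]
        )
        # Bidirectional GRU captures forward and backward context.
        self.bigru = nn.GRU(num_channels * len(kernel_sizes), hidden_dim,
                            batch_first=True, bidirectional=True)
        # Simple attention pooling over time steps (a stand-in for the
        # paper's latent topic attention).
        self.attn = nn.Linear(2 * hidden_dim, 1)
        self.fc = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, token_ids):                      # (batch, seq_len)
        x = self.embedding(token_ids).transpose(1, 2)  # (batch, embed, seq)
        # Concatenate phrase-level features from all channels.
        phrase = torch.cat([F.relu(conv(x)) for conv in self.convs], dim=1)
        phrase = phrase.transpose(1, 2)                # (batch, seq, feat)
        context, _ = self.bigru(phrase)                # (batch, seq, 2*hidden)
        weights = torch.softmax(self.attn(context), dim=1)
        sentence = (weights * context).sum(dim=1)      # attention-weighted sum
        return self.fc(sentence)                       # (batch, num_classes)

# Usage example: classify a batch of two padded token-id sequences.
model = ConvBiGRUAttention(vocab_size=5000)
logits = model(torch.randint(1, 5000, (2, 20)))
print(logits.shape)  # torch.Size([2, 6])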