[Objective] This study addresses the challenges of representing long texts and uses CapsNet to improve the accuracy of Chinese text classification. [Methods] First, we proposed a representation for long texts that combines an LDA topic matrix with word vectors. Then, we constructed a Chinese text classification model based on CapsNet. Third, we evaluated the proposed model on the Sogou news corpus and the text classification corpus of Fudan University. Finally, we compared our results with those of classic models (e.g., TextCNN and DNN). [Results] The CapsNet model outperformed the other models: classification accuracy over five categories reached 89.6% on short texts and 96.9% on long texts, and the proposed model converged almost twice as fast as the CNN model. [Limitations] The computational complexity of the model is high, which limits the size of the testing corpus. [Conclusions] The proposed Chinese text representation method and the modified CapsNet model achieve better accuracy, convergence speed, and robustness than existing approaches.
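The abstract describes representing a long text by combining an LDA topic matrix with word vectors before feeding it to the classifier. A minimal sketch of one plausible way to assemble such an input matrix (the function name, dimensions, and the choice to append the topic distribution as an extra row are illustrative assumptions, not the paper's exact scheme):

```python
import numpy as np

def build_text_matrix(tokens, word_vectors, doc_topic_dist, dim=4):
    """Stack per-token word vectors and the document's LDA topic
    distribution into one input matrix (illustrative sketch)."""
    # One row per token; unknown tokens fall back to a zero vector.
    rows = [word_vectors.get(t, np.zeros(dim)) for t in tokens]
    # Append the topic distribution as a "global" row so the downstream
    # model sees both local (word) and document-level (topic) features.
    rows.append(np.asarray(doc_topic_dist, dtype=float))
    return np.vstack(rows)

# Toy example: 4-dimensional word vectors and 4 LDA topics (hypothetical).
vectors = {"capsule": np.array([0.1, 0.2, 0.3, 0.4]),
           "network": np.array([0.5, 0.1, 0.0, 0.2])}
matrix = build_text_matrix(["capsule", "network"], vectors,
                           [0.7, 0.1, 0.1, 0.1])
print(matrix.shape)  # (3, 4): two token rows plus one topic row
```

This keeps the topic dimensionality equal to the word-vector dimensionality so the rows stack into a single rectangular matrix; a real implementation would instead choose dimensions to match the capsule layer's expected input.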
(Huang Lei, Du Changshun. Application of Recurrent Neural Networks in Text Classification[J]. Journal of Beijing University of Chemical Technology: Natural Science Edition, 2017, 44(1): 98-104.)
[4] Sabour S, Frosst N, Hinton G E. Dynamic Routing Between Capsules[OL]. arXiv Preprint, arXiv:1710.09829.
[5] Salton G, Wong A, Yang C S. A Vector Space Model for Automatic Indexing[J]. Communications of the ACM, 1975, 18(11): 613-620.
doi: 10.1145/361219.361220
[6] Deerwester S, Dumais S, Furnas G W, et al. Indexing by Latent Semantic Analysis[J]. Journal of the American Society for Information Science, 1990, 41(6): 391-407.
doi: 10.1002/(ISSN)1097-4571
Blei D M, Ng A Y, Jordan M I. Latent Dirichlet Allocation[J]. Journal of Machine Learning Research, 2003, 3(2): 993-1022.
[9] Mikolov T, Chen K, Corrado G, et al. Efficient Estimation of Word Representations in Vector Space[OL]. arXiv Preprint, arXiv:1301.3781.
[10] Joachims T. Text Categorization with Support Vector Machines: Learning with Many Relevant Features[C]// Proceedings of the 10th European Conference on Machine Learning. 1998: 137-142.
[11] Kim Y. Convolutional Neural Networks for Sentence Classification[OL]. arXiv Preprint, arXiv:1408.5882.
doi: 10.3115/v1/D14-1181
[12] Kalchbrenner N, Grefenstette E, Blunsom P. A Convolutional Neural Network for Modelling Sentences[OL]. arXiv Preprint, arXiv:1404.2188.
doi: 10.3115/v1/P14-1062
[13] Liu P, Qiu X, Huang X. Recurrent Neural Network for Text Classification with Multi-Task Learning[C]// Proceedings of the 25th International Joint Conference on Artificial Intelligence. 2016: 2873-2879.
[14] Joulin A, Grave E, Bojanowski P, et al. Bag of Tricks for Efficient Text Classification[C]// Proceedings of the 15th Conference of the European Chapter of the Association for Computational Linguistics. 2016: 427-431.
(Cui Jianming, Liu Jianming, Liao Zhouyu. Research of Text Categorization Based on Support Vector Machine[J]. Computer Simulation, 2013, 30(2): 299-302.)
doi: 10.3969/j.issn.1006-9348.2013.02.069
(Li Yujian, Wang Ying, Leng Qiangkui. Two-class Text Categorization Using Nearest Subspace Search[J]. Computer Engineering and Science, 2015, 37(1): 168-172.)
doi: 10.3969/j.issn.1007-130X.2015.01.026
(Lv Chaozhen, Ji Donghong, Wu Feifei. Short Text Classification Based on LDA Feature Extension[J]. Computer Engineering and Applications, 2015, 51(4): 123-127.)
doi: 10.3778/j.issn.1002-8331.1403-0448
(Guo Dongliang, Liu Xiaoming, Zheng Qiusheng. Internet Short-text Classification Method Based on CNNs[J]. Computer and Modernization, 2017(4): 78-81.)
doi: 10.3969/j.issn.1006-2475.2017.04.016