1 School of Management, Xi'an Polytechnic University, Xi'an 710048, China; 2 School of Science, Xi'an Polytechnic University, Xi'an 710048, China; 3 School of Journalism and New Media, Xi'an Jiaotong University, Xi'an 710049, China
[Objective] This paper studies Chinese text sentiment classification based on the deep belief network (DBN), with particular attention to parameter selection and performance analysis. [Methods] Taking Chinese e-commerce reviews as the research object, we extracted unigram, bigram, POS, simple dependency label, sentiment score, and triple dependency features, and fed them into deep belief networks with different numbers of hidden layers and different input sizes to compute the sentiment classification accuracy. [Results] The triple dependency features achieved better classification performance than the other features, while the number of hidden layers had no notable effect on classification accuracy. [Limitations] The methods were not verified on other deep learning models. [Conclusions] Deep learning performs well for sentiment analysis, but how to set its parameters still needs further study.
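To make the method concrete, the sketch below shows the core building block of a deep belief network: a Restricted Boltzmann Machine trained with one-step contrastive divergence (CD-1), stacked greedily layer by layer. The layer sizes, learning rate, and toy binary feature vectors are illustrative assumptions, not the authors' actual configuration.

```python
# Minimal sketch of an RBM trained with CD-1 and greedy layer-wise
# stacking into a small DBN. Sizes and data are illustrative only.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    def __init__(self, n_visible, n_hidden, lr=0.1):
        self.W = 0.01 * rng.standard_normal((n_visible, n_hidden))
        self.b_v = np.zeros(n_visible)   # visible bias
        self.b_h = np.zeros(n_hidden)    # hidden bias
        self.lr = lr

    def hidden_probs(self, v):
        return sigmoid(v @ self.W + self.b_h)

    def cd1_step(self, v0):
        # Positive phase: hidden probabilities and a binary sample.
        h0 = self.hidden_probs(v0)
        h0_sample = (rng.random(h0.shape) < h0).astype(float)
        # Negative phase: one Gibbs step (this is the "1" in CD-1).
        v1 = sigmoid(h0_sample @ self.W.T + self.b_v)
        h1 = self.hidden_probs(v1)
        # Approximate gradient: <v h>_data - <v h>_model.
        self.W += self.lr * (v0.T @ h0 - v1.T @ h1) / len(v0)
        self.b_v += self.lr * (v0 - v1).mean(axis=0)
        self.b_h += self.lr * (h0 - h1).mean(axis=0)

# Toy binary feature vectors (e.g. unigram presence indicators).
X = (rng.random((64, 20)) < 0.3).astype(float)

# Greedy layer-wise pretraining of a 20-10-5 stack: train the first
# RBM on the data, then the second on its hidden representations.
rbm1, rbm2 = RBM(20, 10), RBM(10, 5)
for _ in range(50):
    rbm1.cd1_step(X)
H1 = rbm1.hidden_probs(X)
for _ in range(50):
    rbm2.cd1_step(H1)
features = rbm2.hidden_probs(H1)   # top-layer representation
```

In practice the pretrained stack would be topped with a classifier and fine-tuned with backpropagation on the labeled reviews; the sketch stops at the unsupervised feature-learning stage.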
Zhang Qingqing, He Xingshi, Wang Huimin, Meng Shengjun. Text Sentiment Classification Based on Deep Belief Network[J]. Data Analysis and Knowledge Discovery, 2019, 3(4): 71-79. DOI: 10.11925/infotech.2096-3467.2018.0516.