A Text Vector Representation Model Merging Multi-Granularity Information

Weimin Nie, Yongzhou Chen, Jing Ma
College of Economics and Management, Nanjing University of Aeronautics and Astronautics, Nanjing 211106, China

Abstract [Objective] This paper proposes a model that extracts semantic features from texts more comprehensively and improves how text vectors represent semantics. [Methods] We obtained word-granularity, topic-granularity and character-granularity feature vectors with convolutional neural networks, and then combined the three feature vectors through a "merging gate" mechanism to generate the final text vectors. Finally, we evaluated the model with a text classification experiment. [Results] The accuracy (92.56%), precision (92.33%), recall (92.07%) and F-score (92.20%) were 2.40%, 2.05%, 1.77% and 1.91% higher, respectively, than those of Text-CNN. [Limitations] Long-distance dependency features still need to be incorporated, and the corpus needs to be enlarged. [Conclusions] The proposed model represents text semantics more accurately.
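The abstract names a "merging gate" that fuses the word-, topic- and character-granularity feature vectors into one text vector but does not spell out its form. The sketch below shows one plausible reading, a softmax gate that learns a weight for each granularity; the dimension d, the parameter shapes W and b, and the softmax weighting itself are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

# Minimal sketch (not the authors' code) of a gated fusion of three
# granularity-specific feature vectors into a single text vector.
# Dimensions, parameter shapes and the softmax form are assumptions.

def merge_gate(word_vec, topic_vec, char_vec, W, b):
    """Fuse three equal-length feature vectors with a softmax gate.

    word_vec, topic_vec, char_vec : shape (d,), e.g. CNN-derived features
    W : shape (3, 3 * d) gate weight matrix (assumed)
    b : shape (3,) gate bias (assumed)
    """
    concat = np.concatenate([word_vec, topic_vec, char_vec])  # (3d,)
    scores = W @ concat + b                                   # one score per granularity
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                                  # softmax: weights sum to 1
    stacked = np.stack([word_vec, topic_vec, char_vec])       # (3, d)
    return weights @ stacked                                  # weighted sum -> (d,)

# Toy usage: in the paper these vectors would come from the CNN extractors;
# random values are used here only to make the sketch runnable.
d = 128
rng = np.random.default_rng(0)
vecs = [rng.standard_normal(d) for _ in range(3)]
W = 0.01 * rng.standard_normal((3, 3 * d))
b = np.zeros(3)
text_vector = merge_gate(*vecs, W, b)
print(text_vector.shape)  # (128,)
```

A gate of this kind keeps the fused vector in the same space as its inputs, which is convenient when the downstream classifier expects a fixed dimensionality.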
Received: 19 October 2018
Published: 23 October 2019