Extracting Topics and Their Relationship from College Student Mentoring
Pang Beibei, Gou Juanqiong, Mu Wenxin
School of Economics and Management, Beijing Jiaotong University, Beijing 100044, China
Abstract [Objective] This paper proposes a framework for small-scale knowledge acquisition and modeling, aiming to manage college students' deep mentoring work more effectively. [Methods] First, we used LDA to identify the topics of the collected documents and the phrases describing each topic. Second, we used concept hierarchy analysis to obtain the relations among these topics. Finally, we encoded the modeling results as an ontology for knowledge retrieval. [Results] This study refined the granularity of topic knowledge on the basis of LDA modeling, which reduced the difficulty of modeling topics and describing their relationships. [Limitations] We did not examine how the knowledge base expands as new deep mentoring documents are added. [Conclusions] The proposed framework supports the modeling and retrieval of multi-granularity knowledge from deep mentoring, such as identified problems, communication methods, and guiding skills.
Received: 18 January 2018
Published: 11 July 2018
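The concept hierarchy analysis in [Methods] can be sketched with formal concept analysis, assuming topics are objects and their descriptive terms are attributes. The toy incidence table below is hypothetical; it brute-forces all formal concepts, which is feasible only for small contexts like this one.

```python
# Illustrative sketch of formal concept analysis (FCA) over topic terms.
# The context (topics x descriptive terms) is hypothetical.
from itertools import combinations

context = {
    "academic_stress": {"study", "exam", "emotion"},
    "career_planning": {"career", "internship"},
    "exam_anxiety":    {"exam", "emotion"},
}
attrs = set().union(*context.values())

def common_objects(B):
    """Objects whose attribute sets contain all of B."""
    return {g for g, intent in context.items() if B <= intent}

def common_attrs(A):
    """Attributes shared by all objects in A (all attributes if A is empty)."""
    return set.intersection(*(context[g] for g in A)) if A else set(attrs)

# Enumerate formal concepts (extent, intent) by closing every attribute subset
concepts = set()
for r in range(len(attrs) + 1):
    for B in combinations(sorted(attrs), r):
        A = common_objects(set(B))
        closed_B = common_attrs(A)  # closure of B
        concepts.add((frozenset(A), frozenset(closed_B)))

for A, B in sorted(concepts, key=lambda c: -len(c[0])):
    print(sorted(A), "|", sorted(B))
```

The topic hierarchy then falls out of the concept lattice: one concept is a subconcept of another exactly when its extent is a subset of the other's, which yields the relations among topics that the framework encodes into the ontology.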