Data Analysis and Knowledge Discovery, 2024, Vol. 8 Issue (5): 38-45     https://doi.org/10.11925/infotech.2096-3467.2023.0099
Research Article
Learning with Dual-graph for Concept Prerequisite Discovering
Xu Guolan1,Bai Rujiang2()
1Shandong University of Technology Library, Zibo 255000, China
2School of Information Management, Shandong University of Technology, Zibo 255000, China

Abstract

[Objective] This paper fully utilizes fine-grained information, such as concept mentions in learning resources, to discover prerequisite relations more effectively. [Methods] We mine prerequisite relations with a dual-graph neural network. A concept semantic graph is built from the connections between concepts and learning resources, and a concept prerequisite graph is built from the known prerequisite relations among concepts. A graph neural network then learns concept representations over both graphs, and these representations are used to predict unknown prerequisite relations. [Results] In extensive experiments on four classic prerequisite-mining datasets, our method achieved strong results and surpassed existing methods, outperforming the second-best method by 0.059, 0.037, 0.073, and 0.042 in F1 score on the respective datasets. [Limitations] The method mines prerequisite relations well for concepts explicitly mentioned in the learning resources, but its predictive ability is weak for concepts that never appear in them. [Conclusions] The proposed dual-graph neural network fully exploits the semantic information in learning resources and improves prerequisite relation mining.

Key words: Prerequisite Discovering; Graph Neural Networks; Smart Education
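The abstract describes building two graphs: a concept semantic graph from the connections between concepts and learning resources, and a concept prerequisite graph from the known prerequisite relations among concepts. As a rough sketch of that construction step (the function name and the co-occurrence heuristic are illustrative assumptions, not the paper's exact procedure):

```python
from collections import defaultdict

def build_dual_graphs(resource_mentions, known_prerequisites):
    """Build the two graphs used by a dual-graph prerequisite miner.

    resource_mentions: dict mapping a learning-resource id to the set of
        concepts it mentions (the fine-grained mention information).
    known_prerequisites: iterable of (a, b) pairs meaning "a is a
        prerequisite of b" (the labeled training relations).
    """
    # Concept semantic graph: link concepts that co-occur in a resource
    # (one simple way to turn concept-resource connections into edges).
    semantic_edges = set()
    for concepts in resource_mentions.values():
        ordered = sorted(concepts)
        for i, a in enumerate(ordered):
            for b in ordered[i + 1:]:
                semantic_edges.add((a, b))

    # Concept prerequisite graph: directed edges from the known relations.
    prereq_edges = defaultdict(set)
    for a, b in known_prerequisites:
        prereq_edges[a].add(b)

    return semantic_edges, dict(prereq_edges)
```

A GNN would then learn concept representations over both graphs and score unseen concept pairs; concepts never mentioned in any resource get no semantic edges, which matches the limitation noted in the abstract.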
Received: 2023-02-14      Published: 2023-05-16
CLC Number: TP391
Funding: *Supported by the National Social Science Fund of China (Grant No. 21BTQ071)
Corresponding author: Bai Rujiang, ORCID: 0000-0003-3822-8484, E-mail: brj@sdut.edu.cn
Cite this article:
Xu Guolan, Bai Rujiang. Learning with Dual-graph for Concept Prerequisite Discovering. Data Analysis and Knowledge Discovery, 2024, 8(5): 38-45.
Link this article:
https://manu44.magtech.com.cn/Jwk_infotech_wk3/CN/10.11925/infotech.2096-3467.2023.0099      or      https://manu44.magtech.com.cn/Jwk_infotech_wk3/CN/Y2024/V8/I5/38
Fig.1  Overall architecture of DGPL
Fig.2  Siamese network
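Fig.2 depicts a Siamese network for scoring concept pairs. This page gives no architectural details, so the following is only a minimal illustration of the shared-weight idea; the dimensions, activation, and scoring head are all hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions; the page does not specify the architecture.
d_in, d_hid = 8, 4
W = rng.normal(size=(d_in, d_hid))   # shared ("twin") projection weights
w_out = rng.normal(size=2 * d_hid)   # scoring head over the encoded pair

def siamese_score(u, v):
    """Score an ordered concept pair (u -> v) with shared-weight encoders."""
    hu = np.tanh(u @ W)  # both branches reuse W: the Siamese property
    hv = np.tanh(v @ W)
    logit = np.concatenate([hu, hv]) @ w_out
    return 1.0 / (1.0 + np.exp(-logit))  # probability that u precedes v
```

Because the pair is concatenated in order before scoring, the model can assign different probabilities to (u, v) and (v, u), which is essential for a directed relation like prerequisites.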
Dataset            Metric  PREREQ  GAE    VGAE   CPRL   ConLearn  DGPL
MOOC DSA           P       0.492   0.294  0.269  0.641  0.790     0.781
                   R       0.462   0.715  0.657  0.619  0.700     0.822
                   F1      0.476   0.417  0.382  0.630  0.741     0.800
MOOC ML            P       0.448   0.293  0.266  0.800  0.831     0.843
                   R       0.592   0.733  0.647  0.642  0.826     0.889
                   F1      0.510   0.419  0.377  0.712  0.828     0.865
LectureBank        P       0.590   0.462  0.417  0.861  0.852     0.898
                   R       0.502   0.811  0.575  0.858  0.803     0.881
                   F1      0.543   0.589  0.484  0.860  0.826     0.889
University Course  P       0.468   0.450  0.470  0.689  0.776     0.798
                   R       0.916   0.886  0.694  0.760  0.782     0.846
                   F1      0.597   0.597  0.560  0.723  0.779     0.821
Table 1  Experimental results
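The reported F1 scores in Table 1 can be cross-checked against the precision and recall columns via F1 = 2PR/(P+R); here is that check for the DGPL column:

```python
def f1(p, r):
    """Harmonic mean of precision and recall."""
    return 2 * p * r / (p + r)

# DGPL column of Table 1: (P, R, reported F1) per dataset.
dgpl = {
    "MOOC DSA": (0.781, 0.822, 0.800),
    "MOOC ML": (0.843, 0.889, 0.865),
    "LectureBank": (0.898, 0.881, 0.889),
    "University Course": (0.798, 0.846, 0.821),
}
for name, (p, r, reported) in dgpl.items():
    # Tolerance accounts for the three-decimal rounding in the table.
    assert abs(f1(p, r) - reported) < 0.005, name
```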
Ablation (F1)                MOOC DSA        MOOC ML         LectureBank     University Course
-BERT                        0.693 (-0.107)  0.728 (-0.137)  0.803 (-0.086)  0.700 (-0.121)
-Concept semantic graph      0.755 (-0.045)  0.822 (-0.043)  0.863 (-0.026)  0.788 (-0.033)
-Concept prerequisite graph  0.732 (-0.068)  0.810 (-0.055)  0.861 (-0.028)  0.790 (-0.031)
-Siamese network             0.786 (-0.014)  0.852 (-0.013)  0.879 (-0.010)  0.805 (-0.016)
Table 2  Ablation study
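Each parenthesized delta in Table 2 should equal the ablated F1 minus the full DGPL F1 from Table 1; for example, for the -BERT row:

```python
# Full-model (DGPL) F1 from Table 1, per dataset.
full = {"MOOC DSA": 0.800, "MOOC ML": 0.865,
        "LectureBank": 0.889, "University Course": 0.821}

# Ablated F1 and reported delta from the "-BERT" row of Table 2.
minus_bert = {"MOOC DSA": (0.693, -0.107), "MOOC ML": (0.728, -0.137),
              "LectureBank": (0.803, -0.086), "University Course": (0.700, -0.121)}

for name, (ablated, delta) in minus_bert.items():
    # The reported delta must match (ablated - full) up to float noise.
    assert abs((ablated - full[name]) - delta) < 1e-6, name
```

The same check holds for the other three rows, and the ordering of the deltas shows BERT contributes the most, followed by the prerequisite graph, the semantic graph, and the Siamese network.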
Heterogeneous GNN (F1)  MOOC DSA  MOOC ML  LectureBank  University Course
R-GCN                   0.800     0.865    0.889        0.821
R-GAT                   0.803     0.865    0.885        0.822
HAN                     0.799     0.868    0.890        0.818
HGT                     0.797     0.861    0.887        0.820
Table 3  Effect of the heterogeneous graph neural network
[1] Schlichtkrull M S, Kipf T N, Bloem P, et al. Modeling Relational Data with Graph Convolutional Networks[C]// Proceedings of the Semantic Web: The 15th International Conference. 2018: 593-607.
[2] Veličković P, Cucurull G, Casanova A, et al. Graph Attention Networks[OL]. arXiv Preprint, arXiv: 1710.10903.
[3] Pan L M, Li C J, Li J Z, et al. Prerequisite Relation Learning for Concepts in MOOCs[C]// Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers). 2017: 1447-1456.
[4] Liang C, Wu Z H, Huang W Y, et al. Measuring Prerequisite Relations Among Concepts[C]// Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing. 2015: 1668-1674.
[5] Gordon J, Zhu L H, Galstyan A, et al. Modeling Concept Dependencies in a Scientific Corpus[C]// Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers). 2016: 866-875.
[6] Liang C, Ye J B, Wu Z H, et al. Recovering Concept Prerequisite Relations from University Course Dependencies[C]// Proceedings of the 31st AAAI Conference on Artificial Intelligence. 2017: 4786-4791.
[7] Roy S, Madhyastha M, Lawrence S, et al. Inferring Concept Prerequisite Relations from Online Educational Resources[C]// Proceedings of the 33rd AAAI Conference on Artificial Intelligence. 2019: 9589-9594.
[8] Li I, Fabbri A R, Tung R R, et al. What Should I Learn First: Introducing LectureBank for NLP Education and Prerequisite Chain Learning[C]// Proceedings of the 33rd AAAI Conference on Artificial Intelligence. 2019: 6674-6681.
[9] Sun H, Li Y T, Zhang Y. ConLearn: Contextual-Knowledge-Aware Concept Prerequisite Relation Learning with Graph Neural Network[C]// Proceedings of the 2022 SIAM International Conference on Data Mining. 2022: 118-126.
[10] Li I, Fabbri A, Hingmire S, et al. R-VGAE: Relational-Variational Graph Autoencoder for Unsupervised Prerequisite Chain Learning[C]// Proceedings of the 28th International Conference on Computational Linguistics. 2020: 1147-1157.
[11] Zhang J T, Lin N Z, Zhang X L, et al. Learning Concept Prerequisite Relations from Educational Data via Multi-Head Attention Variational Graph Auto-Encoders[C]// Proceedings of the 15th International Conference on Web Search and Data Mining. 2022: 1377-1385.
[12] Jia C H, Shen Y L, Tang Y C, et al. Heterogeneous Graph Neural Networks for Concept Prerequisite Relation Learning in Educational Data[C]// Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. 2021: 2036-2047.
[13] Devlin J, Chang M W, Lee K, et al. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding[C]// Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers). 2019: 4171-4186.
[14] Liu Y H, Ott M, Goyal N, et al. RoBERTa: A Robustly Optimized BERT Pretraining Approach[OL]. arXiv Preprint, arXiv: 1907.11692.
[15] Artetxe M, Labaka G, Agirre E. Unsupervised Statistical Machine Translation[C]// Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing. 2018: 3632-3642.
[16] Busbridge D, Sherburn D, Cavallo P, et al. Relational Graph Attention Networks[OL]. arXiv Preprint, arXiv: 1904.05811.
[17] Wang X, Ji H Y, Shi C, et al. Heterogeneous Graph Attention Network[C]// Proceedings of WWW '19: The World Wide Web Conference. 2019: 2022-2032.
[18] Hu Z N, Dong Y X, Wang K S, et al. Heterogeneous Graph Transformer[C]// Proceedings of WWW '20: The World Wide Web Conference. 2020: 2704-2710.