Data Analysis and Knowledge Discovery  2024, Vol. 8 Issue (5): 38-45    DOI: 10.11925/infotech.2096-3467.2023.0099
Learning with Dual-graph for Concept Prerequisite Discovering
Xu Guolan1, Bai Rujiang2
1Shandong University of Technology Library, Zibo 255000, China
2School of Information Management, Shandong University of Technology, Zibo 255000, China
Abstract  

[Objective] This paper fully exploits fine-grained information, such as the mentions of concepts in learning resources, to identify prerequisite relationships among concepts more effectively. [Methods] We mine prerequisite relationships with a dual-graph neural network. First, we construct a concept semantic graph and a concept prerequisite graph from the connections between learning resources and concepts. Then, we obtain concept representations with a graph neural network and predict unknown prerequisite relationships from pairs of these representations. [Results] We evaluated our model extensively on four classic prerequisite-relationship mining datasets. Our method achieved promising results, surpassing existing methods: it outperformed the second-best method by 0.059, 0.037, 0.073, and 0.042 in F1 score on the respective datasets. [Limitations] The method shows weak predictive ability for concepts that do not appear in the learning resources. [Conclusions] The proposed dual-graph neural network can effectively leverage semantic information in learning resources to enhance prerequisite-relationship mining.
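As a rough illustration of the dual-graph idea described above, the sketch below runs one relational graph-convolution layer (in the spirit of R-GCN) over two concept graphs, a semantic graph and a prerequisite graph. All names, dimensions, and weights here are hypothetical toy values; this is a minimal sketch of the technique, not the authors' implementation.

```python
import numpy as np

def normalize(adj):
    """Row-normalize an adjacency matrix after adding self-loops."""
    adj = adj + np.eye(adj.shape[0])
    return adj / adj.sum(axis=1, keepdims=True)

def dual_graph_layer(x, adj_sem, adj_pre, w_sem, w_pre, w_self):
    """One relational graph-convolution layer over two concept graphs:
    messages from the semantic graph and the prerequisite graph are
    aggregated with relation-specific weights (as in R-GCN) and summed
    with a self-loop term, then passed through a ReLU."""
    h = (normalize(adj_sem) @ x @ w_sem
         + normalize(adj_pre) @ x @ w_pre
         + x @ w_self)
    return np.maximum(h, 0.0)

# Toy example: 4 concepts with 8-dim features (e.g. pooled text embeddings).
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
adj_sem = np.array([[0, 1, 1, 0], [1, 0, 0, 1],
                    [1, 0, 0, 1], [0, 1, 1, 0]], dtype=float)  # undirected
adj_pre = np.array([[0, 1, 0, 0], [0, 0, 1, 0],
                    [0, 0, 0, 1], [0, 0, 0, 0]], dtype=float)  # directed
w_sem, w_pre, w_self = (rng.normal(size=(8, 8)) for _ in range(3))
h = dual_graph_layer(x, adj_sem, adj_pre, w_sem, w_pre, w_self)
print(h.shape)  # (4, 8)
```

Stacking such layers lets each concept's representation mix information from both graph views before pair scoring.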

Key words: Prerequisite Discovering; Graph Neural Networks; Smart Education
Received: 14 February 2023      Published: 16 May 2023
CLC Number: TP391
Fund: National Social Science Fund of China (21BTQ071)
Corresponding Author: Bai Rujiang, ORCID: 0000-0003-3822-8484, E-mail: brj@sdut.edu.cn.

Cite this article:

Xu Guolan, Bai Rujiang. Learning with Dual-graph for Concept Prerequisite Discovering. Data Analysis and Knowledge Discovery, 2024, 8(5): 38-45.

URL:

https://manu44.magtech.com.cn/Jwk_infotech_wk3/EN/10.11925/infotech.2096-3467.2023.0099     OR     https://manu44.magtech.com.cn/Jwk_infotech_wk3/EN/Y2024/V8/I5/38

[Figure: The Overview of DGPL]
[Figure: Siamese Network]
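The Siamese component scores an ordered concept pair with a shared encoder: the same weights encode both concepts, and the two encodings are concatenated in order so the score is asymmetric, as a directed prerequisite relation requires. A minimal numpy sketch of this idea (hypothetical names and dimensions, not the paper's implementation):

```python
import numpy as np

def encode(v, w, b):
    """Shared encoder applied to both inputs (the 'Siamese' part:
    the same weights w, b are reused for each concept)."""
    return np.tanh(v @ w + b)

def prerequisite_score(a, b, w, bias, w_out):
    """Probability that concept a is a prerequisite of concept b.
    Concatenating in order (a; b) keeps the score direction-sensitive."""
    ha, hb = encode(a, w, bias), encode(b, w, bias)
    z = np.concatenate([ha, hb]) @ w_out
    return 1.0 / (1.0 + np.exp(-z))  # sigmoid

# Toy example with random 8-dim concept representations.
rng = np.random.default_rng(1)
ca, cb = rng.normal(size=8), rng.normal(size=8)
w, bias = rng.normal(size=(8, 8)), np.zeros(8)
w_out = rng.normal(size=16)
p_ab = prerequisite_score(ca, cb, w, bias, w_out)
p_ba = prerequisite_score(cb, ca, w, bias, w_out)
print(p_ab, p_ba)  # the two directions generally give different scores
```

Weight sharing keeps the pair scorer small and ensures both concepts are embedded in the same space before comparison.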
| Dataset | Metric | PREREQ | GAE | VGAE | CPRL | ConLearn | DGPL |
|---|---|---|---|---|---|---|---|
| MOOC DSA | P | 0.492 | 0.294 | 0.269 | 0.641 | 0.790 | 0.781 |
| | R | 0.462 | 0.715 | 0.657 | 0.619 | 0.700 | 0.822 |
| | F1 | 0.476 | 0.417 | 0.382 | 0.630 | 0.741 | 0.800 |
| MOOC ML | P | 0.448 | 0.293 | 0.266 | 0.800 | 0.831 | 0.843 |
| | R | 0.592 | 0.733 | 0.647 | 0.642 | 0.826 | 0.889 |
| | F1 | 0.510 | 0.419 | 0.377 | 0.712 | 0.828 | 0.865 |
| LectureBank | P | 0.590 | 0.462 | 0.417 | 0.861 | 0.852 | 0.898 |
| | R | 0.502 | 0.811 | 0.575 | 0.858 | 0.803 | 0.881 |
| | F1 | 0.543 | 0.589 | 0.484 | 0.860 | 0.826 | 0.889 |
| University Course | P | 0.468 | 0.450 | 0.470 | 0.689 | 0.776 | 0.798 |
| | R | 0.916 | 0.886 | 0.694 | 0.760 | 0.782 | 0.846 |
| | F1 | 0.597 | 0.597 | 0.560 | 0.723 | 0.779 | 0.821 |
Experimental Results
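The F1 values in the table are the harmonic mean of precision (P) and recall (R), which can be checked directly, for example for DGPL on MOOC DSA:

```python
def f1(p, r):
    """Harmonic mean of precision and recall."""
    return 2 * p * r / (p + r)

# DGPL on MOOC DSA: P = 0.781, R = 0.822 (from the table above)
print(round(f1(0.781, 0.822), 3))  # 0.801, matching the reported 0.800 up to rounding
```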
| Ablation | MOOC DSA | MOOC ML | LectureBank | University Course |
|---|---|---|---|---|
| -BERT | 0.693 (-0.107) | 0.728 (-0.137) | 0.803 (-0.086) | 0.700 (-0.121) |
| -Concept Semantic Graph | 0.755 (-0.045) | 0.822 (-0.043) | 0.863 (-0.026) | 0.788 (-0.033) |
| -Concept Prerequisite Graph | 0.732 (-0.068) | 0.810 (-0.055) | 0.861 (-0.028) | 0.790 (-0.031) |
| -Siamese Network | 0.786 (-0.014) | 0.852 (-0.013) | 0.879 (-0.010) | 0.805 (-0.016) |
Ablation Experiment (F1 score; parentheses give the drop relative to the full DGPL model)
| Heterogeneous GNN | MOOC DSA | MOOC ML | LectureBank | University Course |
|---|---|---|---|---|
| R-GCN | 0.800 | 0.865 | 0.889 | 0.821 |
| R-GAT | 0.803 | 0.865 | 0.885 | 0.822 |
| HAN | 0.799 | 0.868 | 0.890 | 0.818 |
| HGT | 0.797 | 0.861 | 0.887 | 0.820 |
Influence of Heterogeneous Graph Neural Networks (F1 score)
[1] Schlichtkrull M S, Kipf T N, Bloem P, et al. Modeling Relational Data with Graph Convolutional Networks[C]// Proceedings of the Semantic Web: The 15th International Conference. 2018: 593-607.
[2] Veličković P, Cucurull G, Casanova A, et al. Graph Attention Networks [OL]. arXiv Preprint, arXiv: 1710.10903.
[3] Pan L M, Li C J, Li J Z, et al. Prerequisite Relation Learning for Concepts in MOOCs[C]// Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers). 2017: 1447-1456.
[4] Liang C, Wu Z H, Huang W Y, et al. Measuring Prerequisite Relations Among Concepts[C]// Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing. 2015: 1668-1674.
[5] Gordon J, Zhu L H, Galstyan A, et al. Modeling Concept Dependencies in a Scientific Corpus[C]// Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers). 2016: 866-875.
[6] Liang C, Ye J B, Wu Z H, et al. Recovering Concept Prerequisite Relations from University Course Dependencies[C]// Proceedings of the 31st AAAI Conference on Artificial Intelligence. 2017: 4786-4791.
[7] Roy S, Madhyastha M, Lawrence S, et al. Inferring Concept Prerequisite Relations from Online Educational Resources[C]// Proceedings of the 33rd AAAI Conference on Artificial Intelligence. 2019: 9589-9594.
[8] Li I, Fabbri A R, Tung R R, et al. What Should I Learn First: Introducing LectureBank for NLP Education and Prerequisite Chain Learning[C]// Proceedings of the 33rd AAAI Conference on Artificial Intelligence. 2019: 6674-6681.
[9] Sun H, Li Y T, Zhang Y. ConLearn: Contextual-Knowledge-Aware Concept Prerequisite Relation Learning with Graph Neural Network[C]// Proceedings of the 2022 SIAM International Conference on Data Mining. 2022: 118-126.
[10] Li I, Fabbri A, Hingmire S, et al. R-VGAE: Relational-Variational Graph Autoencoder for Unsupervised Prerequisite Chain Learning[C]// Proceedings of the 28th International Conference on Computational Linguistics. 2020: 1147-1157.
[11] Zhang J T, Lin N Z, Zhang X L, et al. Learning Concept Prerequisite Relations from Educational Data via Multi-Head Attention Variational Graph Auto-Encoders[C]// Proceedings of the 15th International Conference on Web Search and Data Mining. 2022: 1377-1385.
[12] Jia C H, Shen Y L, Tang Y C, et al. Heterogeneous Graph Neural Networks for Concept Prerequisite Relation Learning in Educational Data[C]// Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. 2021: 2036-2047.
[13] Devlin J, Chang M W, Lee K, et al. BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding[C]// Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers). 2019: 4171-4186.
[14] Liu Y H, Ott M, Goyal N, et al. RoBERTa: A Robustly Optimized BERT Pretraining Approach [OL]. arXiv Preprint, arXiv: 1907.11692.
[15] Artetxe M, Labaka G, Agirre E. Unsupervised Statistical Machine Translation[C]// Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing. 2018: 3632-3642.
[16] Busbridge D, Sherburn D, Cavallo P, et al. Relational Graph Attention Networks [OL]. arXiv Preprint, arXiv: 1904.05811.
[17] Wang X, Ji H Y, Shi C, et al. Heterogeneous Graph Attention Network[C]// Proceedings of WWW '19: The World Wide Web Conference. 2019: 2022-2032.
[18] Hu Z N, Dong Y X, Wang K S, et al. Heterogeneous Graph Transformer[C]// Proceedings of WWW '20: The World Wide Web Conference. 2020: 2704-2710.