Learning with Dual-graph for Concept Prerequisite Discovering

Xu Guolan1, Bai Rujiang2
1Shandong University of Technology Library, Zibo 255000, China
2School of Information Management, Shandong University of Technology, Zibo 255000, China
|
|
Abstract [Objective] This paper fully utilizes fine-grained information, such as mentions of concepts in learning resources, to identify prerequisite relationships more effectively. [Methods] First, we explored prerequisite relationships with a dual-graph neural network. Second, we constructed a concept semantic graph and a concept prerequisite graph based on the connections between learning resources and concepts. Third, we obtained concept representations with a graph neural network and predicted unknown prerequisite relationships. [Results] We extensively evaluated our model on four classic prerequisite relationship mining datasets. Our method achieved promising results, surpassing existing methods: it outperformed the second-best method by 0.059, 0.037, 0.073, and 0.042 in F1 score on the four datasets. [Limitations] The method shows weak predictive ability for concepts that do not appear in the learning resources. [Conclusions] The proposed dual-graph neural network can effectively leverage semantic information in learning resources to enhance prerequisite relationship mining.
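The [Methods] pipeline above — encoding concepts on two graphs (a semantic graph and a prerequisite graph), then scoring concept pairs — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the mean-aggregation convolution, the toy graphs, the identity weights, and all names (`gcn_layer`, `dual_graph_embed`, `prereq_score`) are assumptions made for the example.

```python
import math

def gcn_layer(adj, feats, weight):
    """One graph-convolution step: mean-aggregate neighbours (with a
    self-loop), apply a linear projection, then ReLU."""
    n, d_in, d_out = len(adj), len(feats[0]), len(weight[0])
    out = []
    for i in range(n):
        nbrs = [j for j in range(n) if adj[i][j] or j == i]
        agg = [sum(feats[j][k] for j in nbrs) / len(nbrs) for k in range(d_in)]
        out.append([max(0.0, sum(agg[k] * weight[k][c] for k in range(d_in)))
                    for c in range(d_out)])
    return out

def dual_graph_embed(a_sem, a_pre, feats, w_sem, w_pre):
    """Encode each concept on both graphs and concatenate the two views."""
    h_sem = gcn_layer(a_sem, feats, w_sem)   # semantic-graph view
    h_pre = gcn_layer(a_pre, feats, w_pre)   # prerequisite-graph view
    return [hs + hp for hs, hp in zip(h_sem, h_pre)]

def prereq_score(h, u, v):
    """Sigmoid of a dot product: score that concept u precedes concept v."""
    z = sum(a * b for a, b in zip(h[u], h[v]))
    return 1.0 / (1.0 + math.exp(-z))

# toy example: 3 concepts with 2-dim input features
a_sem = [[0, 1, 0], [1, 0, 1], [0, 1, 0]]   # undirected co-mention links
a_pre = [[0, 1, 1], [0, 0, 1], [0, 0, 0]]   # known directed prerequisites
feats = [[1.0, 0.0], [0.5, 0.5], [0.0, 1.0]]
w = [[1.0, 0.0], [0.0, 1.0]]                # identity weights for both views

h = dual_graph_embed(a_sem, a_pre, feats, w, w)
print(round(prereq_score(h, 0, 2), 3))      # → 0.706
```

In the paper's setting the weights would be trained against the labeled prerequisite pairs; here they are fixed so the sketch stays self-contained.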
|
Received: 14 February 2023
Published: 16 May 2023
|
|
Fund: National Social Science Fund of China (21BTQ071)
Corresponding Author:
Bai Rujiang, ORCID: 0000-0003-3822-8484, E-mail: brj@sdut.edu.cn.
|