Bi-CLKT: Bi-Graph Contrastive Learning based Knowledge Tracing

01/22/2022
by XiangYu Song, et al.

The goal of Knowledge Tracing (KT) is to estimate how well students have mastered a concept based on their historical learning of related exercises. The benefit of knowledge tracing is that students' learning plans can be better organised and adjusted, and interventions can be made when necessary. With the recent rise of deep learning, Deep Knowledge Tracing (DKT) has utilised Recurrent Neural Networks (RNNs) to accomplish this task with some success. Other works have attempted to introduce Graph Neural Networks (GNNs) and redefine the task accordingly to achieve significant improvements. However, these efforts suffer from at least one of the following drawbacks: 1) they pay too much attention to details of the nodes rather than to high-level semantic information; 2) they struggle to effectively establish spatial associations and complex structures of the nodes; and 3) they represent either concepts or exercises only, without integrating them. Inspired by recent advances in self-supervised learning, we propose Bi-Graph Contrastive Learning based Knowledge Tracing (Bi-CLKT) to address these limitations. Specifically, we design a two-layer contrastive learning scheme based on "exercise-to-exercise" (E2E) relational subgraphs: node-level contrastive learning on the subgraphs yields discriminative representations of exercises, while graph-level contrastive learning yields discriminative representations of concepts. Moreover, we design a joint contrastive loss to obtain better representations and hence better prediction performance. We also explore two variants that use an RNN and a memory-augmented neural network, respectively, as the prediction layer, to compare how well each captures exercise and concept representations. Extensive experiments on four real-world datasets show that the proposed Bi-CLKT and its variants outperform other baseline models.
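To make the two-level scheme concrete, the sketch below combines a node-level (exercise) term and a graph-level (concept) term into a single joint contrastive objective using a standard InfoNCE formulation. This is a minimal sketch, not the authors' implementation: the encoder outputs, the temperature, and the weighting coefficient alpha are illustrative assumptions.

# Minimal sketch of a joint contrastive objective in the spirit of Bi-CLKT.
# Not the paper's code: temperatures, the weighting `alpha`, and the random
# embeddings standing in for encoder outputs are illustrative assumptions.
import torch
import torch.nn.functional as F


def info_nce(z1: torch.Tensor, z2: torch.Tensor, temperature: float = 0.5) -> torch.Tensor:
    """InfoNCE loss between two views; row i of z1 and z2 form a positive pair."""
    z1, z2 = F.normalize(z1, dim=-1), F.normalize(z2, dim=-1)
    logits = z1 @ z2.t() / temperature                      # pairwise cosine similarities
    labels = torch.arange(z1.size(0), device=z1.device)     # positives lie on the diagonal
    return F.cross_entropy(logits, labels)


def joint_contrastive_loss(exercise_v1, exercise_v2, concept_v1, concept_v2,
                           alpha: float = 0.5) -> torch.Tensor:
    """Weighted sum of the node-level (exercise) and graph-level (concept) terms."""
    node_loss = info_nce(exercise_v1, exercise_v2)    # node-level contrast on E2E subgraph nodes
    graph_loss = info_nce(concept_v1, concept_v2)     # graph-level contrast on subgraph readouts
    return alpha * node_loss + (1.0 - alpha) * graph_loss


if __name__ == "__main__":
    # Random embeddings stand in for two augmented views of the E2E subgraph
    # (per-node exercise embeddings) and their pooled concept-level readouts.
    ex1, ex2 = torch.randn(128, 64), torch.randn(128, 64)
    co1, co2 = torch.randn(32, 64), torch.randn(32, 64)
    print(joint_contrastive_loss(ex1, ex2, co1, co2).item())

In practice the two views would come from augmented copies of the E2E relational subgraph, with exercise embeddings taken per node and concept embeddings obtained by a subgraph readout; the abstract does not specify the augmentations, so those details are left out here.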
