
SAINT+: Integrating Temporal Features for EdNet Correctness Prediction

by Dongmin Shin, et al.

We propose SAINT+, a successor of SAINT, a Transformer-based knowledge tracing model that processes exercise information and student response information separately. Following the architecture of SAINT, SAINT+ has an encoder-decoder structure: the encoder applies self-attention layers to a stream of exercise embeddings, and the decoder alternately applies self-attention layers and encoder-decoder attention layers to the stream of response embeddings and the encoder output. In addition, SAINT+ incorporates two temporal feature embeddings into the response embeddings: elapsed time, the time a student takes to answer, and lag time, the time interval between adjacent learning activities. We empirically evaluate the effectiveness of SAINT+ on EdNet, the largest publicly available benchmark dataset in the education domain. Experimental results show that SAINT+ achieves state-of-the-art performance in knowledge tracing, improving the area under the receiver operating characteristic curve (AUC) by 1.25% over SAINT, the previous state-of-the-art model on the EdNet dataset.
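The encoder-decoder design described above can be sketched in PyTorch. This is a minimal illustration, not the authors' implementation: the class name, embedding bucket counts, and hyperparameters are assumptions, continuous elapsed/lag times are assumed to be pre-bucketed into integer indices, and the sketch uses a standard `nn.Transformer` rather than reproducing the exact layer-alternation pattern of the paper.

```python
import torch
import torch.nn as nn

class SAINTPlusSketch(nn.Module):
    """Illustrative SAINT+-style model: exercises feed the encoder;
    responses enriched with elapsed-time and lag-time embeddings feed the decoder."""

    def __init__(self, n_exercises=100, n_time_buckets=300, d_model=64,
                 n_heads=4, n_layers=2, max_len=50):
        super().__init__()
        self.exercise_emb = nn.Embedding(n_exercises, d_model)
        self.response_emb = nn.Embedding(2, d_model)               # correct / incorrect
        self.elapsed_emb = nn.Embedding(n_time_buckets, d_model)   # time taken to answer (bucketed)
        self.lag_emb = nn.Embedding(n_time_buckets, d_model)       # gap between activities (bucketed)
        self.pos_emb = nn.Embedding(max_len, d_model)
        self.transformer = nn.Transformer(
            d_model=d_model, nhead=n_heads,
            num_encoder_layers=n_layers, num_decoder_layers=n_layers,
            dim_feedforward=4 * d_model, batch_first=True)
        self.out = nn.Linear(d_model, 1)

    def forward(self, exercises, responses, elapsed, lag):
        B, T = exercises.shape
        pos = torch.arange(T, device=exercises.device).unsqueeze(0)
        # Encoder input: stream of exercise embeddings.
        src = self.exercise_emb(exercises) + self.pos_emb(pos)
        # Decoder input: response embeddings plus the two temporal feature embeddings.
        tgt = (self.response_emb(responses) + self.elapsed_emb(elapsed)
               + self.lag_emb(lag) + self.pos_emb(pos))
        # Causal mask: position i may only attend to positions <= i.
        mask = nn.Transformer.generate_square_subsequent_mask(T)
        h = self.transformer(src, tgt, src_mask=mask, tgt_mask=mask, memory_mask=mask)
        return torch.sigmoid(self.out(h)).squeeze(-1)  # P(correct) per position
```

A forward pass over a batch of learning-activity sequences then yields one correctness probability per time step, which is what the AUC metric in the abstract is computed over.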
