MOOCRep: A Unified Pre-trained Embedding of MOOC Entities

07/12/2021
by Shalini Pandey, et al.

Many machine learning models have been built to tackle information overload on Massive Open Online Course (MOOC) platforms. These models rely on learning powerful representations of MOOC entities, but they suffer from a scarcity of expert-labeled data. To overcome this problem, we propose to learn pre-trained representations of MOOC entities using the abundant unlabeled data available in the structure of MOOCs, representations that can be applied directly to downstream tasks. While existing pre-training methods have been successful in NLP because they learn powerful textual representations, their models do not leverage the richer information available about MOOC entities: the graph of relationships among lectures, concepts, and courses, and domain knowledge about the complexity of each concept. We develop MOOCRep, a novel method based on a Transformer language model trained with two pre-training objectives: 1) a graph-based objective that captures the signal of the entities and relations in the graph, and 2) a domain-oriented objective that incorporates the complexity level of concepts. Our experiments reveal that MOOCRep's embeddings outperform state-of-the-art representation learning methods on two tasks important to the education community: concept prerequisite prediction and lecture recommendation.
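The abstract does not include code, but the two pre-training objectives it describes can be pictured as a single joint loss over Transformer entity embeddings. Below is a minimal PyTorch-style sketch of that idea; every name here (`MOOCRepSketch`, `edge_scorer`, `complexity_head`, `alpha`) is an illustrative assumption, not the authors' implementation.

```python
# Sketch only: a graph-based edge objective plus a domain-oriented
# concept-complexity objective on top of Transformer entity embeddings.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MOOCRepSketch(nn.Module):
    def __init__(self, num_entities: int, dim: int = 128, num_heads: int = 4):
        super().__init__()
        self.embed = nn.Embedding(num_entities, dim)
        layer = nn.TransformerEncoderLayer(dim, num_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        # Scores an entity pair from the MOOC graph (lecture/concept/course).
        self.edge_scorer = nn.Bilinear(dim, dim, 1)
        # Predicts a scalar complexity level for a concept (domain signal).
        self.complexity_head = nn.Linear(dim, 1)

    def forward(self, entity_ids: torch.Tensor) -> torch.Tensor:
        # entity_ids: (batch, seq_len) -> contextual embeddings per entity.
        return self.encoder(self.embed(entity_ids))

def pretraining_loss(model, entity_ids, pos_edges, neg_edges,
                     concept_idx, complexity, alpha: float = 0.5):
    """Joint loss: classify real vs. negatively sampled graph edges, and
    regress concept complexity. `alpha` (assumed) balances the two terms."""
    h = model(entity_ids).reshape(-1, model.embed.embedding_dim)
    pos = model.edge_scorer(h[pos_edges[:, 0]], h[pos_edges[:, 1]])
    neg = model.edge_scorer(h[neg_edges[:, 0]], h[neg_edges[:, 1]])
    graph_loss = F.binary_cross_entropy_with_logits(
        torch.cat([pos, neg]),
        torch.cat([torch.ones_like(pos), torch.zeros_like(neg)]),
    )
    pred = model.complexity_head(h[concept_idx]).squeeze(-1)
    domain_loss = F.mse_loss(pred, complexity)
    return graph_loss + alpha * domain_loss
```

After pre-training with such a joint objective, the entity embeddings would be reused as-is (or fine-tuned) for downstream tasks like prerequisite prediction and lecture recommendation.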
