Automatic coding of students' writing via Contrastive Representation Learning in the Wasserstein space

11/26/2020
by   Ruijie Jiang, et al.

Qualitative analysis of verbal data is of central importance in the learning sciences. It is labor-intensive and time-consuming, however, which limits the amount of data researchers can include in studies. This work is a step towards building a statistical machine learning (ML) method that provides automated support for qualitative analyses of students' writing, here specifically scoring laboratory reports in introductory biology for sophistication of argumentation and reasoning. We start with a set of lab reports from an undergraduate biology course, scored by a four-level scheme that considers the complexity of argument structure, the scope of evidence, and the care and nuance of conclusions. Using this set of labeled data, we show that a popular natural language processing (NLP) pipeline, namely vector representations of words (a.k.a. word embeddings) followed by a Long Short-Term Memory (LSTM) model that captures language generation as a state-space model, quantitatively captures the scoring with a high Quadratic Weighted Kappa (QWK) prediction score when trained via a novel contrastive learning setup. We show that the ML algorithm approached the inter-rater reliability of human analysis. Ultimately, we conclude that ML for NLP holds promise for assisting learning sciences researchers in conducting qualitative studies at much larger scales than is currently possible.
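The abstract evaluates predicted scores against human ratings with Quadratic Weighted Kappa. As a concrete reference, here is a minimal pure-Python sketch of the QWK metric for a four-level scoring scheme like the one described; the function name and the assumption that scores lie in {0, ..., num_levels - 1} are ours, since the authors' exact evaluation code is not shown.

```python
from collections import Counter

def quadratic_weighted_kappa(rater_a, rater_b, num_levels=4):
    """Quadratic Weighted Kappa between two equal-length lists of
    integer scores, each assumed to lie in {0, ..., num_levels - 1}.

    QWK = 1 - (weighted observed disagreement) /
              (weighted disagreement expected by chance),
    with quadratic weights w_ij = (i - j)^2 / (num_levels - 1)^2.
    """
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)

    # Observed confusion matrix and per-rater score histograms.
    observed = [[0] * num_levels for _ in range(num_levels)]
    for a, b in zip(rater_a, rater_b):
        observed[a][b] += 1
    hist_a = Counter(rater_a)
    hist_b = Counter(rater_b)

    denom = (num_levels - 1) ** 2
    weighted_observed = 0.0
    weighted_expected = 0.0
    for i in range(num_levels):
        for j in range(num_levels):
            w = (i - j) ** 2 / denom
            weighted_observed += w * observed[i][j]
            # Chance agreement from the product of marginals.
            weighted_expected += w * hist_a[i] * hist_b[j] / n
    return 1.0 - weighted_observed / weighted_expected
```

Perfect agreement yields QWK = 1, while chance-level agreement yields QWK near 0; the quadratic weights penalize a two-level disagreement four times as heavily as a one-level one, which matters for ordinal rubrics like the four-level scheme used here.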


