Language models and Automated Essay Scoring

09/18/2019
by Pedro Uria Rodriguez, et al.

In this paper, we present a new comparative study on automated essay scoring (AES). We use current state-of-the-art natural language processing (NLP) neural network architectures to achieve above human-level accuracy on the publicly available Kaggle AES dataset. We compare two powerful language models, BERT and XLNet, and describe all of the layers and network architectures in these models. We elucidate the network architectures of BERT and XLNet using clear notation and diagrams, and explain the advantages of transformer architectures over traditional recurrent neural network (RNN) architectures. Linear algebra notation is used to clarify the functions of transformers and attention mechanisms. We compare the results with more traditional methods, such as bag-of-words (BOW) and long short-term memory (LSTM) networks.
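For readers unfamiliar with the attention mechanism referenced in the abstract, the following is a minimal sketch of scaled dot-product attention in NumPy. It is an illustration only, not code from the paper; the function and variable names are assumptions made for this sketch.

import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Q, K, V: (seq_len, d_k) matrices of queries, keys, and values.
    d_k = Q.shape[-1]
    # Similarity of every query to every key, scaled by sqrt(d_k)
    # to keep the softmax inputs in a numerically stable range.
    scores = Q @ K.T / np.sqrt(d_k)
    # Row-wise softmax turns scores into attention weights.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output position is a weighted average of the value vectors.
    return weights @ V

# Toy usage: 4 tokens, one 8-dimensional attention head.
rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 8))
K = rng.normal(size=(4, 8))
V = rng.normal(size=(4, 8))
out = scaled_dot_product_attention(Q, K, V)  # shape (4, 8)

Because every token attends to every other token in a single matrix product, transformers avoid the sequential dependency that limits RNN and LSTM training, which is one of the advantages the paper discusses.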
