On the Use of BERT for Automated Essay Scoring: Joint Learning of Multi-Scale Essay Representation

05/08/2022
by   Yongjie Wang, et al.

In recent years, pre-trained models have become dominant in most natural language processing (NLP) tasks. However, in the area of Automated Essay Scoring (AES), pre-trained models such as BERT have not been used effectively enough to outperform other deep learning models such as LSTM. In this paper, we introduce a novel multi-scale essay representation for BERT that can be jointly learned. We also employ multiple losses and transfer learning from out-of-domain essays to further improve performance. Experimental results show that our approach benefits substantially from joint learning of the multi-scale essay representation and achieves a near state-of-the-art result among deep learning models on the ASAP task. Our multi-scale essay representation also generalizes well to the CommonLit Readability Prize dataset, which suggests that the text representation proposed in this paper may be a new and effective choice for long-text tasks.
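As a rough illustration of the idea, the sketch below shows one way a multi-scale essay representation could be built on top of BERT and trained with a combined loss. This is not the authors' exact architecture: the segment length, the LSTM pooling over segments, the regression head, and the loss weighting (`alpha`) are all illustrative assumptions.

```python
import torch
import torch.nn as nn
from transformers import AutoModel

class MultiScaleEssayScorer(nn.Module):
    """Sketch: combine a document-scale [CLS] vector with pooled
    segment-scale [CLS] vectors, then regress an essay score."""

    def __init__(self, model_name="bert-base-uncased", hidden=768):
        super().__init__()
        self.bert = AutoModel.from_pretrained(model_name)
        # BiLSTM pools the sequence of segment embeddings (an assumption,
        # not necessarily the paper's pooling choice).
        self.lstm = nn.LSTM(hidden, hidden // 2,
                            batch_first=True, bidirectional=True)
        self.head = nn.Linear(hidden * 2, 1)

    def forward(self, doc_ids, doc_mask, seg_ids, seg_mask):
        # doc_ids:  (B, L)          truncated whole-essay input
        # seg_ids:  (B, S, seg_len) essay split into S fixed-length segments
        doc_vec = self.bert(input_ids=doc_ids,
                            attention_mask=doc_mask).last_hidden_state[:, 0]

        B, S, L = seg_ids.shape
        seg_cls = self.bert(input_ids=seg_ids.view(B * S, L),
                            attention_mask=seg_mask.view(B * S, L)
                            ).last_hidden_state[:, 0]
        seg_seq, _ = self.lstm(seg_cls.view(B, S, -1))
        seg_vec = seg_seq.mean(dim=1)          # pool over segments

        # Joint document-scale + segment-scale representation.
        return self.head(torch.cat([doc_vec, seg_vec], dim=-1)).squeeze(-1)

def combined_loss(pred, target, alpha=0.7):
    """Example of combining multiple losses: MSE plus a batch-level
    similarity term. The weight alpha is an illustrative assumption."""
    mse = nn.functional.mse_loss(pred, target)
    sim = 1 - nn.functional.cosine_similarity(pred.unsqueeze(0),
                                              target.unsqueeze(0)).mean()
    return alpha * mse + (1 - alpha) * sim
```

In this sketch, fine-tuning the shared BERT encoder through both the document-scale and segment-scale branches is what makes the multi-scale representation jointly learned rather than a fixed-feature combination.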
