A Surprisingly Robust Trick for the Winograd Schema Challenge

05/15/2019
by Vid Kocijan, et al.

The Winograd Schema Challenge (WSC) dataset WSC273 and its inference counterpart WNLI are popular benchmarks for natural language understanding and commonsense reasoning. In this paper, we show that the performance of three language models on WSC273 strongly improves when fine-tuned on a similar pronoun disambiguation problem dataset (denoted WSCR). We additionally generate a large unsupervised WSC-like dataset. By fine-tuning the BERT language model both on the introduced and on the WSCR dataset, we achieve an overall accuracy of 72.2% on WSC273, improving on previous state-of-the-art solutions by 8.5%. The fine-tuned models are also consistently more robust on the "complex" subsets of WSC273, introduced by Trichelair et al. (2018).
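The pronoun-resolution setup underlying this line of work can be sketched as follows. This is a minimal illustration, not the paper's implementation: each candidate antecedent is substituted for the pronoun, the substituted tokens are scored under a masked language model, and the highest-scoring candidate wins. The `token_logprob` argument here is a hypothetical stand-in for a real masked-LM scorer such as BERT.

```python
def score_candidate(template, candidate, token_logprob):
    """Substitute `candidate` for the pronoun slot and sum the
    log-probabilities of the candidate's tokens under the scorer.

    `token_logprob(sentence, token)` is assumed to return the
    masked-LM log-probability of `token` in `sentence`.
    """
    filled = template.replace("[PRONOUN]", candidate)
    return sum(token_logprob(filled, tok) for tok in candidate.split())


def resolve(template, candidates, token_logprob):
    """Return the candidate antecedent with the highest score."""
    return max(candidates,
               key=lambda c: score_candidate(template, c, token_logprob))


# Toy usage with a dummy scorer that prefers "trophy":
def dummy_logprob(sentence, token):
    return 0.0 if token == "trophy" else -1.0


template = "The [PRONOUN] did not fit in the suitcase."
winner = resolve(template, ["trophy", "suitcase"], dummy_logprob)
```

In practice the scorer would mask the substituted tokens one at a time and read off BERT's predicted probabilities; the sketch above only fixes the selection rule.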


