Attention-based Contrastive Learning for Winograd Schemas

09/10/2021
by Tassilo Klein, et al.

Self-supervised learning has recently attracted considerable attention in the NLP community for its ability to learn discriminative features using a contrastive objective. This paper investigates whether contrastive learning can be extended to Transformer attention to tackle the Winograd Schema Challenge. To this end, we propose a novel self-supervised framework that applies a contrastive loss directly at the level of self-attention. Experimental analysis of our attention-based models on multiple datasets demonstrates superior commonsense reasoning capabilities. The proposed approach outperforms all comparable unsupervised approaches while occasionally surpassing supervised ones.
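
The abstract does not spell out the exact formulation, but the core idea of a contrastive loss applied to self-attention can be sketched roughly as follows. Everything in this snippet is a hypothetical illustration, not the authors' implementation: the toy single-head attention module, the token indices for the pronoun and candidate antecedents, and the margin value are all assumptions made for the example.

```python
# Minimal sketch (not the authors' code): a contrastive objective applied to
# self-attention weights for a Winograd-style pronoun resolution setup.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToySelfAttention(nn.Module):
    """Single-head self-attention that also returns the attention matrix."""
    def __init__(self, dim):
        super().__init__()
        self.q = nn.Linear(dim, dim)
        self.k = nn.Linear(dim, dim)
        self.v = nn.Linear(dim, dim)

    def forward(self, x):                              # x: (batch, seq_len, dim)
        q, k, v = self.q(x), self.k(x), self.v(x)
        scores = q @ k.transpose(-2, -1) / (x.size(-1) ** 0.5)
        attn = scores.softmax(dim=-1)                   # (batch, seq_len, seq_len)
        return attn @ v, attn

def attention_contrastive_loss(attn, pronoun_idx, correct_idx, distractor_idx, margin=0.1):
    """Max-margin contrastive term on attention mass: the pronoun should attend
    more strongly to the correct candidate than to the distractor."""
    a_pos = attn[:, pronoun_idx, correct_idx]           # attention pronoun -> correct antecedent
    a_neg = attn[:, pronoun_idx, distractor_idx]        # attention pronoun -> wrong antecedent
    return F.relu(margin - (a_pos - a_neg)).mean()

# Toy usage: batch of 2 sequences, 8 tokens, 16-dim embeddings (all made up).
layer = ToySelfAttention(dim=16)
x = torch.randn(2, 8, 16)
_, attn = layer(x)
loss = attention_contrastive_loss(attn, pronoun_idx=5, correct_idx=1, distractor_idx=3)
loss.backward()   # gradients flow through the attention weights themselves
print(loss.item())
```

The point of the sketch is that the loss is computed on the attention distribution itself rather than on pooled sentence embeddings, so the gradient signal directly reshapes which tokens the pronoun attends to; the actual loss in the paper may differ.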

