
- English Intermediate-Task Training Improves Zero-Shot Cross-Lingual Transfer Too
  Intermediate-task training has been shown to substantially improve pretr...
- Intermediate-Task Transfer Learning with Pretrained Models for Natural Language Understanding: When and Why Does It Work?
  While pretrained models such as BERT have shown large gains across natur...
- jiant: A Software Toolkit for Research on General-Purpose Text Understanding Models
  We introduce jiant, an open source toolkit for conducting multitask and ...
- Do Attention Heads in BERT Track Syntactic Dependencies?
  We investigate the extent to which individual attention heads in pretrai...
- Generalized Inner Loop Meta-Learning
  Many (but not all) approaches self-qualifying as "meta-learning" in deep...
- Inducing Constituency Trees through Neural Machine Translation
  Latent tree learning (LTL) methods learn to parse sentences using only in...
- Investigating BERT's Knowledge of Language: Five Analysis Methods with NPIs
  Though state-of-the-art sentence representation models can perform tasks...
- The Unbearable Weight of Generating Artificial Errors for Grammatical Error Correction
  In recent years, sequence-to-sequence models have been very effective fo...
- Grammar Induction with Neural Language Models: An Unusual Replication
  A substantial thread of recent work on latent tree learning has attempte...
- Training a Ranking Function for Open-Domain Question Answering
  In recent years, there have been amazing advances in deep learning metho...