On the Effective Use of Pretraining for Natural Language Inference

by Ignacio Cases et al.
Stanford University

Neural networks have excelled at many NLP tasks, but there remain open questions about the performance of pretrained distributed word representations and their interaction with weight initialization and other hyperparameters. We address these questions empirically using attention-based sequence-to-sequence models for natural language inference (NLI). Specifically, we compare three types of embeddings: random, pretrained (GloVe, word2vec), and retrofitted (pretrained plus WordNet information). We show that pretrained embeddings outperform both random and retrofitted ones on a large NLI corpus. Further experiments on more controlled data sets shed light on the contexts in which retrofitted embeddings can be useful. We also explore two principled approaches to initializing the rest of the model parameters, Gaussian and orthogonal, showing that the latter yields gains of up to 2.9%.
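The abstract contrasts Gaussian and orthogonal initialization of the non-embedding parameters. As a rough illustration of the difference (not the authors' code; function names, the gain parameter, and the 256-dimensional recurrent matrix are hypothetical), here is a minimal NumPy sketch of the two schemes:

```python
import numpy as np

def gaussian_init(shape, std=0.1, rng=None):
    """Baseline: sample weights from a zero-mean Gaussian."""
    rng = rng or np.random.default_rng(0)
    return rng.normal(0.0, std, size=shape)

def orthogonal_init(shape, gain=1.0, rng=None):
    """Orthogonal init: QR-decompose a Gaussian matrix and keep Q,
    so the columns are orthonormal and norms are preserved under the map."""
    rng = rng or np.random.default_rng(0)
    a = rng.normal(0.0, 1.0, size=shape)
    tall = shape[0] >= shape[1]
    q, r = np.linalg.qr(a if tall else a.T)
    q *= np.sign(np.diag(r))  # fix column signs so the result is unique
    return gain * (q if tall else q.T)

# Example: a square recurrent weight matrix for a seq2seq encoder
W = orthogonal_init((256, 256))
print(np.allclose(W.T @ W, np.eye(256), atol=1e-6))  # True: orthonormal columns
```

In practice one would typically rely on a framework's built-in initializers (e.g., PyTorch's `torch.nn.init.orthogonal_` and `torch.nn.init.normal_`) rather than hand-rolling the QR step.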



