Evolution of transfer learning in natural language processing

10/16/2019
by Aditya Malte, et al.

In this paper, we present a study of the recent advancements that have helped bring Transfer Learning to NLP through the use of semi-supervised training. We discuss cutting-edge methods and architectures such as BERT, GPT, ELMo, and ULMFiT, among others. Classically, tasks in natural language processing were performed through rule-based and statistical methodologies. However, owing to the vast nature of natural languages, these methods did not generalise well and failed to learn the nuances of language. Machine learning algorithms such as Naive Bayes and decision trees, coupled with traditional representations such as Bag-of-Words and N-grams, were then used to overcome this problem. Eventually, with the advent of advanced recurrent neural network architectures such as the LSTM, we were able to achieve state-of-the-art performance in several natural language processing tasks such as text classification and machine translation. We discuss how Transfer Learning has brought about the well-known ImageNet moment for NLP. Several advanced architectures such as the Transformer and its variants have allowed practitioners to leverage knowledge gained from unrelated tasks to drastically accelerate convergence and improve performance on the target task. This survey is an effort to provide a succinct yet complete understanding of the recent advances in natural language processing using deep learning, with a special focus on transfer learning and its potential advantages.

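The transfer-learning workflow the abstract alludes to, pretraining on an unrelated task and fine-tuning on the target task, can be sketched as follows. This is a minimal illustration, not code from the paper: it assumes the HuggingFace transformers library, PyTorch, the bert-base-uncased checkpoint, and a hypothetical two-example sentiment dataset.

```python
# Minimal sketch of Transformer-based transfer learning: load a BERT encoder
# pretrained on unrelated masked-language-modelling data, then fine-tune it
# on a small, hypothetical binary sentiment task.
import torch
from torch.optim import AdamW
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Pretrained weights carry the transferred knowledge; a fresh classification
# head is added on top for the target task.
model_name = "bert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

# Hypothetical target-task examples (binary sentiment labels).
texts = ["A thoroughly enjoyable read.", "The plot was dull and predictable."]
labels = torch.tensor([1, 0])
batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")

# Fine-tune all weights on the target task; starting from pretrained weights
# typically converges far faster than training from scratch.
optimizer = AdamW(model.parameters(), lr=2e-5)
model.train()
for _ in range(3):  # a few passes are enough for illustration
    outputs = model(**batch, labels=labels)
    outputs.loss.backward()
    optimizer.step()
    optimizer.zero_grad()
```

The same pattern applies to the other architectures named above (GPT, ELMo, ULMFiT): a model pretrained on a generic language-modelling objective is adapted to the target task with comparatively little labelled data.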
