Compression of Deep Learning Models for Text: A Survey

08/12/2020
by   Manish Gupta, et al.

In recent years, the fields of natural language processing (NLP) and information retrieval (IR) have made tremendous progress thanks to deep learning models like Recurrent Neural Networks (RNNs), Gated Recurrent Units (GRUs), and Long Short-Term Memory (LSTM) networks, and Transformer-based models like Bidirectional Encoder Representations from Transformers (BERT). But these models are humongous in size. On the other hand, real-world applications demand small model size, low response times, and low power consumption. In this survey, we discuss six different types of methods (Pruning, Quantization, Knowledge Distillation, Parameter Sharing, Tensor Decomposition, and Linear Transformer based methods) for compression of such models to enable their deployment in real industry NLP projects. Given the critical need for building applications with efficient and small models, and the large amount of recently published work in this area, we believe that this survey organizes the plethora of work done by the 'deep learning for NLP' community in the past few years and presents it as a coherent story.
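To give a flavor of the first technique the survey covers, here is a minimal sketch of unstructured magnitude pruning: weights whose absolute value falls below a threshold are zeroed out, shrinking the effective model. This NumPy example is not taken from the survey; the function name `magnitude_prune` and the 50% sparsity level are illustrative assumptions.

```python
import numpy as np

def magnitude_prune(weights, sparsity=0.5):
    # Zero out the smallest-magnitude fraction of weights (illustrative sketch).
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)
    if k == 0:
        return weights.copy()
    # k-th smallest absolute value becomes the pruning threshold.
    threshold = np.partition(flat, k - 1)[k - 1]
    mask = np.abs(weights) > threshold
    return weights * mask

rng = np.random.default_rng(0)
W = rng.normal(size=(4, 4))
W_pruned = magnitude_prune(W, sparsity=0.5)
print(np.count_nonzero(W_pruned))  # 8 of 16 entries survive
```

In practice, pruning is usually interleaved with fine-tuning so the remaining weights can recover the lost accuracy; the other five method families in the survey attack model size from different angles (fewer bits per weight, smaller student models, shared or factorized parameter matrices, and cheaper attention).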


