Building an Efficiency Pipeline: Commutativity and Cumulativeness of Efficiency Operators for Transformers

07/31/2022 · by Ji Xin, et al.

There exists a wide variety of efficiency methods for natural language processing (NLP) tasks, such as pruning, distillation, dynamic inference, quantization, etc. We can consider an efficiency method as an operator applied on a model. Naturally, we may construct a pipeline of multiple efficiency methods, i.e., to apply multiple operators on the model sequentially. In this paper, we study the plausibility of this idea, and more importantly, the commutativity and cumulativeness of efficiency operators. We make two interesting observations: (1) Efficiency operators are commutative – the order of efficiency methods within the pipeline has little impact on the final results; (2) Efficiency operators are also cumulative – the final results of combining several efficiency methods can be estimated by combining the results of individual methods. These observations deepen our understanding of efficiency operators and provide useful guidelines for their real-world applications.
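The abstract's two observations can be illustrated with a toy sketch. Below, each efficiency method is modeled as an operator that scales a model's inference cost by a speedup factor; the operator names and factors are illustrative assumptions, not measurements from the paper.

```python
# Hypothetical sketch: efficiency methods as operators on a model's cost.
# The speedup factors (2x for quantization, 3x for distillation) are
# assumed for illustration only.

def quantize(cost):
    """Apply quantization: assume ~2x speedup."""
    return cost / 2.0

def distill(cost):
    """Distill into a smaller student: assume ~3x speedup."""
    return cost / 3.0

def compose(*operators):
    """Build a pipeline that applies operators sequentially."""
    def pipeline(cost):
        for op in operators:
            cost = op(cost)
        return cost
    return pipeline

base_cost = 120.0  # arbitrary latency units

# Commutativity: the order of operators does not change the result here.
a = compose(quantize, distill)(base_cost)
b = compose(distill, quantize)(base_cost)

# Cumulativeness: the combined speedup equals the product of the
# individual speedups (2 * 3 = 6 in this toy setting).
individual = (base_cost / quantize(base_cost)) * (base_cost / distill(base_cost))
combined = base_cost / a
```

In real pipelines the operators interact (e.g., pruning changes what quantization acts on), so the paper's claim is that commutativity and cumulativeness hold approximately, not exactly as in this idealized multiplicative model.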


Related research

- Rethinking the Value of Network Pruning (10/11/2018)
- Combining Compressions for Multiplicative Size Scaling on Natural Language Tasks (08/20/2022)
- UW-BHI at MEDIQA 2019: An Analysis of Representation Methods for Medical Natural Language Inference (07/09/2019)
- How Good Is NLP? A Sober Look at NLP Tasks through the Lens of Social Impact (06/04/2021)
- Efficiency Pentathlon: A Standardized Arena for Efficiency Evaluation (07/19/2023)
- Dynamic Transformers Provide a False Sense of Efficiency (05/20/2023)
