HULK: An Energy Efficiency Benchmark Platform for Responsible Natural Language Processing

02/14/2020
by   Xiyou Zhou, et al.

Computation-intensive pretrained models have been taking the lead on many natural language processing benchmarks such as GLUE. However, energy efficiency during model training and inference has become a critical bottleneck. We introduce HULK, a multi-task energy efficiency benchmarking platform for responsible natural language processing. With HULK, we compare the energy efficiency of pretrained models from the perspectives of time and cost. Baseline benchmarking results are provided for further analysis. The fine-tuning efficiency of different pretrained models can differ substantially across tasks, and a smaller number of parameters does not necessarily imply better efficiency. We analyze this phenomenon and demonstrate a method for comparing the multi-task efficiency of pretrained models. Our platform is available at https://sites.engineering.ucsb.edu/xiyou/hulk/.
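The time-and-cost comparison described above can be sketched as a small benchmarking harness. This is a minimal illustration, not the HULK implementation: `hourly_rate_usd` is a hypothetical stand-in for a cloud instance price, and `toy_workload` stands in for an actual fine-tuning or inference run on fixed hardware.

```python
import time

def benchmark_time_and_cost(workload, hourly_rate_usd):
    """Time a workload and convert wall-clock time to an estimated cost.

    Sketch of a HULK-style time/cost measurement: `hourly_rate_usd` is an
    assumed cloud-instance price, so cost = elapsed_hours * hourly_rate.
    """
    start = time.perf_counter()
    result = workload()
    elapsed_s = time.perf_counter() - start
    cost_usd = (elapsed_s / 3600.0) * hourly_rate_usd
    return result, elapsed_s, cost_usd

# Toy workload standing in for fine-tuning or inference with a real model.
def toy_workload():
    return sum(i * i for i in range(100_000))

result, elapsed_s, cost_usd = benchmark_time_and_cost(
    toy_workload, hourly_rate_usd=3.00  # assumed GPU-instance price
)
```

Comparing models under this scheme means running each model's workload on the same hardware and reporting elapsed time and derived cost per task, rather than relying on parameter counts alone.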


research
05/12/2023

A Comprehensive Analysis of Adapter Efficiency

Adapters have been positioned as a parameter-efficient fine-tuning (PEFT...
research
10/07/2022

Polyhistor: Parameter-Efficient Multi-Task Adaptation for Dense Vision Tasks

Adapting large-scale pretrained models to various downstream tasks via f...
research
02/15/2020

Fine-Tuning Pretrained Language Models: Weight Initializations, Data Orders, and Early Stopping

Fine-tuning pretrained contextual word embedding models to supervised do...
research
05/06/2020

An Empirical Study of Multi-Task Learning on BERT for Biomedical Text Mining

Multi-task learning (MTL) has achieved remarkable success in natural lan...
research
05/08/2021

Enhancing Transformers with Gradient Boosted Decision Trees for NLI Fine-Tuning

Transfer learning has become the dominant paradigm for many natural lang...
research
04/29/2021

Text-to-Text Multi-view Learning for Passage Re-ranking

Recently, much progress in natural language processing has been driven b...
research
09/18/2019

Improving Natural Language Inference with a Pretrained Parser

We introduce a novel approach to incorporate syntax into natural languag...
