The Cost of Training NLP Models: A Concise Overview

04/19/2020
by Or Sharir et al.

We review the cost of training large-scale language models, and the drivers of these costs. The intended audience includes engineers and scientists budgeting their model-training experiments, as well as non-practitioners trying to make sense of the economics of modern-day Natural Language Processing (NLP).
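For readers budgeting a training run, a common back-of-the-envelope approach is to estimate total compute with the ~6 FLOPs-per-parameter-per-token heuristic and divide by the hardware's realized throughput. A minimal sketch is below; the heuristic, the utilization factor, and all numeric figures are illustrative assumptions for this example, not figures from the paper.

```python
def training_cost_usd(n_params, n_tokens, flops_per_sec,
                      utilization, usd_per_gpu_hour, n_gpus=1):
    """Rough dollar cost of one training run (all inputs are estimates)."""
    total_flops = 6 * n_params * n_tokens               # common heuristic
    effective_flops = flops_per_sec * utilization * n_gpus
    hours = total_flops / effective_flops / 3600        # wall-clock hours
    return hours * n_gpus * usd_per_gpu_hour            # billed GPU-hours

# Hypothetical run: 1.5B parameters on 40B tokens across 64 accelerators.
cost = training_cost_usd(
    n_params=1.5e9,
    n_tokens=40e9,
    flops_per_sec=312e12,   # assumed peak throughput of one accelerator
    utilization=0.3,        # assumed realized fraction of peak
    usd_per_gpu_hour=3.0,   # assumed cloud price
    n_gpus=64,
)
print(f"estimated cost: ~${cost:,.0f}")
```

In practice, real costs diverge from such estimates through hyperparameter search, restarts, and data-pipeline overhead, which is part of why the drivers surveyed in the article matter.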

