Curb Your Carbon Emissions: Benchmarking Carbon Emissions in Machine Translation

09/26/2021
by   Mirza Yusuf, et al.

In recent times, there has been definitive progress in the field of NLP, with applications growing as language models improve in performance. However, these models require large amounts of computational power and data to train, which leads to large carbon footprints. It is therefore imperative that we study the carbon efficiency of these models and look for alternatives that reduce the overall environmental impact of training, in particular for large language models. In our work, we benchmark machine translation models across multiple language pairs, comparing the computational power required to train a model for each pair, and examine the components of these models to identify the parts of the pipeline that can be optimized to reduce carbon emissions.
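As a rough illustration of the kind of accounting involved, a common first-order estimate multiplies hardware power draw by training time, a data-center efficiency factor (PUE), and the carbon intensity of the local grid. The function below is a minimal sketch of that calculation; the specific numbers (a 250 W GPU, PUE 1.58, grid intensity 0.475 kg CO2e/kWh) are illustrative assumptions, not values from the paper.

```python
def estimate_co2_kg(power_watts: float, hours: float,
                    pue: float, grid_kg_per_kwh: float) -> float:
    """Rough CO2e estimate for a training run.

    power_watts      -- average draw of the hardware (assumed constant)
    hours            -- wall-clock training time
    pue              -- data-center Power Usage Effectiveness (>= 1.0)
    grid_kg_per_kwh  -- carbon intensity of the electricity grid
    """
    energy_kwh = (power_watts / 1000.0) * hours * pue
    return energy_kwh * grid_kg_per_kwh


# Hypothetical example: one 250 W GPU for 24 hours,
# PUE 1.58, grid intensity 0.475 kg CO2e per kWh.
emissions = estimate_co2_kg(250, 24, 1.58, 0.475)
print(f"{emissions:.3f} kg CO2e")  # prints "4.503 kg CO2e"
```

In practice, tools such as CodeCarbon measure power draw and grid intensity directly rather than relying on fixed assumptions, but the arithmetic above captures why both hardware choice and training duration matter for the footprint.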


