Training a Single AI Model Can Emit As Much Carbon As Five Cars in Their Lifetimes

Scope

In (Hao, 2019), the author explores the environmental impact of training large artificial intelligence (AI) models, particularly in the field of natural language processing (NLP).

Summary

AI’s carbon footprint is a growing concern: training a single AI model can generate as much CO2 as the lifetime emissions of five cars. Research from the University of Massachusetts, Amherst assessed the training of several large NLP models (Transformer, ELMo, BERT, and GPT-2) and found that the most expensive case, a Transformer model built with neural architecture search, produced over 626,000 pounds of CO2 equivalent, nearly five times the lifetime emissions of an average American car, including its manufacture. NLP training is highly energy-intensive, and computational demands grow rapidly with model size. Neural architecture search compounds the problem, adding enormous computational overhead for only marginal accuracy gains. These estimates are baseline figures; real-world emissions are likely higher, since research and development typically involve many repeated training runs. The resulting costs also create a disparity between academia and industry, as academic researchers often lack the resources to train such large models. The article urges the AI community to develop more energy-efficient algorithms and hardware, arguing that the current trend toward ever-larger models is environmentally and financially unsustainable.
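As a quick check of the headline comparison (a sketch assuming the per-car figure of roughly 126,000 lbs of CO2 over an average American car's lifetime, fuel and manufacture included, which is the baseline reported alongside the study):

\[
\frac{626{,}000\ \text{lbs CO}_2}{126{,}000\ \text{lbs per car}} \approx 4.97 \approx 5\ \text{cars}
\]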

Relevance for EXIGENCE

This article quantifies the carbon footprint of training large AI models, highlighting that the emissions from a single model can equal those of five cars over their lifetimes. It thereby underscores the urgency of developing energy-efficient AI solutions, an objective that directly aligns with EXIGENCE’s goal of reducing energy consumption and carbon emissions in technology systems.

  1. Hao, K., “Training a single AI model can emit as much carbon as five cars in their lifetimes”, MIT Technology Review, 2019.

Index