The Carbon Emissions of Training AI Models
What We’re Showing
The metric tons of CO2 equivalent (tCO2eq) emitted by training large language models. Meta's Llama 3 (70B) model and OpenAI's GPT-3 (175B) model are compared alongside the lifetime emissions of an average car and one passenger flight from New York to San Francisco.
Emissions figures come from Meta, the 2024 Stanford AI Index Report, and a 2022 study by Luccioni et al.
AI Training's Environmental Costs
Training AI models contributes a considerable amount of greenhouse gas emissions.
GPT-3's training produced more than 500 times the emissions of a single passenger flight from New York to San Francisco, while training Llama 3 emitted roughly 30 times the lifetime emissions of an average car, including fuel.
Emissions from training newer AI models are also rising: Llama 3, released in 2024, produced almost four times the training emissions of GPT-3, released in 2020.
Unlike many other companies, Meta does note that it offsets the emissions produced by training its models.
Dataset
| Emissions source | tCO2eq (metric tons of CO2 equivalent) |
| --- | --- |
| One passenger flight from New York to San Francisco | 0.99 |
| Average car lifetime emissions (including fuel) | 63 |
| GPT-3 (175B) | 502 |
| Llama 3 (70B) | 1900 |
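
As a quick check on the comparisons quoted above, here is a minimal Python sketch that hard-codes the tCO2eq values from the dataset and computes the three ratios cited in the text; the dictionary labels are shorthand introduced here, not part of the original dataset.

```python
# tCO2eq values taken from the dataset above
# (all figures are metric tons of CO2 equivalent).
emissions = {
    "NY-SF passenger flight": 0.99,
    "Average car lifetime (incl. fuel)": 63,
    "GPT-3 (175B)": 502,
    "Llama 3 (70B)": 1900,
}

# GPT-3 vs. one passenger flight: ~507x ("more than 500 times")
print(emissions["GPT-3 (175B)"] / emissions["NY-SF passenger flight"])

# Llama 3 vs. average car lifetime emissions: ~30x
print(emissions["Llama 3 (70B)"] / emissions["Average car lifetime (incl. fuel)"])

# Llama 3 vs. GPT-3: ~3.8x ("almost four times")
print(emissions["Llama 3 (70B)"] / emissions["GPT-3 (175B)"])
```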