AI’s carbon footprint is growing. Is it worth it?
Apr 20, 2023

Between mining for rare minerals, cooling data centers, and running computers for millions of hours, the climate impact of artificial intelligence is big and getting bigger.

Before you ask ChatGPT to write you another terrible haiku, you might want to consider the chatbot’s carbon footprint.

First, there are the resources needed to make the power-hungry computer chips these artificial intelligence tools run on. Then, there are the millions of hours of intensive computing required to train large language models. Once that’s done, there’s the high energy demand of the cloud storage that helps keep these systems connected to millions of users.

It all adds up, and Sasha Luccioni, the climate lead for the AI company Hugging Face, has been doing the math.

She spoke to Marketplace’s Meghan McCarty Carino about the process of training an earlier version of ChatGPT, which emitted roughly the same amount of carbon dioxide as a gas-powered car driving over one million miles.
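For a rough sense of how a comparison like that comes together, here is a back-of-the-envelope sketch in Python. The input figures (about 500 metric tons of CO2 for a GPT-3-class training run and roughly 400 grams of CO2 per mile for an average gas-powered car) are outside estimates used as assumptions for illustration, not numbers from the interview.

```python
# Back-of-the-envelope illustration; the inputs are assumed outside estimates,
# not figures from Luccioni or Marketplace.
TRAINING_CO2_TONNES = 500      # assumed: ~500 metric tons of CO2 for a GPT-3-class training run
CAR_G_CO2_PER_MILE = 400       # assumed: ~400 g of CO2 per mile for an average gas-powered car

grams_emitted = TRAINING_CO2_TONNES * 1_000_000      # metric tons -> grams
equivalent_miles = grams_emitted / CAR_G_CO2_PER_MILE

print(f"Equivalent driving distance: {equivalent_miles:,.0f} miles")
# Prints roughly 1,250,000 miles, i.e., "over one million miles"
```

Under those assumptions, the arithmetic lands at roughly 1.25 million miles, which is where the “over one million miles” comparison comes from.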

Below is an edited transcript of their conversation.

Sasha Luccioni: New generations of AI models take literally millions of hours of compute time to train, and that uses a lot of energy. To produce that energy, you have to use natural gas, oil, coal or, hopefully, hydroelectricity, and that’s one of the big sources of AI emissions. Also, the hardware that’s being used has an environmental cost. AI training requires graphics processing units, or GPUs, which are designed to process large amounts of visual information quickly. To create these GPUs, you need a lot of rare metals and a lot of water as well. The same goes for cooling: the GPU servers need to be constantly cooled with water to prevent them from overheating.

There are all these environmental aspects, but since AI is so ephemeral and such a vague concept, people have trouble realizing that it comes with physical consequences and impacts. And I understand that confusion, because AI doesn’t have a body. But it is running somewhere on a cloud server that’s being powered by coal electricity; we just can’t see it.

Meghan McCarty Carino: There are so many technologies that we use in our everyday life that have a lot of environmental externalities that we might not necessarily consider as much as we should. Are these large language models unique or extreme in that context?

Luccioni: What worries me is that we’re going through this phase where people want to plug this technology into everything. Take web search, for example. Right now, I think it does a pretty good job. But now, instead of continuing to use the existing web search models, which are mostly AI-based already but are just smaller, more efficient models, the in-vogue thing to do is to replace them with big language models for what is ostensibly the same result. I can’t imagine that the web search results are suddenly getting 1,000 times better with this change, but you are using a 1,000-times-bigger model. We’re not doing this cost-benefit analysis. Instead, it’s more driven by what we call AI hype. Companies want to say, “this is powered by GPT” without saying “this is going to quadruple our energy consumption.” People don’t make that calculus in their mind.

McCarty Carino: That raises the question of whether there need to be limitations on some of the more frivolous uses of these tools.

Luccioni: Yeah, that’s kind of why I do what I do. I’m not against innovation. I don’t think we should all just stop doing AI research. But for me, it’s kind of the basis of that research to say “this thing costs this much,” not just in terms of money, but also in terms of planetary and human costs. If that calculation makes sense, then yes, we’re going to use the AI. But currently, we’re not making that logical kind of decision. Right now, it’s more like “why shouldn’t we use ChatGPT to do web search?” But I think environmental factors should be considered, because the new technology is a lot less efficient than the current model.

McCarty Carino: Could these tools and AI in general be part of climate solutions?

Luccioni: Definitely. I’m part of an organization called Climate Change AI, and we made a menu for action covering all the different ways that AI can be used. For example, detecting deforestation from satellite imagery. Because AI can parse so much data at once, it can compare the state of the Amazon 24/7 and detect changes when deforestation is going on. Then it can alert rangers, who can go in and investigate. So, there’s lots of climate-positive stuff.

What’s kind of a shame is that our current discourse is more focused on these large language models and chatbots. Yeah, chatbots are cool, but I don’t see them as groundbreakingly useful in society. I think people are just kind of impressed by their discourse and by their ability to mimic human language. We’ve become caught up in these kinds of AI innovations, but there’s so much more out there that’s not on our radar. Maybe it’s just less appealing because it can’t write you a haiku.

Sasha Luccioni wrote an op-ed for Ars Technica earlier this month in which she explains the growing human and environmental cost of AI development. She outlines how large language models have been getting bigger and more energy-hungry with each leap.

The thing that’s really getting bigger with each new generation of LLMs is their number of parameters — basically the number of connections inside the model that allow it to learn patterns. Luccioni writes that in 2018, the largest language model had around 100 million parameters. Today, Google’s PaLM has more than 500 billion parameters.
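For a concrete picture of what a “parameter” is, here is a minimal PyTorch sketch that counts the trainable weights in a toy two-layer block. The layer sizes are arbitrary, chosen only for illustration; a model like PaLM stacks many such blocks until the total reaches the hundreds of billions.

```python
import torch.nn as nn

# A toy two-layer block; every weight and bias entry counts as one "parameter."
model = nn.Sequential(
    nn.Linear(768, 3072),   # 768 * 3072 weights + 3072 biases
    nn.ReLU(),
    nn.Linear(3072, 768),   # 3072 * 768 weights + 768 biases
)

n_params = sum(p.numel() for p in model.parameters())
print(f"{n_params:,} parameters")  # about 4.7 million for this small block
```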

And PaLM is just the biggest model whose size we can actually evaluate. OpenAI has been not-so-open about the number of parameters in its newest model, GPT-4. It could have hundreds of billions or even trillions of parameters. Or, as Luccioni told Bloomberg last month, it could just be “three raccoons in a trench coat.”
