Bigger, newer AI models come with environmental impacts
Jun 13, 2024

Emma Strubell of Carnegie Mellon University explains why carbon emissions increase with more AI data centers and more powerful AI features.

Back in 2020, Microsoft made an ambitious pledge to go carbon negative by 2030. But the plan to erase its carbon footprint and then some is encountering some headwinds, according to its latest sustainability report.

It showed Microsoft’s carbon emissions have increased by 30% since it made that pledge four years ago, as it acquires more semiconductors and builds more data centers to power generative artificial intelligence. The report came as a reminder of the significant environmental cost of the AI boom.

Just how significant? Marketplace’s Meghan McCarty Carino asked Emma Strubell, a computer science professor at Carnegie Mellon University. She co-wrote a paper about the specific energy demands for common uses of this technology. The following is an edited transcript of their conversation.

Emma Strubell: We looked at, for example, how much energy it takes to generate an image using one of these generative AI models, and compared that to how much energy it takes to generate a chat response if you’re chatting with ChatGPT. We found that generating an image can take as much energy as potentially fully charging a smartphone. Generating text, that chat response, uses a little bit less energy: closer to 16% of a charge.

Meghan McCarty Carino: Are newer models using more energy than they have in the past?

Strubell: Yes, though it’s sort of hard to say precisely. In this study, we looked at the state-of-the-art generative AI models, but we also looked at some of the previous generation of models, and there’s a huge difference. The newer models use multiple times more energy, both to build and to use.

McCarty Carino: And why is that?

Strubell: Basically, the best way that we know how to continue adding these new and exciting capabilities to the models is simply just to make them larger. And so that means that they require a lot more computation, which corresponds to energy use.

McCarty Carino: So it’s not just computation, right? What else is kind of involved in the deployment of models?

Strubell: There’s a lot more that contributes to the overall carbon footprint and, more broadly, the environmental impacts of these models. In terms of the broader environmental impacts, water use for cooling the data centers and manufacturing the hardware is a big issue, especially because these things often happen in communities that have limited water availability. Mining rare earth minerals is also required to manufacture the specialized hardware that makes this new generation of technology possible. One other aspect I’d like to mention is the specific applications the technology is being used for. How is it actually being used in the world, and what are the downstream changes and emissions resulting from that? AI is a tool, and like all tools it has dual uses. In the context of the environment, this means it can be used in ways that might have an overall net benefit to the environment or in ways that might be more harmful.

McCarty Carino: How can it be used in ways that would have a net benefit?

Strubell: One area that I actually work on, and I’m excited about, is understanding how we can use AI to accelerate discoveries in materials science. So these are going to be innovations that are required to build the next generation of batteries and solar cells and things like that, that are required for scaling up renewable energy, for example. AI is also really promising for solving some challenges of optimization in the electrical grid, which is also a barrier to scaling up renewable energy sources.

McCarty Carino: Microsoft was very ambitious among tech companies in setting this goal to be carbon negative by 2030. But they definitely were not alone in the tech industry in kind of talking up commitments to sustainability goals. How much does the AI boom threaten that?

Strubell: It doesn’t necessarily threaten those promises, but not for the reasons you might think. I definitely think that AI energy use is already, and is going to continue to be, a significant portion of these companies’ carbon emissions. But the zero-carbon pledges are largely supported by purchasing carbon offsets rather than actually switching to renewable energy or otherwise reducing the effective carbon footprint.

McCarty Carino: Are there developments that could make training and running generative AI more efficient?

Strubell: Definitely. That’s an area that I work on, that I’m excited about. So both on the hardware side and the software side, there’s a lot of things that we can do and are already doing to do that. I will say that I think that’s not going to be sufficient for solving this problem.

McCarty Carino: Right, because we keep hearing about the AI processors becoming more efficient. Nvidia’s new Blackwell superchips, for instance, are supposedly 25 times more energy efficient than their predecessors. Are these advances not enough?

Strubell: It’s my opinion that they’re not, due to something called rebound effects in economics. The idea, and we’ve already been seeing this and I think we’ll continue to see it, is that as you reduce the cost of a technology, or as you improve its efficiency, you basically make it more widely available for use. And so the use is just going to increase in proportion to the increase in efficiency, essentially.
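The rebound dynamic Strubell describes can be sketched as a back-of-envelope calculation. The numbers below (a 25x efficiency gain, a hypothetical 50x growth in usage) are illustrative assumptions for the sketch, not figures from the interview:

```python
# Rebound (Jevons) effect sketch: a large efficiency gain can be
# outpaced by growth in usage, so total energy still rises.

energy_per_query = 1.0    # arbitrary baseline energy units per query
queries = 1_000_000       # hypothetical baseline query volume

new_energy_per_query = energy_per_query / 25  # assumed 25x more efficient hardware
new_queries = queries * 50                    # assumed usage grows 50x as cost falls

old_total = energy_per_query * queries
new_total = new_energy_per_query * new_queries

# Ratio ~2: total energy roughly doubles despite the efficiency gain.
print(new_total / old_total)
```

The point of the sketch is only that the ratio of usage growth to efficiency gain, not the efficiency gain alone, determines whether total energy use falls.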

McCarty Carino: Put this all into context for me, because when you say the amount of energy needed to charge your iPhone, that doesn’t necessarily sound like so much on its own. But what’s the big picture here?

Strubell: That number is the amount required to generate a single image. With one of these models, especially as they continue to get more powerful and have more interesting and exciting applications, we’re going to be generating images really frequently. If you imagine each person has a single smartphone that they charge once a day, well, how often is each person going to be generating an image or sending a message to a chat model, especially as these systems become increasingly integrated into our daily lives and our work? So I think that’s actually pretty substantial. And it’s only going to increase.
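To give a sense of scale, the per-query comparisons above can be turned into a rough aggregate. Everything in this sketch is a hypothetical assumption: the smartphone-charge energy (~0.012 kWh) and the daily query volumes are illustrative placeholders, not measured numbers:

```python
# Back-of-envelope aggregate energy from per-query figures.
# All values below are illustrative assumptions, not data from the paper.

SMARTPHONE_CHARGE_KWH = 0.012  # assumed energy for one full smartphone charge

energy_per_image_kwh = 1.00 * SMARTPHONE_CHARGE_KWH  # image gen: ~1 full charge
energy_per_chat_kwh = 0.16 * SMARTPHONE_CHARGE_KWH   # chat reply: ~16% of a charge

daily_images = 100_000_000    # hypothetical: 100M image generations per day
daily_chats = 1_000_000_000   # hypothetical: 1B chat replies per day

daily_kwh = daily_images * energy_per_image_kwh + daily_chats * energy_per_chat_kwh
annual_mwh = daily_kwh * 365 / 1000

print(f"Daily: {daily_kwh:,.0f} kWh; Annual: {annual_mwh:,.0f} MWh")
```

Even with small per-query costs, multiplying by plausible usage volumes quickly reaches the scale of a power plant's output, which is the "big picture" point being made.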

More on this

We talked about the big role of data centers, which now account for a growing share of energy consumption, not to mention the water needed to keep them cool. A recent report from the industry nonprofit the Electric Power Research Institute found that the energy used by data centers could double by the end of the decade, consuming almost 10% of electricity in the United States.

A Reuters article about the paper quotes a figure from several energy companies that a new data center can require as much electricity as 750,000 homes. That’s probably why we’ve seen a spate of Wall Street types recently talking up investment in utility and energy markets as a way to cash in on the AI boom if Nvidia stock is a little out of reach.


The team

Daisy Palacios Senior Producer
Daniel Shin Producer
Jesús Alvarado Associate Producer
Rosie Hughes Assistant Producer