Everything we know about the climate costs of AI comes from voluntary disclosures by the biggest tech companies. In their annual environmental sustainability reports, many of them selectively disclose electricity and water usage.
The numbers are stark. In the past five years, Google’s electricity consumption has increased 186 percent, Microsoft’s has increased 186 percent, and Meta’s has increased 367 percent. Microsoft’s water consumption rose 34 percent in 2022 over the prior year, and Google’s rose 22 percent.
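To translate those percentages into multipliers: an increase of p percent means consumption is 1 + p/100 times what it was, so Google’s and Microsoft’s electricity use nearly tripled and Meta’s more than quadrupled. A minimal sketch of that arithmetic, using the figures above:

```python
# Convert reported percent increases into "times larger" multipliers.
increases = {"Google": 186, "Microsoft": 186, "Meta": 367}  # five-year percent increases

for company, pct in increases.items():
    multiplier = 1 + pct / 100
    print(f"{company}: +{pct}% -> {multiplier:.2f}x earlier consumption")

# Google: +186% -> 2.86x earlier consumption
# Microsoft: +186% -> 2.86x earlier consumption
# Meta: +367% -> 4.67x earlier consumption
```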
In other words, the costs of AI to our shared resources are rising. But as Proof reporter Aaron Gordon points out in my interview with him in the Proof Ingredients video series, those costs are only the ones we know about. Many companies, such as Amazon, don’t report as much data about their impact on the environment. And there are third-party AI data centers that are much less energy and water efficient than those run by the Big Tech giants who care about their environmental reputation.
“The bleak picture I'm painting here is probably the rosiest part of an even bigger, bleaker picture,” Gordon told me. In our conversation, he talks about what we know – and what we still don’t know – about the climate impacts of AI.
A transcript of our conversation, edited for brevity and clarity, is below:
---
Angwin: How does AI affect the climate? When we think about the things we do in our daily lives that contribute to climate change, we think about driving cars, flying in planes, eating red meat. You may not think about using generative AI, but it turns out that programs like ChatGPT use a ton of electricity and water and contribute to carbon emissions. How much exactly is hard to know, because the big tech companies don't tell us a lot of details.
So at Proof News, we decided to try to find out. Reporter Aaron Gordon spent six months working on a three-part series about the electricity, energy, and water demands of AI. In this video series, Proof Ingredients, I talked to Aaron about what went into his investigation, what he found out, and what he wasn't able to find out.
Aaron, welcome and thanks for joining.
Gordon: Of course. Thanks so much for having me, Julia.
Angwin: So first, let's just talk about you. You, as a reporter, have been covering a lot of different issues. What was your history with writing about tech and climate? And what did you come into this thinking about?
Gordon: Sure, so for about five years before I started at Proof, I had been covering transportation, technology, climate, and infrastructure in various ways for a couple of different publications, most notably Motherboard, which was Vice's tech and science website. And covering climate and tech is always kind of a tricky thing, because tech simultaneously likes to talk about itself as one of the most climate-friendly industries in the world and as part of the solution. So any time you try to write about the costs of what they're doing, you run up against this kind of doublespeak. One of the hardest, and most fun, and most challenging parts of covering climate in tech is sorting through that doublespeak. And that's definitely something I was dealing with covering AI, climate, and technology for Proof.
Angwin: Yeah. One thing we should dispel right away is that there is actually a whole genre of propaganda, literature, whatever you want to call it, about how AI is going to solve climate change. This is all basically what I would call speculative fiction, and I think it's best to put it aside. We were not actually examining those claims.
Gordon: I'm really glad you mentioned that, because when I first started looking at the academic literature on what we actually know about artificial intelligence's climate impact, by far the bulk of the scientific literature is filled with these speculative studies, in many cases funded or produced by big tech researchers. Which isn't that nefarious, right? They're researchers; they publish things. That's not necessarily the problem. The problem, to me at least, is that these studies are all based on conjecture, on hypotheticals of what may be able to be accomplished. But they create this body of research that leads the scientific community to default to the assumption that this will be good for the climate, based on virtually no evidence.
Meanwhile, the actual, demonstrable impacts of artificial intelligence on the environment have already started happening. They're not speculative, not projections of the future. There's much, much less research on that impact, and it's really worth looking at. Right now, we're having this boom in tech where all the big tech giants are creating new models every six months. This new generation of AI is generative AI — the chatbots like ChatGPT, Claude, and Llama — which consumers may or may not actually be using themselves but have certainly heard about. And what we're seeing is that it is actually expensive in climate terms.
Angwin: So let's break it down. You looked into electricity, energy, and water demand. Why don't we start with electricity? What do we know about the electricity demands of generative AI?
Gordon: Sure. So, what we know — it's kind of hard because my brain almost immediately jumps to what we don't know, but I'll try to focus on what we do know first, which is that it takes a lot of electricity to run the servers these technologies rely on. I think a lot of the language we use around artificial intelligence is industry language meant to promote this idea that it's all very strange and unknowable technology, and only the experts can really talk about it intelligently. So I just want to back up a little and make clear what we're actually talking about: large computers in giant warehouses, around the country and around the world.
We're just talking about big computers when we talk about their energy use. There's a lot of mystification about what artificial intelligence is. But for the purposes of this conversation, we can skip that, because whatever programs they're running, right, they're on these particular types of chips called GPUs, which are very energy intensive. If you're a gamer, they're very similar to the chips that you've been using for years, but just basically more advanced versions of them. They get stuffed into servers, and then there are hundreds or thousands of servers in this warehouse. It requires a lot of electricity to run all these servers, and the servers get so hot that they require cooling. And all of this is very energy and resource intensive. When we get into the water part, we'll talk about the cooling a bit more.
But to power all of these servers and all these warehouses around the country, they have to draw from the energy grid. Now, the biggest tech companies like Google and Meta like to give this impression that they're somehow powering their computer warehouses essentially entirely with green energy. But that's just not true. And if you read their own documentation on it very carefully, they do admit how energy intensive it is.
You can compare it a bunch of different ways. You can try to compare it to the average US household or the average city or state. We saw a rash of studies come out last year that tried to compare projected AI energy use to the country of Ireland or whatever, and I don't think that was a particularly good comparison, for a lot of different reasons. But what we do know is that data center warehouses by and large — which include a lot of things other than artificial intelligence — are becoming an increasingly large share of global electricity use, roughly on par in terms of emissions with the airline industry, which is also getting a lot of attention around how to decarbonize air transportation. Both of them, I think it's worth noting, are a relatively small percentage of the global total. We're not talking about a giant slice of the pie; we're talking about a couple of percentage points of global electricity use. But that's a lot of electricity.
So these are important questions, and the reason people are so concerned is that these facilities are projected to grow a lot in the next decade. People who are concerned about the environmental cost of airlines or AI want to restrain that growth before it gets out of control, before we build so many of these warehouses running off dirty energy that it becomes a clock you can't roll back. I think a good analogy is what happened with the crypto industry, which grew so fast, taking over old facilities and building so many new warehouses, that it happened before people even realized what was going on. And now it's a significant chunk of US electricity use as well.
Angwin: So, here's a question I have for you. The companies don't make it easy to find out, actually, how much of their electricity use is being used for AI. Do we know specifically, or do we just generally know Google's electricity use has skyrocketed?
Gordon: Great question. No, they don't break out AI or generative artificial intelligence electricity use from the companies as a whole. You just have to look at the company's overall electricity use. Some companies break it down by data center. This is something Meta, and I believe Google, do as well: you can look specifically at how much electricity each data center is using on a year-over-year basis. And this provides a decent proxy for AI electricity use, which I was initially skeptical of before I started reporting. But what I quickly saw is that the rapid increase in these companies' electricity use coincides with their expansion into artificial intelligence.
It became very clear, very quickly that, around the same time these companies started investing very, very heavily in artificial intelligence was also when their electricity use skyrocketed. And I'm sure there are other things going on there too. But experts broadly agree that yes, it's AI that's driving this electricity use. So, it's a yes and no type of thing. No, companies don't specifically say how much of their electricity use goes to artificial intelligence. But we can make reasonable assumptions to conclude that most of their electricity use is from AI.
But that's not true of all companies that are heavily invested in AI, or of all tech companies. Amazon, for example, discloses almost nothing about its electricity use. It only started reporting its electricity use at all as a company in 2022. At the time I was reporting this, because of the lag in how companies report their environmental goals, there was only that one data point, which wasn't helpful at all. And Amazon as a company does so much more in terms of its lines of business than Google and Meta do, and you can't break it out by lines of business if they don't tell you. So that made it almost impossible to say anything useful about them. It does vary by company as well.
Angwin: It's worth pointing out that you're working off of voluntary disclosures, right? There's no law that requires them to disclose this. These are the environmental sustainability reports that, I believe, they just do because they want to look good to the world.
Gordon: When Google first started doing environmental sustainability reports, and I think Apple too, a long time ago, they were essentially just glorified press releases. They were nice glossy reports. They had almost no useful information in them. They would report a couple of random data points, but it was mostly promotional material. ‘Here's how good we are for the environment’ type of thing. And look, if companies are doing good things for the environment, they should tell people about it. That's fine. But the problem is, as you say, there's no disclosure requirements around this stuff. So independent researchers have to essentially work through these promotional materials to try and get useful information.
And that's where you start running up against a lot of challenges as a researcher or reporter, because companies will generally only disclose information they want the public to know. And so if they have bad news to share, it's very rare that they're going to be very transparent about bad news. They might put it in there, they might bury it. They might massage the numbers to make it look a little different. They might change the metric to one that makes them look better or isn't as obviously bad. You know, these are things that I dealt with, especially in water use where there's a lot —
Let me back up. Electricity use actually does have some international standards, at least on the units companies have to report it in, which does make it easier to compare across companies. But for water use, there's absolutely no standard around even the units they're supposed to report in. Some use metric tons, others use billions of gallons, even just across the U.S. That little bit of voluntary standardization doesn't exist in some very important areas.
Angwin: Wow, so essentially you're doing forensics on these voluntary disclosures. And then you've also looked at the research that independent researchers have done. It seems like it's not that easy for independent researchers to do this work either, because they can't get into the data centers and measure anything. What did you find there that was the most reliable to work from?
Gordon: Yeah, independent researchers run into the same problems we do as reporters. They don't have some secret back channel to technology companies, like, ‘Hey, can I see your electricity bill?’ Generally they're working from the same stuff we are. There are exceptions, obviously; researchers sometimes form partnerships with companies. But I didn't see anything like that in terms of overall electricity use or energy efficiency that I thought was particularly useful for the investigations we were doing.
The one exception, I will say, is actually pretty noteworthy, because it highlights the evolution, or almost devolution, within AI research: before the fall of 2022, this kind of disclosure was in all the publications.
The recent generation of AI research, the large language models and that type of thing we're seeing proliferate now, started around 2017 or 2018, when it really started blowing up. And that's when we see the energy use skyrocket for these companies, too. At that time, a very common thing researchers did was measure the electricity requirements, essentially how much electricity they consumed in training their models. When they published research on what these models could do, they would often include how much electricity was used in training the model as part of that scientific research.
This was incredibly useful for looking at how much electricity it takes to create these models. As we talked about in the videos we published for Proof, training is only a small part of the electricity consumed by AI models, but it tells the story of how, initially, disclosure on the environmental front was really baked into the scientific research. And I don't think they were doing this for environmentally conscious reasons. They weren't doing it because they were super concerned about the environment or whatever, although maybe some of them were. The fundamental reason they were doing it is that it's expensive to use all of that electricity.
And so creating a more efficient model that could be trained using less electricity was better business, and it was a more impressive result if you could get the same quality model for less electricity usage. But when it became clear that AI was going to be this multi-billion, multi-trillion dollar industry, anything like that was considered essentially giving away trade secrets, and they stopped doing it.
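[Editor's note: the training-energy figures early papers reported were typically back-of-envelope products of GPU count, per-GPU power draw, training time, and facility overhead. A minimal sketch of that estimate; every input below is an illustrative assumption, not any specific model's figure.]

```python
# Back-of-envelope training energy: GPUs x per-GPU power x hours x overhead (PUE).
# All inputs are illustrative assumptions, not any specific model's figures.
num_gpus = 1_000        # assumed size of the training cluster
gpu_power_kw = 0.4      # assumed average draw per GPU (400 W)
training_hours = 720    # assumed roughly one month of training
pue = 1.1               # assumed facility overhead (power usage effectiveness)

energy_mwh = num_gpus * gpu_power_kw * training_hours * pue / 1_000
print(f"~{energy_mwh:,.0f} MWh consumed in training")  # ~317 MWh with these inputs
```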
There was actually a very noteworthy paper from Google researchers, I think in 2021 if I'm remembering correctly, that essentially argued it is necessary for AI researchers to disclose the environmental costs of their models, especially of training and using them. One thing that would be very easy for researchers to do is simply be transparent about how much electricity their models use when they publish their research. They said in the paper, essentially, that it's the lowest bar they could clear and there's no reason not to do it. And then they haven't published any such information since.
Angwin: Somebody higher up in the company must have gotten ahold of that paper and said, ‘No, I don't think so.’
Gordon: I think that was just a different time in AI research, when it was not yet this global arms race that it's become now. There was a lot more openness, too. I think this is something you've probably run into a lot in your time as a reporter, where there's almost a pre and post phase of any technological research. The pre phase is when it's just a bunch of nerds tooling around who really want to build something. They very much buy into this idea of a scientific community furthering human progress or whatever, and they consider the exchange of ideas a key part of that process. They're real believers in the scientific process and the scientific method.
And that was that pre-2022 era of AI research, to an extent. Then 2022 hit and everyone buttoned up, because all of the senior executives started to realize how much money was at stake. And all the publishing on this stuff essentially stopped. You can find papers written by OpenAI that disclose all these benchmarks about GPT-3 and GPT-2 and GPT-1, but nothing on 3.5 or 4, because those came out after the big boom.
Angwin: That's one of the big challenges for really assessing this – getting any sort of real data. It sounds like you had the same challenge with water usage. What were you able to find out and how and then what were you still missing?
Gordon: Yeah. So with water you run into a lot of the same challenges as with electricity, plus the added challenge that there's even less standardization around norms and reporting. Companies also haven't been reporting water usage for data centers for as long as they have electricity; concern about water usage is a bit newer.
To back up and just explain why water is important here: I mentioned earlier that you're running thousands of these servers in a warehouse, and they generate a ton of heat. It gets really hot, and the company needs to cool the environment, to cool the servers so that they can keep running and don't overheat. And there are various different methods of cooling servers.
One of them, and the most commonly used one, is to basically pipe water through and around the servers, and that cools them down. And you can reuse water. The water generally comes from the local water supply. It can cycle through — it depends on the facility, but usually around four times — before the water itself gets too hot and builds up minerals. Then in most cases it gets put back out into the water supply, which means back into the local stream or river, or piped back through the same conduits, usually on to a water treatment plant. That's how it works in most cases.
So why is water usage important? Well, with climate change, water use is becoming a much bigger concern in many places around the world, and using water in places with droughts is really bad. These facilities use so much water that it can have a noticeable impact on the available water supply in a local area. Even on a short-term basis, if a watershed is experiencing a water shortage, having a major data center — or many data centers — can have a noticeable impact on the local water supply.
And then obviously, in the long run, as many places where people live experience water shortages, it raises the question: Is this the best use of the scarce water supply that we have? I think a lot of people would argue that it probably isn't.
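[Editor's note: the data center industry's standard metric here is WUE, water usage effectiveness, measured in liters of water per kilowatt-hour of IT energy. A rough sketch of how facility water draw scales; every input is an illustrative assumption, not any company's disclosure.]

```python
# Rough scale of a facility's annual water draw from WUE (liters per kWh).
# All inputs are illustrative assumptions, not any company's disclosures.
it_load_mw = 30        # assumed IT load of a mid-size facility, in megawatts
hours_per_year = 8_760
wue_l_per_kwh = 1.8    # assumed WUE; efficient sites report well under 1.0

annual_kwh = it_load_mw * 1_000 * hours_per_year
annual_liters = annual_kwh * wue_l_per_kwh
print(f"~{annual_liters / 1e6:,.0f} million liters per year")  # ~473 million here
```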
Some companies report on this better than others. Meta is actually one of the most transparent in terms of its water usage. Google is also fairly transparent in its water usage. Amazon reports almost nothing. Fortunately, there has been some independent research on this subject that made it easier to do this research than if I was having to do it all myself.
There's a researcher and a research team out of UC Riverside that has been looking at the water usage of data centers generally for more than a decade. They've published some excellent research, both on AI and on data centers more broadly. Because before AI became such a hot thing, data centers still existed, obviously, but they used less power-hungry chips called CPUs. You know, if you use a computer, it has a CPU. And these are just generally less energy intensive.
So they weren't as big of a drain on the electricity supply or the water supply as AI data centers have turned out to be. And so, through a lot of their research, I was able to get some really useful data on how water usage has increased, even more than electricity usage, through this AI era. That was a huge help.
Angwin: We have an interview with one of those researchers that will also be running on this channel. So anyone who's watching, stay tuned for that. [The interview is now up on our YouTube channel]
I do want to mention this sort of famous stat that people might have heard: every time you do a query in AI, it's like pouring out a bottle of water. That stat, which I think is actually not per query but per 10 to 50 queries, comes from that UC Riverside paper. And we'll be diving into it with Dr. Ren.
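[Editor's note: taking the stat as one standard ~500 mL bottle per 10 to 50 queries, the per-query figure works out to roughly 10 to 50 milliliters. The arithmetic:]

```python
# Per-query water implied by "one ~500 mL bottle per 10-50 queries."
bottle_ml = 500
for queries in (10, 50):
    print(f"{queries} queries per bottle -> ~{bottle_ml / queries:.0f} mL per query")

# 10 queries per bottle -> ~50 mL per query
# 50 queries per bottle -> ~10 mL per query
```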
Gordon: That stat was actually the fundamental question at the heart of the research I was doing into water. I wanted to find out if it was true, because I had heard it a lot. And it's not that I didn't believe it so much as I thought it was a pretty fantastic claim, and I really wanted to know if it was true or not. And like so much else, this was kind of a challenge.
With any type of climate reporting, the details really matter, and you can get a wide range of results depending on what exactly you're looking at. What type of facility are you talking about? What type of computer chip? What kind of cooling mechanism does it use? How efficient is the data center's electricity use? What types of servers is it using? All these questions affect the bigger question of how much electricity AI is using.
With the water bottle question, it was certainly true for some facilities like the least efficient facilities, but it's not true for the most efficient facilities. And so how do you present an answer that gets at this nuance without being so bogged down in details that you just lose people entirely?
Angwin: The problem is, who knows? When you're using AI, you don't know what type of data center your query is being sent to. So there's no way for anyone to make an informed decision like, ‘Oh, I'm using one of the more efficient data centers, so I don't need to feel bad about my energy choices here.’
Gordon: Yeah, I even get the feeling that a lot of people at the big tech companies who are working on AI don't even know. If you're training a model, you certainly know where it's being trained, which data centers you're using. But if I put in a query to Bard or OpenAI or whatever, they have many different data centers, and I think it would actually take a significant amount of work to figure out where any individual query is being processed. I don't think that's how the system architecture is designed to work.
And it gives the lie to this idea of ethical energy consumption within these system architectures, as if there's some consumer responsibility. We have one toggle, essentially, which is to use it or not to use it. We can't make any other decisions about how efficient we want to be when using these systems. Living in a capitalist society, I think we've all been taught that the onus gets put on individuals to do things efficiently, to be environmentally responsible with our own decisions, and that if enough of us make the environmentally responsible decision, it will move society in an environmentally responsible direction.
This is a good example of when that's not possible. I can certainly not use these systems; that's fine. But a lot of the time I don't even get to choose whether to use them or not. Gmail, for example, is consistently putting more of these AI-based processes into the basic functions of its email. I could move my email off Gmail, but at some point these consumer choices just get ridiculous. The fundamental fact is we have no switch, no toggle.
Angwin: Of course the companies are going to say that they are basically buying their way out of this, by buying carbon credits and offsets and water offsets, right? You also looked into this whole system of what they are doing to try to remediate their usage and to move to green energy. How much are they able to do? Is that actually achievable?
Gordon: There's been a transition in the way these companies think about it as they have consistently missed targets. It's been a classic case of moving goalposts. Ten and more years ago, these companies talked constantly about carbon offsets. Google famously says it's been carbon neutral since 2008 or something like that. What they mean by that is they have purchased enough carbon credits, basically paying for something good for the environment somewhere else to offset the bad things their own operations do to the environment, and they say they've been evening this out since 2008.
Now, there's a whole body of research out there, from long before Proof News existed, showing that these carbon credits are generally not effective. And the evidence is so overwhelming at this point that even the organizations that sell carbon credits have had to take drastic measures to rethink how they sell them in a way that doesn't completely devalue the concept, because everybody knows they're useless. One famous example: the Massachusetts Audubon Society sold carbon credits for a forest, basically promising not to cut down woodlands that it owned. When this was uncovered, people asked the very basic question: in what universe would the Audubon Society cut down a forest? Obviously they were never going to. That makes the whole concept of the carbon credit worthless, because it's supposed to be saving something that was going to be cut down or building something that otherwise wouldn't be built. And this just happens over and over and over in the carbon offset markets.
So the companies evolved their approach. Google's goal, starting in 2019 I believe, was to have what it called 24/7 carbon-free energy. Basically, they wanted to build enough green energy, like wind farms and solar farms, on the same grids where their data centers were housed, so that the entire time their data centers were running, they were being powered entirely by green energy the company had built. So there's none of this offset situation, none of this buying credits. It's literally building green energy that goes onto the same grid you're using, which is good. That was a good change, definitely better than offsets.
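[Editor's note: to make the distinction concrete, under annual offsetting any carbon-free megawatt-hour bought at any point in the year counts, while under 24/7 matching only carbon-free energy available in the same hour as consumption counts. A toy sketch with made-up numbers; Google's actual methodology is more involved.]

```python
# Toy comparison of annual vs. 24/7 (hourly) carbon-free energy matching.
# All numbers are made up for illustration.
consumption = [100, 100, 100, 100, 100, 100]  # MWh consumed in six hours
cfe_supply  = [  0,  50, 200, 250,  60,   0]  # carbon-free MWh available each hour

# Annual-style matching: total carbon-free energy vs. total consumption.
annual = min(sum(cfe_supply) / sum(consumption), 1.0)

# 24/7-style matching: only energy available in the SAME hour counts.
hourly = sum(min(c, s) for c, s in zip(consumption, cfe_supply)) / sum(consumption)

print(f"annual matching: {annual:.0%}")  # 93% -- looks nearly done
print(f"24/7 matching:   {hourly:.0%}")  # 52% -- the stricter hourly number
```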
The problem was that at the same time, their energy usage was drastically increasing because of their AI ambitions. They were building so many new data centers, using so much electricity, that they couldn't build enough green energy to keep up. I think 2020 or 2021 was the first year they started publishing this metric of what percent of the time their data centers run on carbon-free energy, and the number was right around 60%. They made a few percentage points of progress, until they expanded their data centers so much that now they're going backwards. It's basically about where they started five years ago, and they had a goal of being at 100% carbon-free energy by 2030.
And we're about halfway through that ten-year span in which they were trying to accomplish that, and they've made essentially no progress.
Angwin: Oh my gosh. That doesn't give us a lot of hope that they're going to meet that goal.
Gordon: It seems pretty unlikely. And this is why you hear them talking so much more about moonshots in carbon-free energy, like small-scale nuclear reactors or nuclear fusion. They can't think in terms of the scale of existing technology like wind farms and solar farms, because they're building so much new energy demand that they won't be able to meet it with reasonable existing technology. They have to bet on something completely new and novel that almost certainly won't exist by 2030.
Angwin: So what do you feel like we still need to know to understand this better? It sounds like you're painting a pretty bleak picture already: skyrocketing electricity and water usage, and companies struggling to mitigate those through green energy sources. But what might we not even know, given that we have such limited visibility into this? We're relying on what, a few academic papers, maybe a couple dozen, and the industry's own reports.
Gordon: One of the biggest things that concerns me is that this is only a tiny picture of overall AI energy usage. When these companies report on their data center energy usage, they're talking about the data centers that they own. But there's a whole other category of data centers, called co-location facilities, which are built by third-party companies that then rent out the rack space and servers to other companies to use. Google's cloud service, Microsoft's cloud service, and Amazon's cloud service are essentially versions of this. But there are lots of other independent companies that do the same thing, and sometimes the big companies rent rack space from them if they don't want to go through the expense of building a new facility, because they don't think they'll need the space for that long, or for other reasons.
These companies almost never report their electricity usage in environmental reports or anything like that. And they tend to be much less efficient than the big technology companies' data centers, because it's very expensive up front to build an efficient data center. It's much, much cheaper to build an inefficient one. Anyone who's ever gone through the process of building a house has probably experienced this: the upfront costs of making a house very energy efficient are higher than those of just putting up a structure that leaks heat and cool air.
It's the same with data centers, and we have almost no visibility into the environmental cost of those facilities. There's been some reporting on it, but it relies on a lot of third-party data that there's reason to think isn't complete or isn't capturing the whole picture. A lot of what we can say about the data center industry as a whole is just estimates. So I think that's one of the biggest things: what we're reporting on here is just a small slice of what's going on overall, and it provides a window into the most efficient slice. So even this bleak picture I'm painting is probably the rosiest part of an even bigger, bleaker picture.
Angwin: Amazing. And we haven't even talked about the manufacturing of the chips, which is its own bleak picture. Give us a short window into that.
Gordon: Almost all of the chips manufactured for the AI industry are made by one company in Taiwan called TSMC. And they report almost nothing about the environmental cost of all their chip manufacturing. All we know is that the environmental costs are huge, because the company is such a large portion of Taiwan's environmental impact that we can look at Taiwan's environmental data over the last ten years and see the impact of TSMC's chip manufacturing in that national data.
Angwin: This is all very disturbing. Thank you very much. I do want to talk a little bit about the fact that this is a unique type of investigation, in the sense that it was essentially an analysis of literature and academic papers and interviews with experts. There's no way for us as journalists to measure this ourselves, so we don't have firsthand data, and that makes covering climate really challenging in general. So in your dream world, what would we have access to in order to get a better handle on this?
Gordon: Before I worked for Proof, I covered a lot of things in the transportation industry, including the automotive industry. I did a lot of work on the environmental impact of vehicles, whether they were electric or gas. And there are lots of different types of reporting requirements for vehicles. If you go to buy a car, you can see how many miles per gallon it gets. You can see the kilowatt-hours per mile it is expected to get. Now, obviously, sometimes companies cheat on these standards, and that becomes a huge scandal. But the very idea that they have to cheat in order to get around requirements is a completely foreign concept to everything we've been talking about for the last 40 minutes, because these companies have no requirement to do any of this. So instead of cheating, they can just not say anything, or they can say things the way they want to say them, as opposed to in a standardized way.
So what I would love to see is something more like what we see in the automotive sector: energy efficiency requirements for new data centers, and a requirement that data centers report their electricity consumption, per hour, per day, whatever makes sense, at the end of the year.
I think there should also be reporting requirements on the electricity needed to train AI models and the electricity needed to use them. I don't think energy usage is a trade secret. In the age of a climate crisis that we're all invested in, it's a matter of public interest, and the public has a right to know where our resources are going. Right now we just can't answer those questions. I think no company can claim to be environmentally responsible if it's not being transparent about its electricity usage.
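[Editor's note: as a thought experiment, a standardized disclosure along the lines Gordon describes might look like the record below. The field names and units are invented for illustration; no such reporting standard exists today.]

```python
# Hypothetical standardized data center disclosure record (illustrative only).
from dataclasses import dataclass

@dataclass
class DataCenterDisclosure:
    facility_id: str
    year: int
    electricity_kwh: float       # total annual electricity consumption
    water_liters: float          # total annual water withdrawal, one shared unit
    cfe_hourly_match_pct: float  # share of hours matched by carbon-free energy
    training_kwh: float          # electricity used for AI model training
    inference_kwh: float         # electricity used to serve AI queries
```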
Angwin: Well, from your lips to God's ears; I would love to see that too. I know there are some people pushing for some type of Energy Star-style rating for AI, so fingers crossed that maybe we get some transparency. In the meantime, we have to rely on people like you. So thank you, Aaron, for your diligent reporting and for joining us on Proof Ingredients.
Gordon: Thanks, Julia.