So I’m opening up ChatGPT on my laptop. And it’s asking me: "What can I help with?" I’m going to type: "How much water does ChatGPT use?"
It says it’s thinking, it’s searching the web. Here’s what it says.
"For each ChatGPT interaction, estimates suggest that it could use around 500 millilitres of water."
Oh wow, so I basically just poured away the equivalent of a small bottle of water to find out the answer to that one question. And if I was to use ChatGPT just once a week, for a whole year, that would be like pouring away 26 litres of water - that’s the same as filling a standard kitchen sink right to the top, and then pulling the plug.
Artificial intelligence is transforming the world around us: offering increased productivity and promising to help tackle difficult problems like global warming.
But when we’re being warned that climate change is exacerbating water scarcity and told that we all need to reduce our carbon footprint, the rapid growth of AI poses a big challenge to the environment: it’s an AI sustainability paradox.
Welcome to LSE iQ, the podcast where we ask social scientists and other experts one intelligent question. I’m Anna Bevan from the iQ team. We work with academics to bring you their latest research and ideas – and talk to people affected by the issues we explore.
In this episode, I’m asking: Is AI destroying the planet?
I find out just how much water data centres use, discuss what an Irish goddess has to do with energy conservation, and learn how AI can help catch poachers in the Serengeti.
But first, I travel to Slough, about 20 miles outside of Central London, to find out exactly how data centres work.
Train announcement: ‘The next station is Slough’.
Anna: So here I am, in Slough. It’s not the most glamorous of locations. Once famed for providing the backdrop to Ricky Gervais’ sitcom ‘The Office’, it’s now Europe’s largest data centre hub. These are the hubs that power huge chunks of the internet. So, there’s a good chance that pretty much everything on your phone or computer is stored here on Slough Trading Estate.
Raj: So we have the big kind of cloud providers with us: so your Googles and your Microsofts and your Amazons.
Anna: That’s Raj Ubhi – he works at Equinix – one of the biggest data centre companies in the world. It has 268 data centres across the globe. And 97% of all internet traffic flows through one of its data centres, so it’s huge. Raj is giving me a tour of one of the buildings on Equinix’s Slough campus, called LD6.
Anna: So what type of connections are happening right now?
Raj: Everything and anything you can think of. So, you know, even simple stuff like your internet connection - that could possibly run through here, most likely does; mine does. I only live around the corner, so mine definitely does. All the way up to how Teslas connect to their cars, how your fridge connects - if you've got one of those fancy ones - how your TV works, your banks, anything you can think of. Ministry of Defence, you know, it's all running through data centres like ours, and we're the biggest, so we have the most people coming through our data centres.
Anna: How come you’re based in Slough?
Raj: So LD4 is our longest standing data centre here, and one of the main reasons is there is an undersea cable going to, uh, I believe it's New York in America. It runs along the M4, and this data centre is placed right there, so basically all of the trading platforms and banks huddled into that area. That's why we had to make LD4.2 - because we were basically running out of space and power in that one building.
Anna: Let’s go take a look.
Raj: Yeah, yeah, sure.
Raj: The data centre is built like a sandwich. So you have the power generators, the generation system, I should call it, in the bottom floor. In the middle floor is all the data halls, and then the top is the cooling, because the heat rising - it's easier to expel it through the roof.
Bridge: We’re in a long metal corridor here, in this windowless warehouse, peering through the door into the generator room: it’s pretty much the engine of the whole operation – the bottom slice of bread of that ‘data centre sandwich’ that Raj described. There’s a mass of very neatly positioned wires - hundreds, or maybe even thousands of wires - big buttons, generators, pipes and fans. It’s kind of like looking at the engine in the front of your car...if your car was the size of a giant warehouse.
Raj: Security is obviously number one, all of the windows, I believe, are all kind of bulletproof.
Anna: So you've done fingerprint entrance to get in. And then another set of doors and another pass.
Raj: Yeah, so the first door you have to get through, security let you in. Fingerprint in to get even into the building to begin with. You have to get your pass, and that will only let you in certain places. So a normal customer wouldn't be able to even come down here to begin with.
Anna: And how come the security is so tight?
Raj: It’s just to stop people coming in. You should only be going where you need to be going and the customers expect that. If you tell a customer, and they're a banking customer, for example, and you said, ‘oh yeah, there was someone walking around in your cage,’ I think they would just leave immediately.
Bridge: Raj talks a lot about cages. They’re basically floor to ceiling wire mesh enclosures that customers hire to house all of their servers, their hard drives, their networking equipment and the zillions of cables they use to ensure that data flows seamlessly between devices.
Anna: And what am I seeing right now? What have we got right here?
Raj: This is what we call the UPS system, on the right hand side. So there's an uninterruptible power supply. We run off of an N-plus-one kind of foundation, as we call it. So what will happen is…
Bridge: Equinix’s customers are requiring more and more power because of AI. Tools like Gemini or ChatGPT rely on hardware housed in data centres like this one. Just one AI query uses huge amounts of energy, which generates a lot of heat. So, then that hardware needs to be cooled down. And fast. That's where all the water comes in.
Raj: Because AI, at the end of the day, is just raw brute force that is running through these devices. So that's what the customers need. And the main challenge is: does it need to be liquid cooled?
Anna: On the basic level, when we're talking about things needing to be cooled, why is that?
Raj: It's all fundamentals of how physics works, right? If you rub your hands together it creates heat. So when these devices are working - you can kind of hear it right now - they all need air, because they are generating that power doing the transfers and whatever workloads. They need air to cool them down. If you scale it up from your laptop doing a Word document to an AI-based workload doing whatever type of solution it is - translation or whatever it is - just think of the scale in terms of what workload it's doing. And the heat and the type of energy used scale with that as well.
Bridge: A lot of these connections that data centres house are essential: like ensuring hospitals have uninterrupted access to patient information, or that banks can process financial transactions. But it’s also incredible to me that when I'm sat down at my laptop, asking ChatGPT a question like: ‘how much water does AI use?' these huge machines are still working behind the scenes – using a lot of power and a lot of water - to come up with the answer to that question. And recently I’ve noticed that sometimes when I use a search engine I don’t even get the option of whether I want to use AI or not, it just defaults to giving an AI overview. Professor Nick Couldry is a sociologist at LSE. He says we need to question the direction we’re going in when it comes to AI, for many reasons.
Nick: Some people calculate, it's obviously a complex calculation, that when you use ChatGPT to find something out that broadly you could have found out through a Google search as well - slightly different format. You're probably going to use six times as much electricity to do that. And that's because a Google search uses a very complex programme to find websites that give you a door that you can open and go in and find the answer yourself by clicking on the link. Whereas AI doesn't work that way. It processes everything across vast numbers of data sources trained on a large language model that enables it to predict what's the next most likely word in the sentence that you've just uttered, which was your question. And it predicts that the most likely next word will be whatever it is. That is an awful lot of calculation that previously wasn't necessary in the same way for a simple Google search. But we're now more and more relying on it instead of a Google search.
Anna: And sometimes you don't even get that option as to what you're using because if you type a question into Google now, it comes up with the AI overview.
Nick: Yes, and this raises questions about whether this is giving us the most useful answers and all sorts of other questions for the industries that rely on being linked to through a Google search, such as journalism. But there are also questions about the energy usage. And now you normally get an AI summary before you see the actual links. In fact, you have to look hard to get the links. But that means there's a default option now, which is: using more electricity, using more water. So we have to ask these questions about the sustainability of the general direction we're going in.
In the early days, businesses tried to put data centres into far north territories where they relied on the temperature, the external temperature, to cool down the data centre. But in most places, that's not a factor, that's not adequate. You need water. So fresh water is often used. It turns out that in West London, Thames Water is extremely worried about water shortages for many reasons, probably, but one of them is data centres using water. And it's quite interesting that according to some estimates when you build a really big data centre, exactly the ones that are going to be needed to run our large scale AI, they can be using 1.5 to 2 million litres of fresh water a day. We only use about 5 million litres of fresh water a day as human beings in a country. So that's an awful lot of fresh water.
Anna: And have we started to feel the effect of that yet, that vast consumption of water?
Nick: Well it's hitting some parts of the world, for example, where another aspect of AI comes home, which is the devices that we need to use in order to get the results of AI or to do anything on social media or on the web, such as our smartphones. They rely on batteries. The batteries are made of lithium. There are only two ways of making lithium. One of them involves sucking water up to the surface and letting it evaporate so the lithium gets left. That's done in places like the Atacama Desert in Chile, which is obviously a desert, so it's desperately short of water. And the local people, the indigenous peoples, have become extremely angry about that. Because they're short of water, and all this water is being wasted, as they see it, to produce lithium for batteries in computer devices. Similar disputes are going on in Argentina as well. Elon Musk, incidentally, has a very close relationship with the president of Argentina because he wants the lithium.
We think of data and AI as very ethereal. Uh, somehow up in the cloud or those sort of metaphors. But of course, it's just a vast number of chips which are communicating with each other so they can do huge numbers of simultaneous, parallel calculations. And sometimes it could be as many as a billion or even trillions of calculations are being run simultaneously because that's how many variables there can be in the very, very large AI models. But each chip itself has to be designed very carefully - mainly silicon, but also a lot of quite rare metals like gallium and palladium, which are in short supply. And so there are a lot of supply issues in the making of chips.
Bridge: So, AI isn’t just up in the cloud, it’s very much down on the ground - in data centres: using large amounts of water, electricity and even rare metals - placing a strain on already depleted natural resources. And while many governments around the world are actively looking to grow the AI industry, environmental campaigners are fighting to stop it.
I spoke to Ceara Carney, an actor and green activist from Ireland who recently dressed up as the goddess Ériu to protest against the number of data centres being built in her home country.
Ceara: I've always been aware of the climate crisis since I was a kid and really cared about nature but I only kind of got into activism in 2019 with the Green Wave when Extinction Rebellion kind of grew really big and the Fridays for Future movement and they had a really lovely open-door policy of anyone can join our movement. There are people from all types of backgrounds and skills who come together to create these protests.
There was one that we did in 2023, outside a data conference in the RDS, where I dressed as Ériu. That's the Irish name for the goddess personification of Ireland. It was basically her saying: you're coming and leeching our resources - they're going to data centres as opposed to the people, to communities.
Anna: And what is it about data centres in particular that frustrates or concerns you?
Ceara: Oh Anna, the big thing is Ireland is not energy secure. We import a lot of our energy and obviously us, and so many other countries in the EU, are trying to meet legally binding emission reduction targets by 2030. Data centres are using one fifth of our energy at the moment - that is going to grow. We have over 80 data centres working right now in Ireland, and we have another 20 more in construction or with planning permission granted and tens, tens more in the process of getting planning permission.
They are coming to Ireland because of Ireland's tax loophole, for one thing. Like, it's no secret that Ireland is the European centre for so many businesses like Amazon and Meta and Microsoft, because they get tax breaks. And on top of that, we have a cool country, temperature-wise, so they would have a much harder job keeping data centres cool in Las Vegas than they would here.
As far as I know, I think we're going to have 130 data centres here in the next five years - that's the projected paperwork right now. And I don't know what will stop these machines, these big, giant computer machines, being built until we literally run out of physical space, because they're huge. Like, that is the thing that I think will stop them before the government does - unless people come together and really pressure their representatives to switch priorities here, from tax breaks and keeping corporations happy to actually preserving a liveable future that is thriving.
These data centres do not give back to the society at all. Like, they are just leeches on our energy system, and communities will pay - not just in Ireland but perhaps abroad as well.
We're not even saying in Ireland to knock down the data centres here. We need a moratorium - we need to build no more until we have renewable energy, until they can redesign how they're set up to give back a little bit. Like, we're not saying let's erase all data centres and stop sending emails and just send letters. We know we use data, we need it - but a lot of what it's being used for is just not important.
Bridge: Both Ceara and Nick are concerned about the impact AI is having on the environment. But would it help if data centres were finding ways to give back to the communities that house them? Last year, Equinix, the company that gave me the tour of its Slough campus, used some of the excess heat from one of its data centres in Paris to heat the Olympic swimming pool during the 2024 Summer Olympics.
Eugenie Dugoua is an Environmental Economist at LSE who says there are many ways data centres can be designed to reduce their impact on the environment.
Eugenie: One way of designing data centres in a smarter way is, instead of letting the heat dissipate, to reuse the heat for things where we need heat - and we need heat in particular for heating houses.
In some cities like Stockholm or Helsinki, the excess heat is redistributed in the district heating system network. So the two systems are basically coupled, which is a good thing because, you know, that excess heat from the data centres is substituting away some heat that would have otherwise been generated by more fossil-intensive generation. You can, in this case, find a way of integrating the data centres into the architecture of the cities. Those sorts of initiatives should be encouraged and incentivised in some ways.
Anna: And is it realistic that all data centres would be able to do that? Offset their water usage or their electricity consumption?
Eugenie: No, and there's not one silver bullet, right? There are several ideas, many different ideas, of how you can reduce energy consumption on one hand or water consumption on the other. And depending on, you know, the geography where you are, the technological solutions are also going to be different.
So one idea is about where you locate the data centre. If you want to reduce their carbon footprint, well, locate them near to where renewable energy is abundant - close to a place that has good solar potential or geothermal potential. Iceland is a great example there because it has a lot of geothermal potential.
There's no lack of ideas in how data centres could be designed slightly differently to make them less hungry and less thirsty and so on.
Anna: It’s estimated that data centres account for 1-1.5% of global electricity use. That’s according to the International Energy Agency. I asked Eugenie if we should be worried about this figure:
Eugenie: 1 percent is not nothing. It's also not a huge fraction, right? There are a lot of other sectors, a lot of important industrial sectors, that emit a lot more to this date. But, of course, why it is important to start talking carefully about this sector is because the pace of growth has been outstanding, right?
Mostly this is because of AI and the rise, the adoption and the take-off of large language models, which are really increasing the demand for electricity and therefore the possibility of much higher carbon emissions in the future.
At the moment, it's still a small fraction, as I said, but the fact that it's really growing very fast is a very important reason to pay close attention and really start thinking carefully about how this particular sector can or should be regulated.
Anna: So there are ways that data centres can be designed differently to ensure they’re more environmentally responsible. But few countries are currently demanding this. Last year, the European Union moved one step closer towards regulating the industry’s energy efficiency by imposing mandatory disclosure of the energy performance of data centres in the EU. Here’s Eugenie again.
Eugenie: So, to be clear, they're not mandating anything else, right? They are not saying that data centres should increase the fraction of renewable energy in their electricity, and they're not setting standards on how energy efficient these data centres are - but it's at least the first step.
The EU will be able to provide a sort of sustainability rating of each data centre, which can provide, you know, some sort of incentive to make progress in the right direction, via a mechanism like name and shame, right? If you can point fingers at the data centres that have the worst energy rating and exert some sort of stakeholder influence on that tech company, you know, that may help move things in the right direction.
Bridge: According to the United Nations, more than a hundred countries have committed to becoming net zero by 2050. This means balancing the amount of greenhouse gases going into the atmosphere. But data centres pose a significant challenge to those pledges. And in recent months we’ve seen many governments and companies backpedalling on their green promises. So is there any real motivation for data centres to become more efficient?
Eugenie: It is very expensive to run these data centres. So companies are going to have, to some extent, incentives to make these data centres more energy efficient, right? Because the more energy efficient they are, the less electricity they need.
The demand for AI is going to scale up, but the demand for electricity is not going to scale up proportionally, because these data centres are becoming more energy efficient by the day.
There are some fundamental mechanisms that are working on our side. For example, these centres have become more and more energy efficient over time. The improvements in energy efficiency truly have been astounding. Some numbers that I remember off the top of my mind: over the last maybe ten years, every year has seen an improvement of something like 20% in energy efficiency, right?
Anna: So maybe reducing the costs involved in running these data centres will be the real motivation to improve their energy efficiency. Earlier this year, the Chinese AI startup DeepSeek made headlines after the latest version of its app promised to outperform competitors at a fraction of the cost and using a fraction of the energy – providing hope for that AI sustainability paradox. But Nick Couldry says it’s not just the environmental impact that we should be concerned about.
Nick: We’re increasingly moving to a situation where businesses are encouraging us to use AI for almost everything: to write our letters, to write a letter to a friend to say, ‘I can't come to your party and I’m a bit embarrassed - how do I put it?’ You can ask ChatGPT.
So the question to ask is whether that's the right thing for the environment and also whether it's the right thing for the social environment, our social world. Whether it's right for us to be dependent on AI in that way. Is that a good thing for us to do as a society?
What I want to do is raise the question of whether this is the right deal we're doing with AI. There's no question that there are some uses of AI which are really sensible, such as looking at an X-ray in the NHS. We're short of staff, we're short of radiologists to do that scanning. Maybe it does make sense to use AI to come up, at least, with first approximations of where the patterns might be - whether the patterns in these X-rays are a sign of something worrying or not. I'm all in favour of that. But that's a situation where the answer is something that a human expert can then say, ‘that's a hallucination, that's way off,’ or ‘no, that could be right, I need to look at that more closely.’ So we can immediately see good uses of AI and this vast calculative power which fulfil a social goal, and where we might think the energy cost is absolutely justified. But then there's something else: just finding an answer to a question that you didn't really need to ask anyway, or could have found out some other way, but using six times as much electricity. The calculus is completely different when we think about the environmental cost, so sustainability for me has to be at the core of how we build the right contract around AI.
There's also the social side. Supposing you lose the habit of answering questions that a few years ago you would easily answer. The risk is that we end up deskilling ourselves.
So I think we need to look really carefully at the social contract we seem to be building today around the use of AI, because it may have costs that at this stage we can't see. But once we enter down that path, it will be too late to correct. We could easily get this wrong. We need to have the debate now. And that does mean pausing temporarily the headlong rush to move towards AI at all possible costs. Yes, there is a geopolitical race because the US and China want to lead the world in AI. Every other country wants to be part of the AI race and not to be left behind.
But that's not necessarily the best basis on which to have a balanced debate about the risks and benefits of AI that truly benefits citizens and societies and the environment. That's what we need to do. And I think that's the role of academics and social scientists to contribute to that better debate than we're having at the moment, either in the UK or anywhere else.
Anna: Eugenie Dugoua also wonders if the AI path we’re currently travelling down is the right one.
Eugenie: I think what is really striking in how AI is being developed at the moment is that it's mostly a couple of very large tech firms that own the technology.
So it's mostly private sector entities. And we have to ask ourselves: what are the interests of the private sector, and to what extent are they aligned with the public interest and, maybe more generally, public goods? I'm pretty sure some interests may be aligned, but not all of them. And I think it would be in the public's interest to maybe even invest in a publicly owned, publicly funded AI initiative, so that AI algorithms for the public good can also be developed and made available.
Bridge: We’ve heard about the amount of water that AI uses, the amount of electricity it consumes, and the amount of rare earth metals that are needed for data centres to function. But Eugenie also tells me that AI could actually help us deal with some of the effects of climate change.
Eugenie: On adaptation, there are really cool examples there - for example, early warning systems for floods or wildfires.
There is this system launched by Google - Google's flood forecasting AI - which is already in place and has sent out text messages to millions of people when the system detected that there could be a flood. Floods are very difficult to predict, right? You need to have a lot of information about how much rainfall there has been; you need to understand the geography of the place you're at, the levels of the river and so on. With the satellite information that we have now, and new sensors in place that are capturing information in real time, these algorithms can predict in real time whether a flood event is likely to happen in your neighbourhood or not.
Anna: And how can AI be used to monitor the earth and protect biodiversity?
Eugenie: So I think this is a really interesting area, because I think so far, you know, we've been very limited in how we could protect Earth's natural ecosystems. There's an example in Africa, in the Serengeti, where a technology called TrailGuard AI is basically used to detect potential poachers.
So this is a very large area, right? And you only have, I think, about 100 or so rangers who are basically tasked with roaming around and trying to catch poachers if they find some. That's obviously very difficult to do, because the area is very large. But now there's this system that combines cameras, potentially drone images as well, with a sort of recognition algorithm.
It sends messages within 30 seconds: if a potential poacher is detected, it sends a message to the rangers, who can then go and intervene. And that has actually led to the arrest of dozens of potential poachers before they killed the big animals - which is really the important point here, because, of course, the goal is to protect endangered species, right?
I think AI is neither a hero nor a villain at this point. I think AI is like a tool. We can make great things out of it and we could make potentially very bad things out of it.
So humans have a major role to play here in terms of shaping the direction of this new technology. That's why it's really important to talk about these things and think about what incentives are already at play, and how we could change incentives to make sure that future developments are better aligned with human welfare.
And that includes, you know, aspects of labour and environment, but it could also be much broader than that, right? So, in a way, AI is a little bit like fire. It can warm you up or it can burn you. It depends how you use it - and whether you know how to use it well.
Anna: This episode of LSE iQ was produced and edited by me, Anna Bevan, with script development from Sophie Mallet and on-location sound recording from Oliver Johnson. We’ll be taking a break over the summer but will be back with a new season of iQ in September.
In the meantime, why not attend the LSE Festival - either in person or online - from the 16th to the 21st of June. Our world-leading speakers will be exploring the threats and opportunities of the near and distant future, and what a better world could look like. For more information, visit lse.ac.uk/festival.
And if you enjoyed this episode on AI and sustainability, check out LSE’s AI, technology and society series where we’re exploring AI’s ability to do good, and how to limit its ability to do harm, through short films, events, blogs and podcasts.
Artificial intelligence is transforming the world around us, offering increased productivity and promising to help tackle difficult problems like global warming.
But behind the scenes, its environmental costs are mounting. From massive energy use to vast quantities of water required to cool data centres, AI’s footprint is growing fast. So, in an age of water scarcity and climate crisis, can we justify this technological boom?
In this episode of LSE iQ, Anna Bevan asks: Is AI destroying the planet?
She travels to a data centre in Slough to find out exactly how data centres work, and speaks to Nick Couldry, Professor of Media, Communications and Social Theory at LSE; Eugenie Dugoua, Assistant Professor in Environmental Economics at LSE; and Ceara Carney, an actor and climate activist.
This episode explores the AI sustainability paradox: can AI be both a climate solution and a climate problem? And discusses surprising ways AI is being used for good, such as catching poachers in the Serengeti.
Research
Data Grab: The New Colonialism of Big Tech and How to Fight it, Nick Couldry and Ulises Mejias
The Space of the World: Can Human Solidarity Survive Social Media and What If It Can't? Nick Couldry
Induced innovation, inventors and the energy transition, Eugenie Dugoua and Todd D. Gerarden
Directed technological change and general purpose technologies: can AI accelerate clean energy innovation? Pia Andres, Eugenie Dugoua and Marion Dumas
Could artificial intelligence deliver a green transition? Marion Dumas
LSE iQ is a university podcast by the London School of Economics and Political Science. We’re keen to find out more about our audience so we can better tailor our content to suit your interests. With this in mind, we would be grateful if you could please take the time to fill out this short survey and share your feedback.