Every conversation you have with ChatGPT uses water. Not metaphorically — actual water, pumped through the cooling systems of the data centres running the computation, evaporated into the atmosphere and gone. A 2023 study from researchers at UC Riverside and the University of Texas Arlington estimated that GPT-3 consumed around 700 millilitres of fresh water per conversation of roughly 20-50 questions and answers. That is about one standard water bottle per conversation.
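The study's headline figure implies a rough per-query range. A quick sketch of the arithmetic (the per-query numbers below are derived here, not quoted from the study):

```python
# Derived from the UC Riverside / UT Arlington estimate: roughly 700 ml of
# fresh water per GPT-3 conversation of 20-50 questions and answers.
WATER_PER_CONVERSATION_ML = 700
QUERIES_LOW, QUERIES_HIGH = 20, 50

# Dividing through gives an implied per-query range.
per_query_high = WATER_PER_CONVERSATION_ML / QUERIES_LOW   # 35.0 ml
per_query_low = WATER_PER_CONVERSATION_ML / QUERIES_HIGH   # 14.0 ml

print(f"Implied water per query: {per_query_low:.0f}-{per_query_high:.0f} ml")
```

A few tablespoons per question, in other words, which only becomes alarming when multiplied by hundreds of millions of queries a day.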
GPT-4, which is significantly more computationally intensive, uses more. The researchers were unable to calculate exact figures for GPT-4 because OpenAI does not publish its infrastructure details. The company that is most transparent about the environmental cost of AI is Microsoft, which publishes annual sustainability reports documenting water and electricity consumption rising year over year, even as the company promises to be carbon negative by 2030.
Why Water and Not Just Electricity
The electricity cost of AI is discussed more often, partly because it converts more easily into a carbon footprint metric that companies can include in ESG reporting. The water cost is less well understood and less well disclosed because water consumption accounting in data centres is not standardised and not required in most jurisdictions.
Data centres generate enormous amounts of heat from the servers running computation. That heat has to go somewhere. The two main cooling strategies are air cooling and evaporative cooling. Evaporative cooling, which is cheaper and more efficient at scale, works by pumping water through the system and allowing it to evaporate, carrying the heat with it. The water is consumed — it does not go back into the municipal system.
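The physics explains why the volumes are so large: each litre that evaporates carries away its latent heat of vaporisation. A simplified sketch of that calculation — illustrative only, since real cooling towers evaporate just part of the circulating water and vary in efficiency:

```python
# Each litre of water that evaporates absorbs its latent heat of
# vaporisation, roughly 2.26 MJ per kg. This is textbook physics,
# not a figure for any specific data centre.
LATENT_HEAT_MJ_PER_KG = 2.26
KWH_PER_MJ = 1 / 3.6

# Heat carried away per litre evaporated (1 litre of water ~ 1 kg).
heat_per_litre_kwh = LATENT_HEAT_MJ_PER_KG * KWH_PER_MJ   # ~0.63 kWh

# Litres that must evaporate to reject 1 MWh of server heat,
# assuming all cooling is evaporative.
litres_per_mwh = 1000 / heat_per_litre_kwh

print(f"~{heat_per_litre_kwh:.2f} kWh absorbed per litre evaporated")
print(f"~{litres_per_mwh:.0f} litres evaporated per MWh of heat rejected")
```

Roughly 1,600 litres per megawatt-hour of heat, on this idealised model — which is why a facility drawing tens of megawatts can consume water on the scale of a small town.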
Microsoft’s data centres used approximately 6.4 million cubic metres of water in 2022, up from 3.9 million in 2020. The company’s own sustainability report attributed a significant portion of the increase to AI workloads. Google reported similar increases. Amazon’s AWS does not break out water consumption by service type.
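The growth rate implied by those two reported figures is worth making explicit (the percentages below are derived from the numbers above, not taken from Microsoft's reports):

```python
# Microsoft's reported data centre water consumption, in cubic metres.
water_2020_m3 = 3.9e6
water_2022_m3 = 6.4e6

# Total growth over the two-year span, and the compound annual rate
# it implies if growth were steady.
total_growth = water_2022_m3 / water_2020_m3 - 1           # ~64% over two years
annual_rate = (water_2022_m3 / water_2020_m3) ** 0.5 - 1   # ~28% per year

print(f"Total increase 2020-2022: {total_growth:.0%}")
print(f"Implied compound annual growth: {annual_rate:.0%}")
```

A consumption curve compounding at nearly 30 percent a year doubles in under three years — the opposite direction from any net-zero trajectory.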
Where the Water Comes From Matters
A data centre in Iceland cooling its servers with geothermal energy and glacial water is a different environmental equation from a data centre in Arizona drawing from the Colorado River, which is in the middle of a multi-decade drought that is reducing water availability for tens of millions of people. Data centre location decisions are made primarily on the basis of energy cost, land cost, and tax incentives. Water stress in the surrounding region is a secondary consideration at most.
Mesa, Arizona — one of the fastest-growing data centre markets in the US — sits in one of the most water-stressed regions on the continent. Microsoft, Meta, and Google all have major facilities there. The Trump administration’s approach to AI regulation has included loosening environmental review requirements for data centre construction, accelerating approvals in locations where the environmental impact has not been fully assessed.
The Carbon Accounting Problem
AI companies have made aggressive net-zero pledges. The mechanisms for meeting those pledges rely heavily on carbon offsets — paying someone else not to emit carbon, or to plant trees, rather than directly reducing emissions. The scientific consensus on carbon offsets is increasingly sceptical: a significant portion of offset projects do not deliver the carbon reductions they claim, and some actively harm the communities living near them.
Microsoft’s pledge to be carbon negative by 2030 was made in 2020 when its AI compute workloads were a fraction of what they are now. Its 2024 sustainability report showed that emissions had increased 29 percent since the pledge was made, primarily due to AI infrastructure. The 2030 deadline is still in the report. The trajectory is going in the opposite direction.
When the AI bubble eventually deflates, the stranded assets will include billions of dollars of data centre infrastructure built in water-stressed regions on the assumption of indefinite growth. The environmental cost of that build-out will remain long after the hype cycle moves on.
[…] in the hundreds of gigawatt-hours, equivalent to powering tens of thousands of homes for a year. The environmental cost of AI computation is already being systematically hidden by the companies running it, and the shift to trillion-parameter models will make that cost substantially larger while those […]