Water consumption by AI: Between scaremongering and real numbers

The enormous water consumption of AI data centers is a recurring argument, especially among critics, against the unbridled expansion of digital infrastructure. How high the demand actually is, however, only becomes apparent on closer inspection: climate, cooling technology and the electricity mix play a key role in determining the ecological balance.
Water is used in data centers primarily to cool servers, which generate large amounts of waste heat when computing. Depending on the technology, it is used directly in evaporative cooling systems or indirectly through the water consumption of the power plants that supply the required electricity.
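To make the distinction between direct and indirect water use concrete, the following sketch estimates a facility’s water footprint from a handful of parameters. All numbers are purely illustrative assumptions, not figures from this article; water usage effectiveness (WUE, litres per kilowatt-hour) and the water intensity of the power mix vary widely between locations.

```python
# Minimal sketch: direct vs. indirect water use of a data center.
# All parameter values below are assumptions chosen for illustration only.

IT_ENERGY_MWH_PER_DAY = 1_000       # assumed daily IT load of the facility
PUE = 1.2                           # power usage effectiveness (total energy / IT energy)
WUE_L_PER_KWH = 0.5                 # assumed on-site water use per kWh (evaporative cooling)
GRID_WATER_L_PER_KWH = 1.8          # assumed water consumed per kWh by the power plant mix

total_energy_kwh = IT_ENERGY_MWH_PER_DAY * 1_000 * PUE

direct_water_l = total_energy_kwh * WUE_L_PER_KWH            # evaporated in cooling towers
indirect_water_l = total_energy_kwh * GRID_WATER_L_PER_KWH   # consumed during power generation

print(f"Direct (on-site) water use:   {direct_water_l / 1_000:,.0f} m³ per day")
print(f"Indirect (electricity) water: {indirect_water_l / 1_000:,.0f} m³ per day")
print(f"Total water footprint:        {(direct_water_l + indirect_water_l) / 1_000:,.0f} m³ per day")
```

With these assumed values, the indirect share via electricity generation would be several times larger than the water actually evaporated on site, which is one reason why footprint estimates and local consumption figures can diverge so sharply.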
However, there are currently numerous calculations circulating around the water consumption of AI data centers, some of which clearly contradict each other. Estimates range from comparatively moderate amounts to alarming scenarios.
Just recently, journalist Karen Hao had to walk back a calculation from her bestseller “Empire of AI.” In the book, she wrote that a planned Google data center near Santiago de Chile could require “more than a thousand times the water consumption of the entire population.” The figure, however, rested on a calculation error and was far too high.
Because of such incidents, experts warn against taking individual figures as absolute. Location factors, the technical equipment of the data centers and the electricity mix used can have a major influence on how water-intensive AI actually is.
AI: What factors influence water consumption?
The water consumption of AI data centers depends heavily on numerous factors. Using more water for evaporative cooling, for example, can reduce the need to run electrical cooling systems.
Conversely, if more electricity is used to cool the data centers, direct water consumption decreases. At the same time, however, greenhouse gas emissions can rise again, depending on the electricity mix.
“Every location is different,” explains Fengqi You, professor of energy systems engineering at Cornell University, in an interview with Wired. “How much water you need for the same amount of AI depends on the climate, the technology used and the [energy] mix.”
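As a rough illustration of this trade-off, the sketch below compares a hypothetical evaporative-cooled site with a dry-cooled one. The PUE, WUE and grid-intensity values are assumptions chosen only to show the direction of the effect, not real measurements.

```python
# Illustrative trade-off between evaporative (water-based) and dry (electric) cooling.
# All parameter values are assumptions for demonstration purposes only.

IT_ENERGY_KWH = 1_000_000            # assumed IT energy per day
GRID_CO2_KG_PER_KWH = 0.4            # assumed carbon intensity of the local grid
GRID_WATER_L_PER_KWH = 1.8           # assumed water consumption of power generation

scenarios = {
    # name:                   (PUE,  on-site WUE in L/kWh)
    "evaporative cooling":    (1.15, 1.00),   # little cooling electricity, more water
    "dry (electric) cooling": (1.45, 0.05),   # more cooling electricity, almost no water
}

for name, (pue, wue) in scenarios.items():
    total_kwh = IT_ENERGY_KWH * pue
    direct_water_m3 = total_kwh * wue / 1_000
    indirect_water_m3 = total_kwh * GRID_WATER_L_PER_KWH / 1_000
    co2_tonnes = total_kwh * GRID_CO2_KG_PER_KWH / 1_000
    print(f"{name:24s} water on site: {direct_water_m3:8,.0f} m³ | "
          f"water via grid: {indirect_water_m3:8,.0f} m³ | CO2: {co2_tonnes:6,.0f} t")
```

Under these assumptions, the dry-cooled site uses almost no water on site but more electricity, and therefore causes more indirect water use and more emissions; with a cleaner grid the picture shifts again, which is exactly the location dependence You describes.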
Whether water consumption is problematic depends on the location
What makes matters even more difficult is that some calculations also include indirect water consumption. These estimates cover the entire water footprint, including the water used to generate the electricity. As a result, the figures produced by such estimates can be much larger than the actual water consumption on site.
Nevertheless, the water consumption of AI data centers should not be underestimated. “In the short term, this is not a problem or a national crisis,” says Cornell professor You. “But it depends on the location. In places where there is already a water shortage, building these AI data centers will be a big problem.”
Due to the many factors involved, however, such calculations are highly complex. Estimates such as the claim that writing an email with ChatGPT uses an entire bottle of water are therefore hardly applicable to an “average” query.
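To see why a single headline figure says little, the sketch below varies two uncertain inputs: the energy per ChatGPT-style query and the combined water intensity of the site and its power mix. Both ranges are assumptions for illustration, and the resulting water use per query spans more than two orders of magnitude.

```python
# Illustrative sensitivity check: water per AI query under different assumptions.
# Both input ranges are assumptions, not measurements.

ENERGY_PER_QUERY_WH = [0.3, 3.0, 10.0]   # assumed energy per query in watt-hours
WATER_L_PER_KWH = [0.2, 0.5, 2.0]        # assumed combined water intensity in litres per kWh

print("energy/query   water intensity   water per query")
for energy_wh in ENERGY_PER_QUERY_WH:
    for intensity in WATER_L_PER_KWH:
        millilitres = energy_wh / 1_000 * intensity * 1_000   # Wh -> kWh, then litres -> ml
        print(f"{energy_wh:9.1f} Wh   {intensity:9.1f} L/kWh   {millilitres:12.2f} ml")
```

Depending on the assumptions, the result ranges from well under a millilitre to around 20 millilitres per query, which is why a single “bottle of water per email” figure tells us little about an average request.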
As a Tech Industry expert, I believe it is important to approach the topic of water consumption by AI with a balanced perspective. While there has been some scaremongering around the idea that AI systems are massively depleting global water resources, the reality is that the actual numbers tell a different story.
It is true that training AI models can be computationally intensive and require significant amounts of power, which in turn can have an impact on water consumption. However, when we look at the data, the amount of water used by AI systems is relatively small compared to other industries and activities.
A frequently cited study by the University of Massachusetts Amherst, for example, found that training a single large AI model can cause emissions comparable to those of several cars over their entire lifetimes. While this may seem like a lot for an individual model, when we consider the potential benefits that AI can bring in terms of efficiency and sustainability, the trade-off may be worth it.
That being said, it is important for companies and researchers to continue to explore ways to reduce the environmental impact of AI systems, including finding more energy-efficient algorithms and investing in renewable energy sources. By approaching the issue with a data-driven mindset and a commitment to sustainability, we can ensure that AI technologies are developed in a responsible and ethical manner.