Water consumption of AI: Why chatbots are so thirsty



Whether during training, when answering queries, or simply in the data centers themselves: the water consumption of AI is enormous. But why are language models like ChatGPT and co. so thirsty?

When it comes to artificial intelligence, many people think of eloquent chatbots such as ChatGPT, smart image generators, or self-driving cars. But such systems consume not only a lot of electricity but also large amounts of water. The growing water consumption of AI has become a real problem for the environment.

The background: In huge data centers, high-performance computers run around the clock at full speed to train AI models and answer queries. This generates a large amount of heat that can damage the hardware.

Cooling with water is therefore essential. In addition, manufacturing the hardware, above all the tiny chips inside the servers, devours large amounts of clean water.

Water consumption of AI: Small queries, big thirst

The numbers are frightening: according to a recent study cited by the Society for Computer Science, AI applications could consume between 4.2 and 6.6 billion cubic meters of water worldwide by 2027. That is more than Denmark's entire annual water demand.

Training the GPT-3 language model alone consumed around 5.4 million liters of water in Microsoft's data centers. And depending on the model, a single AI-generated email can quickly use up half a liter of water.
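To illustrate where such per-request figures come from, here is a minimal back-of-envelope sketch in Python. All parameters (energy per request, on-site cooling water per kilowatt-hour, indirect water for electricity generation) are illustrative assumptions, not values from the article or from any provider.

    # Back-of-envelope estimate of water use per AI request.
    # All parameters below are illustrative assumptions, not measured values.

    ENERGY_PER_REQUEST_KWH = 0.003   # assumed energy for one chatbot request (kWh)
    ONSITE_WUE_L_PER_KWH = 1.8       # assumed liters of cooling water per kWh of IT energy
    OFFSITE_WATER_L_PER_KWH = 3.1    # assumed liters of water per kWh of electricity generated

    def water_per_request_liters(energy_kwh: float) -> float:
        """Estimate direct (cooling) plus indirect (power generation) water use."""
        direct = energy_kwh * ONSITE_WUE_L_PER_KWH
        indirect = energy_kwh * OFFSITE_WATER_L_PER_KWH
        return direct + indirect

    if __name__ == "__main__":
        per_request = water_per_request_liters(ENERGY_PER_REQUEST_KWH)
        print(f"~{per_request * 1000:.0f} ml of water per request under these assumptions")
        print(f"~{0.5 / per_request:.0f} requests add up to half a liter")

Under these made-up numbers, one request comes out at roughly 15 milliliters, so a few dozen requests reach the half-liter mark mentioned above. The point of the sketch is the structure of the estimate, energy per request multiplied by water intensity, not the exact figures.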

The problem is exacerbated by the fact that many data centers are built in regions that already suffer from water shortages, for example southern Europe or parts of Germany. There, AI competes with agriculture, industry, and the drinking water supply for a scarce resource.


Save water with a smart AI infrastructure

But there are solutions. Among them: smaller AI models that need less energy, more economical cooling techniques, or the use of seawater, as in Google's data center in Finland. Better standards for measuring and publishing the water consumption of AI could also help, because so far many companies have withheld this information.

The Society for Computer Science therefore calls for a more environmentally friendly digital infrastructure. If AI is really supposed to help improve the world, it has to become more sustainable. After all, drinking water is a valuable resource that is becoming scarcer due to climate change anyway.

Also interesting:

  • Green Prompting: How you can keep the energy consumption of AI in check
  • Researchers develop a method to produce hydrogen without electricity
  • ChatGPT alternatives: 5 AI models you should know
  • Prompts: These are the things you should not reveal to ChatGPT

The article on the water consumption of AI: Why chatbots are so thirsty first appeared on Basic Thinking. Follow us on Google News and Flipboard, or subscribe to our update newsletter.


As a tech industry expert, I believe that the high water consumption of AI, particularly chatbots, is a significant issue that needs to be addressed. AI systems, including chatbots, require large amounts of data processing and cooling to function efficiently, leading to high energy consumption and, in turn, high water usage for cooling.

One of the main reasons chatbots are so thirsty is that they often run on powerful servers that require constant cooling to prevent overheating. This cooling typically relies on cooling towers, which can consume significant amounts of water. In addition, training and developing AI models requires large amounts of data processing, which further contributes to their water consumption.
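As a rough physical intuition for the cooling-tower point, the sketch below estimates how much water an evaporative cooling system would vaporize to reject a given amount of server heat, using the latent heat of vaporization of water. The server hall size is a hypothetical example, and real systems also lose water to blowdown and drift, so treat this as an idealized lower bound rather than a measurement.

    # Rough lower-bound estimate of evaporative cooling water loss.
    # Simplifying assumption: all server heat is rejected by evaporating water.

    LATENT_HEAT_KJ_PER_KG = 2257.0   # latent heat of vaporization of water (kJ/kg)
    KJ_PER_KWH = 3600.0              # 1 kWh = 3600 kJ

    def liters_evaporated(heat_kwh: float) -> float:
        """Liters of water evaporated to carry away heat_kwh of waste heat.

        Treats 1 kg of water as 1 liter and ignores blowdown and drift losses.
        """
        return heat_kwh * KJ_PER_KWH / LATENT_HEAT_KJ_PER_KG

    if __name__ == "__main__":
        it_load_kw = 1000.0                 # hypothetical 1 MW server hall
        heat_kwh = it_load_kw * 24.0        # nearly all electrical energy ends up as heat
        print(f"{liters_evaporated(heat_kwh):,.0f} liters evaporated per day (idealized)")

For this hypothetical 1 MW hall the result is roughly 38,000 liters per day, or about 1.6 liters per kilowatt-hour of heat rejected, which helps explain why evaporative cooling dominates a data center's direct water use.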


To address this issue, it is crucial for companies to prioritize energy efficiency and sustainable practices in the development and deployment of AI systems. This includes investing in more energy-efficient hardware, optimizing algorithms to reduce computational requirements, and exploring alternative cooling methods that are less water-intensive.

Overall, as the demand for AI technologies continues to grow, it is essential for the tech industry to be mindful of the environmental impact of these systems and work towards developing more sustainable solutions to reduce their water consumption.
