AI Chatbots’ Thirst May Kill Our Water Resources

Think about it: you may need half a litre of water to cook two packets of Maggi instant noodles. Now, what if we told you that a single online chat with an AI tool like ChatGPT consumes the same amount of water? It may seem small, but when millions of people use chatbots every day, those half-litres add up into a substantial combined water footprint.

According to a study on ChatGPT’s water consumption by A. Shaji George, an expert in Information and Communications Technology (ICT), the AI chatbot consumes about 0.5 litres of water over each lengthy conversation with a user. Similar water costs apply to other AI systems and large language models (LLMs).
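To put that figure in perspective, here is a minimal back-of-envelope sketch in Python. The daily conversation count is a purely hypothetical assumption for illustration, not a statistic from the study:

```python
# Back-of-envelope sketch: scaling the ~0.5 L-per-conversation figure cited above.
# The conversation count is a hypothetical assumption, not a number from the study.

WATER_PER_CONVERSATION_L = 0.5  # litres per lengthy conversation (George study)

def daily_water_use_litres(conversations_per_day: int) -> float:
    """Total estimated daily water use for a given number of chatbot conversations."""
    return conversations_per_day * WATER_PER_CONVERSATION_L

if __name__ == "__main__":
    conversations = 10_000_000  # hypothetical daily volume
    litres = daily_water_use_litres(conversations)
    print(f"{conversations:,} conversations/day ≈ {litres:,.0f} L "
          f"≈ {litres / 1000:,.0f} cubic meters")
```

At that hypothetical volume, the chats alone would account for roughly 5 million litres (5,000 cubic meters) of water a day.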

Another research study, ‘Secret Water Footprint of AI Models’ by Pengfei Li and Shaolei Ren of the University of California, Riverside, projects the pressure that AI workloads are set to exert on freshwater resources.

The projection for 2027 shows that, amid concern over global water scarcity, the world’s demand for AI could drive water withdrawal (freshwater taken from ground or surface sources, temporarily or permanently) of between 4.2 and 6.6 billion cubic meters a year, roughly the total annual water withdrawal of a country like Denmark, or half that of the United Kingdom.

This becomes particularly significant when set against total yearly human freshwater consumption, which is around 4 trillion cubic meters, as per the United Nations World Water Development Report and the Food and Agriculture Organisation (FAO).

Data centres that train and run AI are responsible for a significant portion of this water usage. As per the research paper, even after setting aside the water used by third-party facilities, “Google’s self-owned data centres alone directly withdrew 25 billion litres and consumed nearly 20 billion litres of water for onsite cooling in 2022, the majority of which was potable water.”

How is AI using water?

Water usage by AI systems falls into three areas. First, there is the water consumed by the data centres that house AI workloads. These centres contain high-performance servers and need substantial cooling because of the heat they generate continuously. The most common cooling systems, such as cooling towers and outside air cooling, are water-intensive: cooling towers can use up to nine litres of water per kWh of energy consumed during peak times.

Second, there is the water used by the thermoelectric power plants that generate electricity for data centres; producing that electricity itself requires water. In the United States, the average water withdrawal for electricity generation is about 43.8 litres per kWh.

Third, water is also required to manufacture AI chips. Chip-making can consume millions of litres of water daily, especially in steps like wafer fabrication, which requires ultrapure water.
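As a rough illustration of how the first two categories combine, here is a minimal Python sketch. The per-kWh figures are the ones quoted above; the energy-per-query value is a hypothetical assumption, and because the offsite figure measures withdrawal rather than consumption, the sum is only a ballpark, not a precise footprint:

```python
# Rough per-query water sketch (illustrative assumptions throughout).
# Per-kWh figures are the ones quoted in this section:
#   - onsite cooling:     up to ~9 L of water per kWh at peak (cooling towers)
#   - offsite generation: ~43.8 L of water withdrawn per kWh (US average)
# The energy-per-query value is a hypothetical assumption, not from the studies cited.

ONSITE_COOLING_L_PER_KWH = 9.0       # peak cooling-tower water use
OFFSITE_WITHDRAWAL_L_PER_KWH = 43.8  # US average withdrawal for electricity generation

def water_per_query_litres(energy_kwh_per_query: float) -> dict:
    """Split an estimated per-query water footprint into onsite and offsite parts."""
    onsite = energy_kwh_per_query * ONSITE_COOLING_L_PER_KWH
    offsite = energy_kwh_per_query * OFFSITE_WITHDRAWAL_L_PER_KWH
    return {"onsite_L": onsite, "offsite_L": offsite, "total_L": onsite + offsite}

if __name__ == "__main__":
    # Assume, purely for illustration, 0.003 kWh of server energy per chatbot query.
    estimate = water_per_query_litres(0.003)
    print(f"Onsite cooling : {estimate['onsite_L']:.3f} L")
    print(f"Offsite (power): {estimate['offsite_L']:.3f} L")
    print(f"Total          : {estimate['total_L']:.3f} L per query")
```

Under those assumptions, a single query lands in the range of a few tenths of a litre, which is broadly consistent with the half-litre-per-conversation figure cited earlier.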

AI’s constant thirst seems to be rising

A paper studying the economic impact of data centres lays out the advantages of establishing them in regions that may not have stringent regulations: development can proceed faster and more efficiently, and data centres contribute to economic growth, job creation and competitiveness.

In India, the impacts of climate change are already evident, with more frequent heat waves and droughts driving water scarcity in states like Rajasthan and Nagaland. This raises concerns about future water availability for both public and industrial use.

A 2022 report by AIM had previously highlighted the significant water consumption of India’s data centres in regions facing water scarcity. The report cited Pali, a region in Rajasthan where water had to be transported from nearby cities by special trains.

“According to a 2019 NITI Aayog report, more than 600 million people in India are water-deprived. Also, more than 21 cities, including Chennai, Hyderabad, Delhi and Bangalore, exhausted their groundwater resources in 2021,” the report stated.

That said, GenAI companies are also adopting measures to promote sustainable development and reduce environmental impact through Green AI, an approach that prioritises energy-efficient computation and resource management.

Are underwater data centres the solution?

In June 2024, Microsoft officially confirmed the discontinuation of Project Natick, its underwater data centre initiative. Begun in 2013, the project explored the efficiency of submerged data centres powered by renewable energy, using seawater instead of freshwater for cooling. Despite some promising results, Microsoft decided to halt further development.

“We learned a lot about operations below sea level and vibration and impacts on the server. So we’ll apply those learnings to other cases,” said Noelle Walsh, Microsoft’s head of the Cloud Operations + Innovation (CO+I) division.

Others have since experimented with building submerged data centres to conserve freshwater resources. However, experts believe that resource management still needs to become far more efficient.
Dr Praphul Chandra, professor and director of the Center for AI & Decentralized Technologies at Atria University, Bengaluru, says, “Green AI tries to focus on computational techniques to make AI algorithms more energy efficient. The move towards renewable energy helps indirectly, and future commitments on carbon capture from AI companies rely on market-based solutions. We will need efforts on multiple fronts to solve this problem.”
