
The thirst of AI chatbots can kill our water resources

Think about it: you might need half a liter of water to cook two packets of Maggi instant noodles. Now, what if we told you that a single online chat with an AI tool like ChatGPT consumes the same amount of water? It may seem small, but with millions of people using chatbots daily, the combined water footprint adds up quickly.

According to a study on the water consumption of ChatGPT by A. Shaji George, an expert in Information and Communication Technologies (ICT), the AI chatbot consumes about 0.5 liters of water during each long conversation with a user, a figure the study suggests is broadly representative of existing AI and LLM systems.
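To see how the per-chat figure compounds, here is a quick back-of-the-envelope sketch. The 0.5-litre figure comes from the study above; the daily conversation count is a hypothetical illustration, not a number from any source.

```python
# Scale the ~0.5 L-per-conversation estimate quoted above.
LITRES_PER_CONVERSATION = 0.5      # from the study cited in the article

# Hypothetical illustration only -- not a reported usage figure.
conversations_per_day = 10_000_000

litres_per_day = LITRES_PER_CONVERSATION * conversations_per_day
print(f"{litres_per_day / 1e6:.1f} million litres of water per day")
# -> 5.0 million litres of water per day
```

At that hypothetical scale, chat traffic alone would consume five million litres of fresh water every day.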

Another research study, “Secret Water Footprint of AI Models,” by Pengfei Li and Shaolei Ren of the University of California, Riverside, warns of the growing pressure that AI workloads place on fresh-water resources.

Its 2027 projection shows that, amid concerns about global water shortages, worldwide demand for AI would drive water withdrawal – fresh water taken from underground or surface sources, either temporarily or permanently – of between 4.2 billion and 6.6 billion cubic metres. That is roughly the total annual water withdrawal of a country such as Denmark, or half that of the UK.

This is especially significant when set against annual human withdrawal of fresh water, which is about 4 trillion cubic metres, according to the World Water Development Report of the United Nations and the Food and Agriculture Organization (FAO).
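Putting the two figures quoted above side by side gives a sense of proportion (a sketch using only numbers from this article):

```python
# 2027 AI water withdrawal as a share of global annual
# fresh-water withdrawal, using the figures quoted above.
GLOBAL_WITHDRAWAL_M3 = 4e12    # ~4 trillion cubic metres per year (UN/FAO)
AI_2027_LOW_M3 = 4.2e9         # lower bound of the 2027 projection
AI_2027_HIGH_M3 = 6.6e9        # upper bound of the 2027 projection

low_share = AI_2027_LOW_M3 / GLOBAL_WITHDRAWAL_M3 * 100
high_share = AI_2027_HIGH_M3 / GLOBAL_WITHDRAWAL_M3 * 100
print(f"AI share of global withdrawal: {low_share:.3f}% to {high_share:.3f}%")
# -> AI share of global withdrawal: 0.105% to 0.165%
```

A fraction of a percent globally, but that water would be drawn disproportionately in the specific regions where data centres cluster.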

AI training data centers are responsible for a significant portion of this water use. According to the research paper, even after excluding water use by third-party facilities, “Google’s data centers alone directly withdrew 25 billion liters and consumed nearly 20 billion liters of water for on-site cooling in 2022, most of which was potable water”.

How does AI use water?

Water use by AI systems falls into three areas. First, there is the water consumed by the data centers hosting AI workloads. These centers house high-performance servers that generate heat continuously and therefore need substantial cooling. Most common cooling systems, such as cooling towers and outside-air cooling, depend on water. Cooling towers can use up to nine liters of water per kWh of energy consumed during peak hours.

Second, there is the water consumed by thermoelectric plants. These plants generate electricity for the data centers and in turn need water to produce it. In the US, the average water withdrawal for electricity generation is about 43.8 liters per kWh.

Besides these two, water is also needed to make AI chips. Chip production can consume millions of liters of water daily, especially in processes such as wafer manufacturing, which requires ultrapure water.
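The two per-kWh intensities above can be combined into a rough upper bound on the water tied to each unit of server energy (a sketch; it deliberately leaves out chip-manufacturing water, which is not expressed per kWh):

```python
# Rough upper bound: water withdrawn per kWh a data centre consumes,
# using the two intensities quoted above.
ONSITE_COOLING_L_PER_KWH = 9.0   # cooling towers at peak hours
OFFSITE_POWER_L_PER_KWH = 43.8   # average US withdrawal for electricity generation

total_l_per_kwh = ONSITE_COOLING_L_PER_KWH + OFFSITE_POWER_L_PER_KWH
print(f"Up to {total_l_per_kwh:.1f} litres withdrawn per kWh of server energy")
# -> Up to 52.8 litres withdrawn per kWh of server energy
```

In other words, at peak hours each kilowatt-hour of compute can be linked to tens of litres of water withdrawal once off-site power generation is counted.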

AI’s constant thirst seems to be increasing

An article on the economic impact of data centers outlines the benefits of establishing them in regions that may not have strict regulations: it can enable faster and more efficient development, as data centers drive economic growth, job creation and competitiveness.

In India, the impacts of climate change are already widespread, bringing more frequent heat waves and droughts, and with them water scarcity, in states such as Rajasthan and Nagaland. This raises concern about the future availability of water for both public and industrial use.

An AIM report in 2022 had earlier highlighted significant water consumption by data centers in Indian regions facing water scarcity. The report cited Pali, a region in Rajasthan where water had to be transported in from nearby towns via special trains.

“According to a 2019 NITI Aayog report, more than 600 million people in India are deprived of water. In addition, more than 21 cities, including Chennai, Hyderabad, Delhi and Bangalore, will have exhausted their groundwater resources by 2021,” the report states.

That said, GenAI companies are also taking steps to promote sustainable development and reduce environmental impact by using Green AI, an energy-efficient resource management technique.

Are underwater data centers the solution?

In June 2024, Microsoft officially confirmed the discontinuation of Project Natick, its underwater data center initiative. The project started in 2013 and aimed to explore the efficiency of submerged data centers powered by renewable energy, using seawater to make cooling more efficient. Despite some promising results, Microsoft decided to halt further development.

“We’ve learned a lot about subsea operations and server vibrations and impacts. So we’re going to apply those learnings to other cases,” said Noelle Walsh, head of cloud operations + innovation (CO+I) at Microsoft.

Others have experimented with building submerged data centers, with some success, to conserve fresh water resources. However, experts believe that efficient resource management still needs to improve.

Dr Praphul Chandra, Professor and Director, Center for AI and Decentralized Technologies, Atria University, Bengaluru, says, “Green AI tries to focus on computational techniques to make AI algorithms more energy efficient. The move to renewables helps indirectly, and future commitments on carbon capture by AI companies are based on market-based solutions. We will need efforts on multiple fronts to solve this problem.”