The use of Artificial Intelligence (AI) is growing rapidly in the modern world, and among the many concerns raised are the environmental impacts of the technology. There has been a great deal of public debate about the energy consumption of AI data centres, yet almost no discussion of their water consumption – until now.
What are AI data centres?
Artificial Intelligence (AI) data centres are the physical facilities that house the enormous computing and storage resources that enable chatbots like ChatGPT to learn and remember information. In many parts of the world, data centres are housed in vast warehouses that have to be fed with electricity 24 hours a day, with internal climates tightly controlled so the technology can perform at an optimum level.
As such, data centres have attracted negative media attention for the vast amount of energy they consume. It is estimated that, collectively, AI data centres use 2% of all electricity generated globally1.
However, less well known is the amount of water these facilities use. According to a new paper by researchers from the University of California, Riverside and the University of Texas at Arlington, ‘Making AI Less “Thirsty”: Uncovering and Addressing the Secret Water Footprint of AI Models’, a conversation of between 20 and 50 questions with an AI chatbot could consume around 500ml of water. This may not sound like much, but the paper also suggests that in 2021, Google’s US-based, self-owned AI data centres alone used an estimated 12.7 billion litres of water for on-site cooling, 90% of which was potable.
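To put the per-conversation figure into perspective, the short sketch below shows how quickly it adds up at scale. It is a rough, illustrative calculation only: the 500ml figure comes from the paper cited above, while the daily conversation volume is a purely hypothetical assumption, not a number from the paper.

```python
# Back-of-envelope sketch of how per-conversation water use scales.
# The 500 ml per 20-50 question conversation is the paper's estimate;
# the daily conversation volume is a hypothetical assumption for illustration only.

ML_PER_CONVERSATION = 500            # ~500 ml per 20-50 question chatbot conversation (paper estimate)
CONVERSATIONS_PER_DAY = 10_000_000   # hypothetical daily volume, NOT a figure from the paper

litres_per_day = ML_PER_CONVERSATION * CONVERSATIONS_PER_DAY / 1000
print(f"Illustrative inference water use: {litres_per_day:,.0f} litres per day")
# -> 5,000,000 litres per day under these assumptions
```

Even with deliberately cautious assumptions, serving millions of conversations a day would translate into millions of litres of water, which is why the cumulative footprint matters far more than any single chat.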
Why do data centres use so much water?
AI data centres consume water in two ways – directly and indirectly – and consume the most during the ‘training’ phase: the period in which the AI model is fed data and learns how to respond.
Indirect consumption is the water used off-site to generate the power a data centre draws, for example in the cooling towers of coal-fired power stations. Although AI data centres do not consume this water themselves, their enormous, round-the-clock demand for electricity means that far more water is used on their behalf than for most other types of facility.
Direct consumption is the water used on-site by the data centre for its own cooling. Almost all of the power supplied to data centre servers is converted into heat, and the sheer scale of these facilities only exacerbates the problem, so to maintain stable operating temperatures water is passed through a heat exchange system to cool the equipment and prevent it from overheating.
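To make the distinction between the two streams concrete, the sketch below shows one simple way they might be combined into a single water footprint. It is an illustration only: the energy figure, the on-site water-per-kWh ratio (often expressed as a water usage effectiveness, or WUE), the power usage effectiveness (PUE) and the off-site water intensity of electricity generation are all assumed values, not data from the paper or from any specific facility.

```python
# Simplified sketch of how a data centre's direct and indirect water use combine.
# All numeric values below are illustrative assumptions, not measured figures.

it_energy_kwh = 1_000_000   # energy drawn by the servers themselves (hypothetical)
wue = 1.8                   # on-site litres of cooling water per kWh of server energy (assumed)
pue = 1.2                   # total facility energy divided by server energy (assumed)
grid_water = 2.0            # litres of water used off-site per kWh of electricity generated (assumed)

direct_litres = it_energy_kwh * wue                  # water consumed by on-site cooling
indirect_litres = it_energy_kwh * pue * grid_water   # water consumed generating the electricity

print(f"Direct (on-site cooling):    {direct_litres:,.0f} litres")
print(f"Indirect (power generation): {indirect_litres:,.0f} litres")
print(f"Total water footprint:       {direct_litres + indirect_litres:,.0f} litres")
```

Under these assumptions the indirect share is larger than the direct one, which is why both the cooling system and the source of a facility’s electricity matter when assessing its overall water footprint.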
Water for cooling is also needed to respond to changes in external temperature caused by seasonal weather patterns. This is of particular concern given the increasing frequency of hotter summers and the resulting water scarcity experienced across the globe, raising the possibility that data centres are consuming water that should be reserved for at-risk communities. In the summer of 2022, Thames Water raised concerns about data centres around London using potable water for cooling during a drought2. This problem is only likely to get worse as the climate crisis deepens and weather patterns become more unpredictable.
The water consumption of US-based facilities may be striking, but the research in ‘Making AI Less “Thirsty”’ also suggests that less technologically advanced data centres in Asia could use as much as three times as much water as their Western equivalents. This is worrying given that half of the world’s population is predicted to face severe water stress by 2030.
Two important caveats
Whilst this new research paper is compelling, it is important to remember that data on this subject is scarce, partly because little research has been undertaken into the water usage of data centres, and partly because – as the paper suggests – tech companies are keen to hide the true cost of AI technology. It is also important to consider that, whilst the report has had wide coverage in the press, the research and its methodology had – at the time of writing in May 2023 – not been peer reviewed. This does not undermine the potential importance of the research, but it does mean that further investigation will be needed before we can conclusively hang damning statistics around the neck of all AI data centres.
What can be done about it?
In the meantime, the researchers behind ‘Making AI Less “Thirsty”’ suggest that when and where AI chatbots are trained has a massive impact on the amount of water they use. Geographic location should therefore be a top priority for tech companies when siting data centres, as should the time of year at which training takes place.
However, measures can also be implemented to reduce a data centre’s cooling water usage regardless of location. For example, VWT UK’s SIRION™ RO units offer high flux, low energy reverse osmosis (RO) membrane technology. VWT UK can design the water treatment system to recycle and reuse reject water and, in some cases, treat and recycle the blowdown water from the coolers, all whilst saving electrical power compared with a conventional unit. This enables data centres to operate more sustainably and helps to reduce operating costs.
Our water treatment experts can visit facilities to help improve the efficiency of an existing system or help clients design a new one. Their expertise can ensure that systems run optimally, with minimal waste and a reduced likelihood of outages.
For more information about the solutions we can provide for data centre facilities, please click here.
1 Making AI Less “Thirsty”: Uncovering and Addressing the Secret Water Footprint of AI Models - https://arxiv.org/abs/2304.03271
2 Thames Water reviews data centres’ water use as London hosepipe ban looms - https://www.ft.com/content/8d8bf26f-5df2-4ff6-91d0-369500ed1a9c