The growth of AI has raised concerns about employment and security, but its intense use of water is an overlooked and worrying issue.
One example of this enormous water footprint: training GPT-3 in Microsoft’s state-of-the-art US data centres can directly consume 700,000 litres of clean freshwater, a figure that would have tripled had the training been done in Microsoft’s Asian data centres. Training GPT-3 is also responsible for an additional off-site water footprint of 2.8 million litres due to electricity usage, which would put GPT-3’s total water footprint at around 3.5 million litres in the US.
Another way to look at this: ChatGPT needs to “drink” a 500ml bottle of water for a simple conversation of roughly 20-50 questions and answers, depending on when and where it is deployed.
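For readers who want to check the arithmetic behind these figures, here is a small back-of-the-envelope sketch in Python. It simply combines the on-site and off-site numbers reported above and derives a rough per-question figure from the 500ml-per-20-50-questions comparison; it is illustrative only and does not reproduce the paper’s own, more fine-grained methodology.

```python
# Illustrative back-of-the-envelope check of the reported figures.
# The paper's actual estimates depend on when and where training and inference happen.

onsite_training_litres = 700_000       # direct (on-site) water use, US data centres
offsite_training_litres = 2_800_000    # off-site water use from electricity generation

total_training_litres = onsite_training_litres + offsite_training_litres
print(f"Total GPT-3 training water footprint (US): {total_training_litres:,} litres")
# -> 3,500,000 litres, matching the roughly 3.5 million litre figure above

# Per-question estimate: a 500ml bottle per conversation of 20-50 questions and answers
bottle_ml = 500
for n_questions in (20, 50):
    print(f"~{bottle_ml / n_questions:.0f} ml of water per question "
          f"at {n_questions} questions per bottle")
# -> roughly 10-25 ml per question, depending on deployment time and location
```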
Chatbots utilise servers within data centres to ‘train’ the algorithms or ‘models’ to perform tasks such as answering questions.
The figures come from a new paper by researchers at the University of California, Riverside and the University of Texas at Arlington: Making AI Less “Thirsty”: Uncovering and Addressing the Secret Water Footprint of AI Models.
The paper concludes that the “enormous” water footprint is a critical concern for socially responsible and environmentally sustainable AI, and that a methodology is needed for estimating the fine-grained water footprint, which can inform the best place to train the models.
“AI models’ water footprint can no longer stay under the radar—water footprint must be addressed as a priority as part of the collective efforts to combat global water challenges,” the paper asserts.