As AI-powered chatbots and other AI models move from prototype to product at a dizzying pace and become embedded in workflows across many fields, it is becoming almost impossible to ignore their environmental impact.
Yet while the carbon footprint of large AI models such as GPT-3 has received wide public attention, their consumption of clean freshwater, used to cool data centers' equipment and a pressing concern in times of water scarcity, has remained relatively understudied until very recently.
In 2023, a study conducted at the University of California, Riverside, titled Making AI Less “Thirsty”: Uncovering and Addressing the Secret Water Footprint of AI Models and based on data OpenAI published in 2020, estimated that GPT-3 consumed a 500 ml bottle of water for roughly every 10-50 responses, with variations depending on when and where the model runs. It also found that training GPT-3 in Microsoft’s U.S. data centers could consume a total of 5.4 million litres of water, figures that, the researchers warned at the time, were likely to rise with the launch of GPT-4, a substantially larger model.
A recently published article in The Times, quoting Shaolei Ren, a professor and engineer at the University of California and a member of the team behind the 2023 study, outlines a far worse scenario. According to Ren, data from a Microsoft report released in September suggests that potable water consumption is far higher than the previously estimated half a litre per 10-50 queries. The water footprint of ChatGPT could therefore be up to four times higher than predicted, making it increasingly urgent for AI companies to optimize their models and algorithms and to shift workloads to locations with greater water efficiency.
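Taken together, these figures imply a simple per-response range. The following back-of-the-envelope calculation is a minimal sketch of that arithmetic; the helper function and its name are illustrative, not drawn from the study or the Microsoft report:

```python
# Rough arithmetic from the figures quoted above: a 500 ml bottle of water
# per 10-50 responses, and a footprint "up to four times higher".
# Illustrative estimates only, not official measurements.

def ml_per_response(bottle_ml, min_responses, max_responses):
    """Return (low, high) water use per response in millilitres."""
    return bottle_ml / max_responses, bottle_ml / min_responses

# The 2023 study's estimate: 10-50 ml of water per response.
low, high = ml_per_response(500, 10, 50)
print(f"2023 estimate: {low:.0f}-{high:.0f} ml per response")

# "Up to four times higher" would imply roughly 40-200 ml per response.
print(f"Revised upper-bound scenario: {4 * low:.0f}-{4 * high:.0f} ml per response")
```

Even the low end of the revised range would make every handful of chatbot queries comparable to a sip of drinking water, which is why siting and scheduling workloads for water efficiency matters at scale.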
Opening image: photo by Jonathan Kemper on Unsplash