Making AI Less “Thirsty”: Uncovering and Addressing the Secret Water Footprint of AI Models

Published date:

Resource Type:
Summary:

This peer-reviewed study quantifies the hidden water costs of AI: training a single model such as GPT-3 can evaporate roughly 700,000 liters of freshwater, and global AI demand is projected to withdraw 4.2–6.6 billion cubic meters of water annually by 2027, more than the total annual water withdrawal of four to six countries the size of Denmark. The researchers provide a methodology for estimating AI's water footprint across locations and seasons, arguing that water consumption must be addressed alongside carbon emissions. Useful for advocates seeking data-backed evidence of data center water impacts and for framing AI sustainability beyond carbon alone.

Author:
Pengfei Li, Jianyi Yang, Mohammad A. Islam, Shaolei Ren
Organization:
arXiv / Communications of the ACM
