The rapid scaling of artificial intelligence models has significant implications for energy use, water consumption, and local infrastructure. AI data centers require vast computational power for both training and deployment, placing increasing strain on electrical grids, water systems, and surrounding communities. As AI adoption accelerates, these impacts are becoming more concentrated at the local level, raising concerns about sustainability, public resource use, and community oversight. Training massive AI models requires months of continuous operation on large data center clusters, creating unprecedented and sustained energy demand that far exceeds traditional digital infrastructure.
Key Facts
- Generative AI text tasks require approximately 10 times more electricity than a traditional Google search.
- AI image generation is significantly more energy-intensive, using about 62 times more energy than text generation per 1,000 queries.
- GPT-4, the model behind ChatGPT, is estimated to contain roughly 1.8 trillion parameters and required months of continuous training on specialized hardware.
- AI’s total energy footprint splits roughly 40% for model development and training and 60% for deployment and inference.
- Training OpenAI’s GPT-3 model alone consumed an estimated 1,300 megawatt-hours of electricity.
- AI data centers rely heavily on energy-intensive Graphics Processing Units (GPUs) and advanced cooling systems, making them far more power-hungry than traditional data centers.
- GPT-3 training water consumption: The initial "pre-training" of GPT-3 (a roughly 34-day run in 2020) is estimated to have consumed 180,000 gallons (700,000 liters) of freshwater. This high-intensity usage was driven by the evaporative cooling systems needed to keep thousands of high-performance GPUs from overheating, a one-time "birth cost" distinct from the ongoing water footprint of daily user interactions (inference).
- During deployment, AI systems can consume the equivalent of a 16-ounce bottle of water for every 10 to 50 responses.
- The United States hosts an estimated 40–54% of global data center capacity (estimates vary by source), reflecting its central role in AI infrastructure expansion.
- Northern Virginia is widely referred to as the data center capital of the world, while Pennsylvania has seen more than 25 hyperscale facilities proposed in a single year.
- Hyperscale data centers typically contain 5,000 or more servers, exceed 10,000 square feet, and require over 100 megawatts of power, comparable to the electricity demand of a small city.
- U.S. dominance in data center buildout is driven largely by the AI race with China.
- GPU-intensive AI clusters (e.g., NVIDIA H100s) consume 10–20 MW per facility, with 40–80 kW per rack, far exceeding traditional servers.
- Training leading AI models can take months, even using massive supercomputers or distributed data center clusters.
- Training GPT-4 alone is estimated to have consumed ~50 GWh of electricity.
- AI inference now accounts for ~40% of overall data center power growth, compounding long-term energy demand.
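The per-query and per-response figures above can be turned into rough order-of-magnitude estimates. The sketch below uses the fact sheet's ratios (10x for GenAI text, 62x for images, one 16-ounce bottle per 10–50 responses); the ~0.3 Wh baseline for a traditional search and the 30-responses-per-bottle midpoint are assumptions for illustration, not figures from this document.

```python
# Back-of-envelope estimates built from the Key Facts above.
# All inputs are rough published estimates or assumptions, not measured values.

SEARCH_WH = 0.3                      # assumed ~0.3 Wh per traditional search
GENAI_TEXT_WH = SEARCH_WH * 10       # ~10x a search, per the fact sheet
GENAI_IMAGE_WH = GENAI_TEXT_WH * 62  # ~62x text generation, per the fact sheet

def daily_energy_kwh(queries_per_day: float, wh_per_query: float) -> float:
    """Total daily energy in kWh for a given query volume."""
    return queries_per_day * wh_per_query / 1000

def cooling_water_liters(responses: int, responses_per_bottle: int = 30) -> float:
    """Approximate cooling water, assuming one 16 oz (~0.47 L) bottle per
    10-50 responses; 30 is used here as an illustrative midpoint."""
    return (responses / responses_per_bottle) * 0.47

if __name__ == "__main__":
    # 100 million daily text queries served by GenAI vs. traditional search
    print(daily_energy_kwh(100e6, GENAI_TEXT_WH))  # ~300,000 kWh/day
    print(daily_energy_kwh(100e6, SEARCH_WH))      # ~30,000 kWh/day
    print(cooling_water_liters(100_000_000))       # ~1.57 million liters
```

At these assumed rates, swapping search-scale traffic for GenAI text multiplies daily electricity use tenfold, and image generation would multiply it again by ~62.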
Frequently Asked Questions (FAQs)
- Q: Why does AI require so much energy compared to other digital services?
A: AI models rely on massive computational workloads, particularly during training and inference, which require specialized processors and continuous power usage far beyond standard web services.
- Q: How does AI model training affect local water resources?
A: AI data centers use large volumes of water for cooling, which can strain local water supplies, especially during droughts or peak demand periods.
- Q: What power sources typically support AI data centers?
A: While operators often pledge renewable sourcing, the massive, "always-on" power requirements of AI have led to a resurgent dependence on fossil fuels. Key drivers of this shift include:
- Delayed Coal Retirements: Utilities in states like Georgia, Wisconsin, and Nebraska have canceled or postponed the retirement of at least 15 major coal-fired plants to meet the surge in data center demand. In some cases, the U.S. Department of Energy has issued emergency orders to keep aging coal units online to prevent grid failure.
- Natural Gas Surge: To provide 24/7 "baseload" power that wind and solar cannot yet guarantee, developers are increasingly turning to fracked natural gas. Over 100 GW of new gas-fired capacity is currently planned, which could increase power sector emissions by up to 40% in some regions.
- Policy Hurdles: Recent shifts in federal policy—including the repeal of renewable tax credits and a national security-based pause on major offshore wind permits—have slowed the construction of new clean energy infrastructure. This has forced data centers to draw from an increasingly carbon-intensive grid or rely on on-site diesel and gas generators.
- Q: What are the risks to local infrastructure?
A: The scale of AI data centers often forces a "resource tug-of-war" between trillion-dollar tech companies and local communities:
- Cost-Shifting to Residents: Massive grid upgrades—including new high-voltage transmission lines and substations—are frequently subsidized by local ratepayers. In many regions, residential electricity bills are rising to fund the infrastructure specifically required to support high-density AI clusters.
- Aquifer Depletion: AI cooling requires millions of gallons of water daily. In water-stressed areas, data centers draw heavily from local aquifers and municipal supplies, lowering the water table and potentially outcompeting residential and agricultural needs during droughts.
- Infrastructure Strain: The "always-on" nature of AI demand can lead to "curtailment" events, where local utilities must prioritize data center stability over residential grid reliability during peak summer or winter months.
- Q: How can communities evaluate proposed data center projects?
A: Evaluating these projects is often a defensive and uphill battle for local residents. Because tech companies frequently use non-disclosure agreements (NDAs) and "shell" company names during the planning phase, communities must often take the following aggressive steps to gain transparency:
- Overcoming Informational Barriers: Critical data regarding water rights, peak energy draw, and noise pollution is rarely volunteered. Residents often must rely on Freedom of Information Act (FOIA) requests, "Right to Know" filings, and investigative pressure to see the true scale of the impact.
- Challenging "Black Box" Agreements: Many projects are fast-tracked through local boards before the public is even notified. Communities are increasingly demanding that zoning changes and tax incentives be paused until a full environmental and grid-impact study is released.
- Securing Enforceable Commitments: Rather than accepting vague "green" promises, communities are fighting for legally binding Community Benefit Agreements (CBAs) that include ironclad limits on aquifer drawdown and mandatory contributions to local grid infrastructure—ensuring residents don't end up subsidizing corporate utility bills.
- Q: Who benefits most from hyperscale data center expansion?
A: Large technology corporations such as Amazon, Meta, Google, and Microsoft are the primary drivers and beneficiaries, while local communities often absorb environmental and infrastructure impacts.
Resources / Sources
- Bad Data Centers: Scale
- Bad Data Centers: Energy
- Kasia Tarczynska: What to Ask When a Data Center Wants to Come to Town. Key questions that public officials and community members should ask about a proposed data center — including zoning, pollution, water use, jobs, tax incentives, and transparency — so that impacts are understood and addressed long before approval.
- NAACP: Data Centers Impact on Energy Demand. Highlights how rapidly expanding data centers built by major tech companies increase energy and water demand, exacerbate greenhouse gas emissions and environmental burdens—especially in frontline communities—while calling for greater transparency, impact assessments, and community benefits to address these harms.
- Shehabi, A.; Newkirk, A.; Smith, S.; Hubbard, A.; Lei, N.; Siddik, M., et al. (2024). United States Data Center Energy Usage Report. The Energy Act of 2020 requires the Department of Energy to report on U.S. data center electricity use from 2014 onward and to project future demand through 2028.
- Sierra Club: Demanding Better: Holding Tech Companies Accountable. Urges large energy users like data centers to push utilities and policymakers to meet rising electricity demand with clean energy and stronger grid planning instead of increased fossil fuel use.
- Chesapeake Climate Action Fund and Global Strategy Group: New Polling on Virginians' Views of Data Centers and Rising Energy Costs. Voters believe data centers are driving up electricity costs and environmental impacts, and they want lawmakers to take action instead of allowing these costs to be passed on to families and small businesses.
- As You Sow: Compute and Consequence: AI Energy Demand in a Rapidly Evolving Grid Landscape. Rapid growth of AI data centers is driving big increases in electricity demand and fossil fuel infrastructure, risking higher emissions and grid stress unless clean energy planning and transparency improve.
- Dr. Margaret Cook: Thirsty Data: The Hidden Water and Energy Costs of Texas' Data Center Boom – Houston Advanced Research Center. Texas's rapid data center growth is putting huge, largely unplanned pressure on the state's water and energy supplies, with current planning processes failing to account for their rising water use and cooling demands.