RAIL Knowledge Hub

The carbon cost of intelligence: AI's environmental footprint

The environmental impact of training and running large AI models -- carbon emissions, water usage, and energy consumption.

Governance · Nov 11, 2025 · 17 min read · RAIL Team

Overview

AI environmental impact breakdown

AI systems may already have a carbon footprint matching New York City's annual emissions, and a water footprint approaching "the world's total annual consumption of bottled water." With data center electricity demand projected to double by 2030, these environmental costs have shifted "from a niche concern to a defining sustainability challenge."

The Numbers Behind the Boom

The generative AI expansion has sparked unprecedented infrastructure development. OpenAI and the Trump administration announced Stargate, targeting "$500 billion on up to 10 data centers." Apple committed "$500 billion in US manufacturing and data center investment over four years," while Google projected "$75 billion on AI infrastructure in 2025 alone."

From 2005 to 2017, data center electricity use remained relatively flat, as efficiency gains offset the growth of cloud computing and streaming. AI disrupted this pattern. The International Energy Agency estimates "global data center electricity demand will more than double by 2030, reaching approximately 945 terawatt-hours" -- exceeding Japan's total consumption and making data centers collectively the planet's fifth-largest electricity consumer.
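The IEA projection implies a steep compound growth rate. A back-of-envelope sketch, assuming a 2024 baseline of roughly 415 TWh (the IEA's widely cited estimate; treat it as an assumption here, not a figure from this article):

```python
# Back-of-envelope: implied annual growth rate of data center electricity demand.
# Assumption: ~415 TWh baseline in 2024 (IEA estimate); 945 TWh projected for 2030.
baseline_2024_twh = 415
projection_2030_twh = 945
years = 2030 - 2024

# Compound annual growth rate: (end / start)^(1/years) - 1
cagr = (projection_2030_twh / baseline_2024_twh) ** (1 / years) - 1
print(f"Implied growth: {cagr:.1%} per year")  # roughly 15% per year
```

For comparison, total global electricity demand typically grows only a few percent per year, which is why this projection stands out.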

Carbon: The Climate Cost

Transparency limitations complicate precise measurements. A December 2025 VU Amsterdam study by Alex de Vries-Gao estimated "AI systems' carbon footprint in 2025 fell between 32.6 and 79.7 million tons of CO2" -- comparable to Norway's annual emissions.

A November 2025 Nature Sustainability study projected "US AI server deployment alone could generate 24 to 44 million tons of CO2-equivalent annually between 2024 and 2030." Researchers concluded the industry is "unlikely to meet its net-zero aspirations by 2030 without substantial reliance on highly uncertain carbon offset and water restoration mechanisms."

Goldman Sachs Research indicates "approximately 60% of the increasing electricity demand from data centers will be met by burning fossil fuels," potentially adding "roughly 220 million tons to global carbon emissions." Cornell researchers identify the "rebound problem" -- total emissions rising when AI demand exceeds grid decarbonization rates.

Embodied carbon -- emissions from construction, GPU manufacturing, and raw material extraction -- remains underestimated. Data centers are "10 to 50 times more energy-intensive per square foot than typical commercial buildings" and are constructed from "tons of steel and concrete."

Water: The Hidden Cost

Data centers consume vast water quantities for cooling and electricity generation. The VU Amsterdam study estimated "AI's water footprint in 2025 could range from 312.5 to 764.6 billion liters," equivalent to annual bottled water consumption globally.

Individual facilities consume staggering amounts. Some "consume up to 5 million gallons per day -- equivalent to a small town's total daily water use." Siting decisions concentrate these problems in water-stressed regions, while Northern Virginia data centers already "consume 26% of the state's electricity," straining local infrastructure.
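A rough conversion puts the per-facility figure in the same units as the fleet-wide estimate above (a sketch; the 5-million-gallon figure is an upper bound for heavy facilities):

```python
# Rough conversion: one heavy facility's water use vs. the estimated AI fleet total.
GALLONS_TO_LITERS = 3.78541

facility_gallons_per_day = 5_000_000          # "up to 5 million gallons per day"
facility_liters_per_year = facility_gallons_per_day * GALLONS_TO_LITERS * 365

fleet_low_liters = 312.5e9                    # low end of the VU Amsterdam 2025 range
share = facility_liters_per_year / fleet_low_liters

print(f"One facility: ~{facility_liters_per_year / 1e9:.1f} billion liters/year")
print(f"Share of the fleet-wide low estimate: {share:.1%}")
```

A single heavy facility thus accounts for roughly 7 billion liters a year, about 2% of the low-end fleet estimate on its own.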

Regional Hotspots

Environmental impacts concentrate where data centers cluster. Dublin facilities consume "approximately 79% of the city's electricity." Ireland's nationwide data center share could reach "32% by 2026." Virginia represents "the densest data center market in the world," with facilities consuming "26% of state electricity."

"Roughly half of all US and Japanese power demand growth over the next five years is expected to come from data centers." Such concentration creates cascading effects: when demand outstrips supply, companies resort to fossil fuels. Elon Musk's X supercomputing center near Memphis allegedly used "dozens of methane gas generators" without regulatory approval.

The Per-Query Perspective

Per-query efficiency has improved dramatically. Google reports "a 33-fold reduction in energy and 44-fold reduction in carbon for the median AI prompt between 2024 and 2025." A single Gemini-class query uses "approximately 0.24 watt-hours of energy and 0.26 milliliters of water -- comparable to watching about nine seconds of television."

However, scale matters. Billions of daily prompts combined with training, manufacturing, and end-of-life processes aggregate into significant system-level impacts. As AI integrates into "search, email, document editing, customer service," users increasingly consume resources without conscious choice.
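The aggregation is simple arithmetic. A sketch, assuming a hypothetical volume of one billion prompts per day (actual query volumes are not publicly disclosed) and Google's reported per-query figures:

```python
# Sketch: how tiny per-query costs aggregate at scale.
# Assumptions: 1 billion prompts/day is hypothetical; per-query figures are
# Google's reported medians (0.24 Wh energy, 0.26 mL water), inference only.
queries_per_day = 1_000_000_000
energy_wh_per_query = 0.24
water_ml_per_query = 0.26

annual_energy_gwh = queries_per_day * energy_wh_per_query * 365 / 1e9   # Wh -> GWh
annual_water_mill_l = queries_per_day * water_ml_per_query * 365 / 1e9  # mL -> million L

print(f"Annual energy: ~{annual_energy_gwh:.0f} GWh")
print(f"Annual water:  ~{annual_water_mill_l:.0f} million liters")
```

Even at these optimistic per-query figures, inference alone reaches tens of gigawatt-hours annually, before counting training, manufacturing, and end-of-life costs.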

The Roadmap to Sustainable AI

A November 2025 Cornell University-led Nature Sustainability study outlined mitigation strategies that could reduce "AI's carbon emissions by approximately 73% and water use by approximately 86% compared to worst-case scenarios."

Smart Siting: Location as the First Lever

Location is the single most impactful factor. Placing facilities in regions with "low water stress and clean electricity grids" could, on its own, "slash water demands by about 52%." Cornell identified "the US Midwest and 'windbelt' states -- Texas, Montana, Nebraska, and South Dakota -- as offering the best combined carbon-and-water profile." New York's "clean electricity mix (nuclear, hydropower, and growing renewables)" also suits data centers.

Professor Fengqi You stated: "This is the build-out moment. The AI infrastructure choices we make this decade will decide whether AI accelerates climate progress or becomes a new environmental burden."

Grid Decarbonization: Matching Clean Energy to AI Demand

Optimal siting alone cannot compensate for carbon-intensive grids. In Cornell's high-renewables scenario, grid decarbonization reduced CO2 by "approximately 15% compared to baseline," yet "roughly 11 million tons of residual emissions remained, requiring an additional 28 gigawatts of wind or 43 gigawatts of solar capacity to reach net zero."
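The wind and solar figures differ because of capacity factors. A sketch of the equivalence, assuming typical US capacity factors of about 35% for onshore wind and 23% for utility solar (assumptions on my part, not figures from the study):

```python
# Why 28 GW of wind roughly matches 43 GW of solar: capacity factors.
# Assumptions: ~35% capacity factor for onshore wind, ~23% for utility solar.
HOURS_PER_YEAR = 8760

def annual_twh(nameplate_gw: float, capacity_factor: float) -> float:
    """Annual energy output in TWh for a given nameplate capacity."""
    return nameplate_gw * HOURS_PER_YEAR * capacity_factor / 1000  # GWh -> TWh

wind_twh = annual_twh(28, 0.35)   # ~86 TWh/yr
solar_twh = annual_twh(43, 0.23)  # ~87 TWh/yr

print(f"Wind:  {wind_twh:.0f} TWh/yr")
print(f"Solar: {solar_twh:.0f} TWh/yr")
```

Both options deliver roughly the same annual energy; solar simply needs about 1.5 times the nameplate capacity because the sun shines fewer hours than the wind blows.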

Tech companies invest heavily in clean energy. By 2022, "US technology companies had already contracted over 35 gigawatts of renewable electricity through power purchase agreements." Microsoft signed "a 10 GW renewable energy deal with Brookfield and a 0.8 GW nuclear deal with Constellation Energy." Renewable expansion must accelerate to match AI demand growth.

Operational Efficiency: Doing More with Less

Technical improvements offer further gains. "Advanced liquid cooling can reduce water use by 29%." Higher server utilization, workload scheduling (shifting computation to hours when grids are cleanest), and more efficient models all contribute. Google's 33-fold per-prompt energy reduction demonstrates what is possible.

Model-level techniques like "knowledge distillation, quantization, and architecture search can dramatically reduce computational costs." MIT's Neil Thompson noted: "Making these models more efficient is the single most important thing you can do to reduce environmental costs of AI."
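Of these techniques, quantization is the easiest to quantify. A minimal sketch of the memory arithmetic, assuming a hypothetical 7-billion-parameter model (the sizes are illustrative, not tied to any specific system):

```python
# Sketch: weight-memory savings from quantization for a hypothetical 7B model.
# Lower-precision weights shrink memory, and with it energy per inference.
params = 7_000_000_000

bytes_per_weight = {"fp32": 4, "fp16": 2, "int8": 1, "int4": 0.5}

for dtype, nbytes in bytes_per_weight.items():
    gb = params * nbytes / 1e9
    print(f"{dtype}: {gb:.1f} GB of weights")
```

Halving precision halves weight memory, which is part of why quantized models can serve the same queries on fewer, less power-hungry accelerators.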

Combined approaches achieve "approximately 73% carbon reduction and 86% water reduction -- but only if pursued simultaneously and at scale."

The Transparency Gap

No major tech company reports "AI-specific environmental metrics," despite acknowledging AI's role in increasing consumption. Without such data, "researchers must rely on approximations, and regulators cannot effectively oversee environmental compliance."

The VU Amsterdam study called for "new policies mandating disclosure of AI-specific metrics, including locations where AI systems operate, scale of operations at each site, and water usage effectiveness values for individual facilities." Without transparency, environmental impact remains "largely hidden from public view."

What Comes Next

Research identifies several priorities:

Mandate Environmental Disclosure: Regulators should require companies to report energy consumption, carbon emissions, and water use disaggregated by facility and workload. The EU AI Act's environmental provisions offer a starting point.

Integrate Environmental Criteria into Siting Decisions: Planning authorities should factor water stress, grid carbon intensity, and renewable energy availability into permitting, rather than allowing market forces alone to determine location.

Accelerate Clean Energy Deployment: Renewable transition must match AI demand growth -- not only contracting capacity but ensuring timely grid connection.

Invest in Algorithmic Efficiency: Research funding should prioritize computational efficiency, reducing both training and inference costs. Google's 33-fold year-over-year per-prompt improvement demonstrates the potential.

Protect Communities: Environmental and economic costs -- water consumption, electricity price increases -- fall disproportionately on local communities. Equity must inform every planning stage.

Conclusion

The AI industry faces a crossroads. Infrastructure decisions regarding location, power sources, and cooling "will determine whether artificial intelligence accelerates the transition to a sustainable economy or becomes a significant new burden on the planet's climate and water resources."

The pathway exists. Smart siting, grid decarbonization, and operational efficiency "can together cut AI's environmental footprint by roughly three-quarters." Success requires "the same urgency, coordination, and accountability that the industry brings to building the AI systems themselves. The carbon cost of intelligence is real -- but it is not yet inevitable."

References

  1. IEA (2025). Global Energy Review 2025.
  2. Nature Sustainability (2025). "Environmental impact and net-zero pathways for sustainable AI servers in the USA." Nov 10.
  3. de Vries-Gao, A. (2025). "The carbon and water footprints of data centers and what this could mean for AI." ScienceDirect, Dec 17.
  4. Cornell Chronicle (2025). "'Roadmap' shows the environmental impact of AI data center boom." Nov.
  5. MIT News (2025). "Explained: Generative AI's environmental impact." Jan 17.
  6. MIT News (2025). "Responding to the climate impact of generative AI." Sept 30.
  7. MIT Technology Review (2025). "We did the math on AI's energy footprint." May 20.
  8. Carbon Brief (2025). "AI: Five charts that put data-centre energy use - and emissions - into context." Sept.
  9. Euronews (2025). "AI data centres could have a carbon footprint that matches small European country." Dec 20.
  10. Carbon Direct (2025). "Understanding the carbon footprint of AI and how to reduce it."
  11. Online Learning Consortium (2025). "The Real Environmental Footprint of Generative AI." Dec 4.