The environmental impact of AI: energy, carbon and water in the age of ChatGPT
Key takeaways
AI is driving explosive growth in data centre electricity and water demand, with inference loads multiplying faster than web traffic ever did.
A single AI query now uses about the same electricity as a classic Google search, but the environmental impact depends heavily on the grid and cooling system.
TikTok or Netflix sessions still carry heavier carbon footprints per minute than most AI prompts, but scaling matters: billions of queries per day reshape the baseline.
The big four hyperscalers (Microsoft, AWS, Google, Meta) lead in sustainability metrics, but diverge in transparency, water usage, and carbon offset practices.
Well-designed AI processes can drastically reduce environmental impact compared to traditional methods, as shown by Devera’s 50x lower carbon footprint per LCA.
AI workloads make water the new frontier: several providers now aim to be water-positive by 2030, with zero-water cooling emerging as a key design principle.
AI systems are fast becoming one of the biggest new energy and water consumers in digital infrastructure. While the impact of video streaming and social media has been studied for over a decade, the recent surge in large language models (LLMs) and generative AI introduces a new class of persistent, high-intensity workloads. In this article, we break down the measurable footprint of AI from model training to inference and compare it to other everyday digital activities.
We’ll also take a close look at the data centre strategies behind the world’s largest AI platforms, examining how Microsoft, AWS, Google and Meta are addressing the growing tension between performance, sustainability and transparency.
How much energy does AI actually use?
Global data centre electricity use is expected to reach 945 TWh by 2030, up from 415 TWh in 2024. A large share of this growth comes from the rising demand for AI inference, especially from generative models. Estimates suggest that AI alone may account for 652 TWh (69% of the total) by 2030, nearly an 80-fold jump from 2024 levels.

Table 1: Estimated electricity use by AI (global)
| Year | Estimated AI electricity use | Source |
|------|------------------------------|--------|
| 2022 | 23 TWh | IDC |
| 2030 | 652 TWh (projected) | Utility forecast |
This growth is being fuelled by the rollout of energy-intensive GPUs. A single Nvidia H100 draws up to 700 watts, and an eight-GPU node can reach 5.6 kW. Full racks of AI compute can exceed 240 kW, forcing a redesign of cooling infrastructure and power provisioning in modern data halls.
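The per-node and per-rack figures above follow from simple power arithmetic. A minimal sketch (GPU board power only; CPUs, networking and cooling overhead are excluded, so real racks hold fewer nodes):

```python
# Power math for AI compute, using the figures quoted above.
H100_WATTS = 700        # max board power of one Nvidia H100
GPUS_PER_NODE = 8

node_kw = H100_WATTS * GPUS_PER_NODE / 1000   # kW per 8-GPU node

# Rough node count that a 240 kW rack budget could host,
# counting GPU power alone:
nodes_per_240kw_rack = 240 / node_kw

print(f"{node_kw:.1f} kW per node, ~{nodes_per_240kw_rack:.0f} nodes per 240 kW rack")
```

Running this confirms the 5.6 kW node figure and shows why 240 kW racks force liquid cooling: that is an order of magnitude above the 10–20 kW racks traditional data halls were designed for.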
How much energy and carbon does an AI query use?
This table highlights the stark variability in energy and carbon intensity across leading AI models in 2025, revealing just how much optimization and divergence exists behind each API. OpenAI’s GPT-4o now operates at a lean 0.30 Wh per request, translating to just 0.13 grams of CO₂ on a global-average grid: a dramatic improvement over earlier GPT-4 estimates of 2.90 Wh. Meanwhile, Claude 3 Opus, Anthropic’s premium model, still clocks in at 4.05 Wh per query, making it one of the heaviest public models in active use. At the opposite end, Google’s Gemini 2.0 Flash is remarkably efficient at just 0.022 Wh, more than an order of magnitude below GPT-4o. Even Meta’s open-source Llama-3-70B shows a solid mid-range profile at 1.70 Wh. These differences don’t just affect cloud costs; they reshape the environmental equation, especially when scaled across billions of requests. The carbon impact per prompt is no longer a fixed constant but a design choice tied to model architecture, chip efficiency, and infrastructure.
Running the model: energy and carbon per user request
| Model (public API tier, 2025) | Energy per request* (Wh) | CO₂, global grid† (g) |
|-------------------------------|--------------------------|------------------------|
| GPT-4o (OpenAI) | 0.30 | 0.13 |
| GPT-4 (earlier estimate) | 2.90 | 1.29 |
| Claude 3 Opus (Anthropic) | 4.05 | 1.80 |
| Claude 3 Haiku | 0.22 | 0.10 |
| Gemini 2.0 Flash (Google) | 0.022 | 0.010 |
| Llama-3-70B (Meta, open-source) | 1.70 | 0.76 |
* Figures come from lab or vendor disclosures and assume a “typical” 400-token exchange.
† Global-average grid carbon intensity of 445 g CO₂/kWh in 2024 (IEA).
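The CO₂ column follows directly from the energy column and the grid factor in the footnote. A quick sketch that reproduces the table values:

```python
GRID_G_PER_KWH = 445  # global-average grid intensity, 2024 (IEA)

def co2_grams(wh_per_request: float) -> float:
    """CO2 in grams for one request at the global-average grid mix."""
    return wh_per_request / 1000 * GRID_G_PER_KWH

# Energy-per-request figures from the table above (Wh)
models = {
    "GPT-4o": 0.30,
    "Claude 3 Opus": 4.05,
    "Gemini 2.0 Flash": 0.022,
    "Llama-3-70B": 1.70,
}
for name, wh in models.items():
    print(f"{name}: {co2_grams(wh):.3f} g CO2 per request")
```

Swap in a regional grid factor (e.g. a low-carbon Nordic grid at well under 100 g CO₂/kWh) and the same query can carry a fraction of the footprint, which is why where a model runs matters as much as which model it is.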
How do AI queries compare to other digital habits?
For inference, OpenAI recently disclosed that a typical ChatGPT query consumes ~0.34 Wh, putting AI prompts roughly on par with Google searches (~0.3 Wh) and well below short-form video or streaming.
Electricity and CO₂ footprint of common digital actions (per 10 units)

| Activity | Energy (Wh) | CO₂ (g, global avg) |
|----------|-------------|---------------------|
| ChatGPT × 10 queries | 3.4 | 1.5 |
| TikTok × 10 min | 66 | 29.2 |
| Google Search × 10 queries | 3 | 1.3 |
| Netflix × 10 min | 12.8 | 5.7 |
While a single AI query may consume relatively little energy, the story changes at scale: billions of daily requests quickly add up to significant power demands. That’s why it's crucial to benchmark AI workloads against familiar digital habits. Comparing them to activities like Google searches, video streaming, or social media use helps contextualize their environmental impact and makes the discussion more tangible for users, policymakers and infrastructure planners alike.
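To see how quickly small per-query figures compound, here is a back-of-envelope scaling of the ~0.34 Wh figure. The one-billion-queries-per-day volume is an illustrative assumption, not an OpenAI disclosure:

```python
# Scale check: tiny per-query energy, very large daily volume.
WH_PER_QUERY = 0.34      # OpenAI's disclosed figure for a typical query
QUERIES_PER_DAY = 1e9    # assumed volume, for illustration only
GRID_G_PER_KWH = 445     # global-average grid intensity (IEA, 2024)

daily_mwh = WH_PER_QUERY * QUERIES_PER_DAY / 1e6   # Wh -> MWh
annual_gwh = daily_mwh * 365 / 1000                # MWh -> GWh
annual_tonnes_co2 = annual_gwh * GRID_G_PER_KWH    # 1 GWh * (g/kWh) = tonnes

print(f"{daily_mwh:,.0f} MWh/day, {annual_gwh:,.1f} GWh/yr, "
      f"{annual_tonnes_co2:,.0f} t CO2/yr")
```

Under these assumptions the total lands in the order of 124 GWh and ~55,000 t CO₂ per year: negligible next to global data centre demand, but no longer a rounding error either.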
What about training? The big hit before deployment
Training large foundation models like GPT-3 or GPT-4 remains the most resource-intensive phase.
Training impact of selected LLMs
| Model | Training energy | Estimated CO₂ | Cooling water |
|-------|-----------------|---------------|---------------|
| GPT-3 | 1.287 GWh | ~552 t CO₂e | 5.4 million L |
| GPT-4 | ~52–62 GWh (est.) | ~12–15 kt CO₂e | Unknown |
While training happens less frequently, its energy and water use per run can rival or exceed some small countries' weekly usage.
Water: the hidden cost of generative AI
Inference workloads also require significant water, primarily for cooling. Estimates show that each ChatGPT query uses 0.32 mL of water, a drop individually, but enormous at scale. Microsoft’s West Des Moines campus, for example, consumes ~43 million L/month in summer, representing 6% of the city's total draw.
Researchers project that AI-driven data centres could withdraw up to 6.6 billion m3 of freshwater by 2027, more than four times Denmark’s annual usage.
Global outlook
AI could withdraw 4.2–6.6 billion m³ of freshwater in 2027, evaporating 0.38–0.60 billion m³ outright. That is roughly four to six times Denmark’s annual freshwater withdrawal.
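The Denmark comparison is simple division. The reference value for Denmark’s annual freshwater withdrawal (~1.1 billion m³) is an assumption used here for scale, not a figure from the article’s sources:

```python
# Projected 2027 AI water withdrawal vs a whole country's annual withdrawal.
AI_WITHDRAWAL_LOW_M3 = 4.2e9    # low end of the 2027 projection
AI_WITHDRAWAL_HIGH_M3 = 6.6e9   # high end of the 2027 projection
DENMARK_ANNUAL_M3 = 1.1e9       # assumed reference value, for scale only

low_multiple = AI_WITHDRAWAL_LOW_M3 / DENMARK_ANNUAL_M3
high_multiple = AI_WITHDRAWAL_HIGH_M3 / DENMARK_ANNUAL_M3
print(f"~{low_multiple:.1f}x to ~{high_multiple:.1f}x Denmark's annual withdrawal")
```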
Case study: Microsoft in West Des Moines
The company’s Iowa campus peaks at 11.5 million gallons/month (~43 million L) during summer, about 6% of the city’s total usage.
Corporate response
To tackle this, Microsoft now implements a closed-loop cooling system that eliminates evaporative losses, saving >125 million L per datacentre per year.
Data centers: the cloud wars go green
The four major hyperscalers (Microsoft Azure, Amazon Web Services (AWS), Google Cloud, and Meta) present the most comprehensive and measurable sustainability strategies on the market as of mid-2025. Other providers (Oracle Cloud, IBM Cloud, etc.) share similar goals but currently publish fewer operational metrics or have set less ambitious climate targets.
Let’s compare the main hyperscalers by sustainability metric.
Sustainability snapshot of major AI/cloud providers (2025)
| Provider | Carbon goal | Renewable energy | Power Usage Effectiveness (PUE) | Water goal | Water Usage Effectiveness (WUE) |
|----------|-------------|------------------|---------------------------------|------------|--------------------------------|
| Microsoft | Carbon-negative 2030 | 90% (target: 100% by 2025) | 1.125 | Water-positive 2030 | ND (no water in new designs) |
| AWS | Net-zero 2040 | 100% since 2023 | 1.15 | Water-positive 2030 (53% progress) | 0.15 L/kWh |
| Google | Net-zero 2030 | 100% annually, 64% CFE 24/7 | 1.10 | Replenish 120% of water | ~1 L/kWh |
| Meta | Net-zero 2030 | 100% since 2020 | 1.08 | Replenish 200% in high-stress areas | 0.20 L/kWh |
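PUE and WUE translate directly into overhead energy and water: PUE is total facility energy divided by IT energy, and WUE is litres of water consumed per kWh of IT energy. A sketch of what the table’s figures imply for a hypothetical 1 MW IT load running year-round (the load size is an assumption for illustration):

```python
# What the quoted PUE/WUE figures mean for a hypothetical 1 MW IT load.
providers = {
    "Microsoft": {"pue": 1.125, "wue": None},  # new designs target zero water
    "AWS":       {"pue": 1.15,  "wue": 0.15},
    "Google":    {"pue": 1.10,  "wue": 1.0},
    "Meta":      {"pue": 1.08,  "wue": 0.20},
}
IT_KWH_PER_YEAR = 1000 * 24 * 365  # 1 MW of IT load, running continuously

results = {}
for name, m in providers.items():
    facility_mwh = IT_KWH_PER_YEAR * m["pue"] / 1000          # total energy
    water_m3 = (IT_KWH_PER_YEAR * m["wue"] / 1000             # cooling water
                if m["wue"] is not None else 0.0)
    results[name] = (facility_mwh, water_m3)
    print(f"{name}: {facility_mwh:,.0f} MWh facility energy, "
          f"{water_m3:,.0f} m3 water per year")
```

The spread is striking: at ~1 L/kWh the same 1 MW load consumes several thousand cubic metres of water per year, while a zero-water design consumes none, trading water for somewhat higher electricity use in the cooling system.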
Microsoft Azure
Climate roadmap: Carbon negative, water positive, and zero waste by 2030; 100% renewable electricity by 2025.
Operational efficiency: Average PUE of 1.125 in next-gen data centers, outperforming the industry average of 1.4–1.6.
Innovations:
– Waterless cooling and liquid immersion systems that cut GHG emissions by 15–21% and reduce water usage by 31–52%.
– Hydrogen fuel cells for 3 MW zero-emission backup power.
Challenges: Despite water-saving designs, rising water consumption in Texas has led to local opposition.
Amazon Web Services (AWS)
Energy: Reached 100% renewable electricity matching in 2023; net-zero target for 2040.
Efficiency: Global PUE of 1.15 (best site at 1.04).
Water: Water Positive by 2030 initiative; global WUE of 0.15 L/kWh (down 40% since 2021).
Innovations:
– Use of recycled water and community water recharge projects; reached 53% of 2030 water goal by 2024.
– Investment in modular nuclear reactors to decarbonize electricity in low-renewable regions.
Challenges: Heavy reliance on Renewable Energy Certificates (RECs) and surging AI workloads complicate real emission reductions.
Google Cloud
“24/7 Carbon-Free Energy (CFE)” strategy: Operate every data center with hourly carbon-free energy by 2030; reached 64% globally in 2024 (10 regions at ≥90%).
Efficiency: Average PUE of 1.10; publishes detailed campus-level data.
Water: Commitment to replenish 120% of freshwater consumed and improve local watersheds.
Innovations & challenges:
– Secured geothermal, battery, and nuclear contracts to support 24/7 CFE goals.
– AI-driven growth raised water footprint 17% in 2023, sparking transparency concerns.
Meta (Facebook)
Energy: 100% renewable since 2020; net-zero across full value chain by 2030.
Efficiency: PUE of 1.08 and WUE of 0.20 L/kWh; all facilities LEED Gold certified.
Water: Return 200% of water in high-stress regions, 100% in medium-stress regions.
Innovations: 150 MW geothermal agreement in New Mexico; exploring micro‑nuclear for AI workloads.
Challenges: Discrepancy between location-based and market-based emissions raises questions about RECs' actual impact.
Other Providers
| Provider | Key sustainability highlights |
|----------|-------------------------------|
| Oracle Cloud Infrastructure (OCI) | 100% renewables by 2025; net-zero by 2050; PUE of 1.30 in most efficient region. Uses rain capture and xeriscaping to reduce water needs. |
| IBM Cloud | 75% renewables by 2025 (90% by 2030), net-zero by 2030. Promotes on-site solar and water management through its Sustainability Accelerator. |
| Emerging hyperscale/colocation | The EU’s Climate Neutral Data Centre Pact targets PUE ≤ 1.3 and water use < 2.5 L/kWh by 2025. Providers like Switch and CyrusOne already deploy zero-water cooling systems. |
Key sustainability takeaways across data centre providers:
Microsoft and Google lead with the most aggressive climate roadmaps (carbon negative / 24/7 CFE) and innovations in waterless cooling and carbon-free backup power.
AWS excels in rapid renewable deployment and was the first to launch a water-positive target, though its net-zero goal is set a decade later (2040).
Meta offers the best operational PUE–WUE ratios and runs entirely on renewables, but its explosive AI expansion strains sustainability performance.
Oracle, IBM, and others are progressing but show less transparency and have longer-term goals.
Environmental impact of AI-powered LCA with Devera
Each product footprint calculated with Devera avoids hours of manual consulting, unnecessary meetings, and carbon-heavy travel. But it also triggers:
Between 0.03M and 0.4M tokens processed per LCA.
Several seconds of high-intensity computation.
And a short but meaningful spike in energy usage inside a data center, typically powered by AWS or Microsoft Azure.
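A rough estimate of what that spike amounts to per LCA. The energy-per-token figure below is an assumption for illustration, not a Devera or vendor disclosure:

```python
# Back-of-envelope energy and carbon per Devera LCA run.
WH_PER_1K_TOKENS = 1.0                       # assumed LLM inference cost
TOKENS_LOW, TOKENS_HIGH = 30_000, 400_000    # 0.03M-0.4M tokens per LCA
GRID_G_PER_KWH = 445                         # global-average grid (IEA, 2024)

low_wh = TOKENS_LOW / 1000 * WH_PER_1K_TOKENS
high_wh = TOKENS_HIGH / 1000 * WH_PER_1K_TOKENS
low_g = low_wh / 1000 * GRID_G_PER_KWH
high_g = high_wh / 1000 * GRID_G_PER_KWH

print(f"{low_wh:.0f}-{high_wh:.0f} Wh and {low_g:.1f}-{high_g:.1f} g CO2 per LCA")
```

Even at the high end this is grams of CO₂ per report, which is what makes the comparison against hours of consulting work and travel so lopsided.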
To be clear, Devera’s footprint is small compared to massive AI labs, but the direction matters. If we're not careful, we could end up solving environmental problems while contributing silently to another one.
First estimates comparing the traditional LCA process with Devera’s AI-based LCA process (to be refined)*

Based on our assumptions, an LCA performed by a traditional consultant requires more than 50 times the energy of an AI-based LCA produced with Devera.
Conclusion: AI is reshaping the sustainability equation
The environmental footprint of AI is complex, dynamic and still evolving. While per-query impacts are trending downward thanks to more efficient chips and smarter infrastructure, the sheer scale of demand poses serious long-term risks. Leaders like Microsoft and Google are pioneering zero-water cooling and hourly carbon tracking, but the road to net-zero AI will require deeper transparency, stricter regulation, and better workload design.
If AI is going to be the new electricity, then it must also follow the same rules: measured, monitored and decarbonised at source.