The AI energy shock: Why hyperscalers are no longer easy to decarbonise

Artificial intelligence (AI) is powering a digital revolution – and it’s hungry for electricity. As AI workloads move from experimentation to full-scale deployment, the infrastructure behind them is demanding far more power than traditional cloud computing ever did.

Hyperscalers – the giants of cloud computing – were once seen as relatively straightforward to decarbonise. Their size allowed them to procure renewables directly and run energy-efficient data centres. But the AI boom has upended that narrative. Surging compute demand is driving a steep rise in electricity use, forcing these companies to rethink their energy strategies and putting climate targets under serious pressure.

According to the International Energy Agency (IEA), global electricity consumption from data centres is projected to roughly double by 2030, reaching around 945 terawatt-hours (TWh) – nearly 3% of global electricity use. Between 2024 and 2030, data centre electricity consumption is expected to grow by around 15% annually – more than four times faster than total consumption from all other sectors.
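
As a quick plausibility check on these figures, compounding 15% annual growth over the six years from 2024 to 2030 multiplies demand by roughly 2.3x – consistent with consumption roughly doubling to around 945 TWh. A minimal sketch (the 2024 baseline below is inferred from the projection, not a quoted figure):

```python
# Plausibility check: does ~15%/yr growth from 2024 to 2030
# roughly double data centre electricity demand?
annual_growth = 0.15
years = 6  # 2024 -> 2030

factor = (1 + annual_growth) ** years  # compound growth multiplier
print(f"Growth multiplier over {years} years: {factor:.2f}x")

# Working backwards from the ~945 TWh 2030 projection gives an
# implied 2024 baseline (an inference, not a figure from the text).
projected_2030_twh = 945
implied_2024_twh = projected_2030_twh / factor
print(f"Implied 2024 baseline: {implied_2024_twh:.0f} TWh")
```

The multiplier comes out slightly above 2, which is why "around 15% annually" and "double by 2030" are mutually consistent only as rounded figures.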

In the U.S., data centres could account for 8.6% of total electricity demand by 2030, according to BloombergNEF.

This surge raises a critical question: can the clean energy system scale fast enough to keep up?

Who will power the AI boom?

To meet this surging demand while also honouring their climate commitments, tech giants are racing to secure clean, reliable electricity. Solar, wind, and hydro remain central to most strategies, with companies like Microsoft, Google, and Amazon signing massive power purchase agreements (PPAs) to fund renewable infrastructure.

However, renewables alone may not be enough. AI data centres require 24/7 power availability – often in regions that lack strong renewable generation. That’s pushing some hyperscalers to explore nuclear options, particularly small modular reactors (SMRs), which offer high availability and can be sited closer to data centre campuses.

Yet, significant obstacles remain. In many regions, permitting delays, transmission bottlenecks, and political resistance are slowing clean energy deployment. In the U.S., for example, interconnection queues for new wind and solar projects can exceed five years.

Where clean energy can't be scaled fast enough, some operators are turning – or returning – to fossil fuels, sometimes with carbon capture. This risks undercutting corporate climate commitments and broader decarbonisation efforts.

What's slowing down data centre projects?

Source: Schneider Electric, AlphaStruxure and Data Center Frontier, 2025

Net-zero ambitions under pressure

The AI boom risks creating a “demand trap” – where absolute fossil fuel generation increases despite record renewable additions, simply because clean supply can’t keep up with runaway demand.

The IEA estimates that renewables will meet nearly half of the growth in global data centre electricity demand between now and 2030. The rest will be met by natural gas and coal – unless hyperscalers, regulators, and grid operators can shift the trajectory.

Sources of global electricity generation for data centres, Base Case, 2020-2035

Source: IEA, April 2025.

This puts pressure on tech companies with bold climate pledges. Microsoft and Google aim for net-zero emissions by 2030, while Amazon and Meta target 2040. But many of these goals were set before the AI boom.

There’s also a credibility gap in emissions reporting. Many data centres use market-based accounting, which reflects contractual clean energy purchases – such as Renewable Energy Certificates (RECs) and virtual Power Purchase Agreements (PPAs) – rather than the actual carbon intensity of the grid electricity used.

This can dramatically understate true emissions. Unbundled RECs or virtual PPAs let companies claim “green power” without physically sourcing it, which may do little to displace fossil generation. Critics argue these instruments often lack additionality and may simply reshuffle clean electrons on paper.
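
The gap between the two accounting methods is easy to illustrate. A minimal sketch with hypothetical numbers (the load, grid intensities, and REC volumes below are illustrative assumptions, not drawn from any real operator):

```python
# Hypothetical data centre: all figures below are illustrative.
load_mwh = 1_000_000            # annual electricity consumed
grid_intensity = 0.45           # tCO2 per MWh, local grid average
recs_mwh = 900_000              # unbundled RECs / virtual PPA volume
residual_mix_intensity = 0.50   # tCO2/MWh applied to unmatched load

# Location-based: emissions of the grid electricity actually consumed.
location_based = load_mwh * grid_intensity

# Market-based: contractual clean purchases count as zero-carbon;
# only the unmatched remainder carries the residual mix intensity.
market_based = max(load_mwh - recs_mwh, 0) * residual_mix_intensity

print(f"Location-based: {location_based:,.0f} tCO2")
print(f"Market-based:   {market_based:,.0f} tCO2")
```

Here the market-based figure is an order of magnitude lower, even though the electricity actually consumed – and the fossil generation on the local grid – is unchanged. That divergence is precisely the gap critics point to when RECs lack additionality.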

The AI land grab for clean energy

Even if data centre efficiency improves, total emissions could still rise unless grids decarbonise in parallel. If hyperscalers consume a disproportionate share of new renewable capacity, other sectors – like manufacturing and transport – could be left with more carbon-intensive electricity, slowing broader progress toward net-zero.

But there’s another side to the story. A study by the Grantham Research Institute and Systemiq finds that AI could reduce global greenhouse gas emissions by 3.2 to 5.4 billion tonnes annually by 2035, potentially outweighing increases from global power consumption of data centres and AI.

AI could drive emissions reductions across five key areas:

  1. Optimising complex systems (like energy grids)
  2. Accelerating innovation (e.g., in clean tech)
  3. Influencing behaviour (through smarter decision-making)
  4. Enhancing climate modelling and policy
  5. Improving resilience and adaptation

In sectors like power, transport, and food – which together account for about 50% of global emissions – AI can significantly boost efficiency, for example by increasing renewable energy output and accelerating the adoption of alternative proteins.

The bottom line

Electricity is becoming the scarcest input in the AI economy. The perception that hyperscalers were easy to decarbonise is quickly fading. Their soaring energy demand is testing the limits of clean energy supply, grid infrastructure, and emissions accountability.

Whether hyperscalers catalyse or complicate the net-zero pathway depends on the choices made now. Procurement strategies, siting decisions, and grid partnerships formed this decade will shape the emissions profile and the competitive edge of the digital infrastructure powering AI.

The world’s digital future may be built on AI – but whether it’s clean or carbon-intensive depends on how quickly the energy system can evolve.
