AI is often framed as a silver bullet for the world's biggest problems — including climate change. But the story is more complicated. Training a single large AI model can emit as much carbon as five cars over their entire lifetimes. At the same time, AI is helping scientists discover new materials for batteries, optimise entire electricity grids, and model the climate with unprecedented accuracy.
So which is it — problem or solution? The honest answer is: both.
Modern AI models are power-hungry. Training large language models requires running thousands of specialised chips for weeks or months. Those chips need electricity. That electricity, depending on where data centres are located, may come from coal, gas, or renewables.
A 2019 study from the University of Massachusetts Amherst estimated that training a single large natural language model can emit roughly 300 tonnes of CO₂ — comparable to the lifetime emissions, fuel included, of five average American cars.
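The comparison is simple arithmetic, and it is worth seeing the moving parts. The figures below are the commonly cited ones from that study (roughly 626,000 lbs of CO₂e for one large training run, and roughly 126,000 lbs for an average American car over its lifetime, manufacturing and fuel included); treat them as ballpark inputs, not precise measurements.

```python
# Back-of-envelope check of the "five cars" comparison.
# Both input figures are rough estimates, not measurements.
LBS_PER_TONNE = 2204.62

training_lbs = 626_155     # estimated CO2e for one large NLP training run
car_lifetime_lbs = 126_000 # average US car lifetime, manufacturing + fuel

training_tonnes = training_lbs / LBS_PER_TONNE
cars_equivalent = training_lbs / car_lifetime_lbs

print(f"{training_tonnes:.0f} tonnes CO2e, about {cars_equivalent:.1f} car lifetimes")
```

Note that the per-car figure drives the headline: halve it and "five cars" becomes ten, so the comparison is only as solid as its inputs.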
Since then, models have grown dramatically larger. The energy bill has grown with them.
Google reported that in 2023, its data centres consumed roughly 24 terawatt-hours of electricity — on the order of what a mid-sized country such as Ireland uses in a year. A growing fraction of that powers AI workloads.
And it's not just training. Every time you ask a chatbot a question, a data centre somewhere uses a small amount of energy to generate your response. Multiplied by billions of queries per day, inference (running a trained model) adds up too.
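The per-query cost is tiny, but the multiplication is what matters. The numbers below are illustrative assumptions: published per-query estimates for chatbots range from a few tenths of a watt-hour to a few watt-hours, and the query volume and household figure are round numbers, not measurements.

```python
# Rough scale of inference energy. Every input here is an assumption
# chosen for illustration, not a measured figure.
WH_PER_QUERY = 1.0        # assumed; published estimates span roughly 0.3-3 Wh
QUERIES_PER_DAY = 1e9     # assumed round number for "billions of queries"
KWH_PER_US_HOME = 10_700  # assumed average annual US household electricity use

gwh_per_year = WH_PER_QUERY * QUERIES_PER_DAY * 365 / 1e9
homes_equivalent = gwh_per_year * 1e6 / KWH_PER_US_HOME

print(f"~{gwh_per_year:.0f} GWh/year, roughly {homes_equivalent:,.0f} US homes")
```

Even at a modest one watt-hour per query, a billion daily queries lands in the hundreds of gigawatt-hours per year — small next to training the largest models only if you forget that inference runs forever.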
Now for the other side of the ledger. AI is being applied directly to climate challenges — and some of those applications are genuinely transformative.
Traditional climate models run on supercomputers and take days to simulate decades of weather. AI emulators — trained to mimic physics-based models — can run the same simulations thousands of times faster. This lets scientists explore far more scenarios, improving the accuracy of projections and giving policymakers better data.
Electricity grids are bewilderingly complex: supply and demand fluctuate every second, and renewable sources like solar and wind are intermittent. AI systems can forecast demand more accurately, predict when the wind will drop, and dynamically reroute power — reducing waste and enabling more renewables on the grid.
DeepMind applied AI to Google's own data centre cooling systems and reduced cooling energy use by 40%. The same approach is now being used in electricity grid management.
One of the bottlenecks in clean energy is materials. Better batteries require new chemistries. More efficient solar panels require better semiconductors. Discovering these materials through lab experiments takes years. AI can screen millions of candidate materials computationally, predicting which are worth synthesising.
In 2023, Google DeepMind's GNoME model predicted 2.2 million new inorganic crystal structures, around 380,000 of which are estimated to be stable — a potential treasure trove for battery and solar panel developers.
If AI can speed up the discovery of better battery materials from decades to years, how much of its own carbon cost might it offset? Is the arithmetic straightforward — or does it depend on timing?
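It depends on timing, and a toy model shows why. In the sketch below, every number is invented for illustration: the AI effort emits a fixed amount each year, while the avoided emissions only begin once the discovered material is actually deployed.

```python
# Toy break-even model for the offset question. All figures are
# made up to illustrate the shape of the arithmetic, nothing more.
EMIT_PER_YEAR = 1_000   # tonnes CO2e/year spent on the AI effort (assumed)
AVOID_PER_YEAR = 5_000  # tonnes/year avoided once deployed (assumed)
DEPLOY_DELAY = 6        # years until the material reaches deployment (assumed)

def break_even_year(horizon=50):
    """First year in which cumulative avoided emissions catch up."""
    emitted = avoided = 0
    for year in range(1, horizon + 1):
        emitted += EMIT_PER_YEAR
        if year > DEPLOY_DELAY:
            avoided += AVOID_PER_YEAR
        if avoided >= emitted:
            return year
    return None

print(break_even_year())
```

The point of the toy: even with a fivefold annual payoff, the carbon cost is paid up front and the benefit arrives late, so break-even comes years after deployment — and if deployment never happens, the cost is never repaid at all.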
Agriculture accounts for roughly 10% of global greenhouse gas emissions. AI-powered precision agriculture — using sensors, satellite imagery, and machine learning — helps farmers apply fertiliser and water only where needed, reducing nitrous oxide emissions and water waste.
Is AI net positive or net negative for the climate? That depends heavily on where its data centres draw their power, how efficiently models are trained and run, and whether its applications genuinely displace emissions rather than simply adding new demand.
There's a real risk of rebound effects: AI makes energy use more efficient, but that efficiency makes energy cheaper, which encourages more use. Efficiency gains can paradoxically lead to higher total consumption — a pattern economists call the Jevons paradox.
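The arithmetic of the rebound effect is worth making explicit. In the sketch below (illustrative numbers only), a 40% efficiency gain cuts energy per task, but whether total energy falls depends entirely on how much demand grows in response.

```python
# Rebound effect in miniature. Baseline workload and demand
# multipliers are invented for illustration.
baseline_tasks = 100
energy_per_task = 1.0

before = baseline_tasks * energy_per_task

efficient_energy = energy_per_task * 0.6  # 40% efficiency gain
for growth in (1.2, 1.5, 2.0):            # assumed demand multipliers
    after = baseline_tasks * growth * efficient_energy
    change = (after - before) / before * 100
    print(f"demand x{growth}: total energy {change:+.0f}%")
```

With these numbers, total energy use falls at 1.2× and 1.5× demand but rises at 2×: any demand growth above about 1.67× (the reciprocal of 0.6) wipes out the efficiency gain entirely.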
Technology is not destiny. Choices made by governments, companies, and individuals about how and where AI runs will determine whether it helps or hurts.