A new threat to our climate: AI
Artificial intelligence (AI) is more accessible than ever. With tools like ChatGPT, Claude, Luma, and other generative AI (GenAI) services, we can generate text, images, and videos with ease. However, with every new GenAI development, the energy costs and carbon footprint keep rising. As more and more people embrace these technologies, it’s important to understand the significant environmental toll they’re taking.

AI’s energy demands
If data centers were a country, they would have been the 11th largest electricity consumer in the world in 2022. By 2026, they could climb to 5th place. One of the main drivers behind this huge increase in data center electricity consumption is generative AI.
GenAI requires huge amounts of computational power, which is supplied by data centers running 24/7. These centers store and process vast amounts of data, consuming substantial amounts of resources like water and electricity. Between 2022 and 2023, as the demand for AI grew, Microsoft reported a 34% increase in the water consumption of its data centers. The International Energy Agency (IEA) estimates that data centers consumed nearly 2% of global electricity in 2022. Their 2024 report shows that this demand is expected to double by 2026. This year, they reported that by 2030 data processing, mainly for AI, is likely to require more electricity than the combined manufacturing of steel, cement, chemicals, and other energy-intensive goods.
As more companies integrate GenAI into their services, this ever-growing energy demand brings with it a massive environmental cost.
AI’s CO2 emissions
While companies aren’t transparent about the exact amount of carbon emissions resulting from AI activity, we do know that the increase in AI usage is driving up emissions. Google’s 2024 environmental report showed a 48% increase in emissions compared to 2019, largely due to rising energy consumption in their data centers. Other tech giants like Microsoft have also reported higher energy usage as they focus on expanding their AI capabilities.
So how much CO2 does AI actually produce? The full extent remains unclear. Let’s start with how much energy AI consumes, which can be divided into three stages.
1. Training of AI models
The training of an AI model involves vast amounts of data, stored and processed in energy-hungry data centers. For example, training a model like GPT-3 requires about 1 300 megawatt hours of electricity and releases 552 tonnes of CO2. That is roughly equivalent to the yearly electricity consumption of 130 average homes. The bigger the AI model, the more energy it needs.
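To make that comparison concrete, here is a rough back-of-envelope sketch in Python. The per-home figure is an assumption (about 10 MWh of electricity per household per year), not a number taken from this article.

training_energy_mwh = 1_300      # estimated electricity used to train a GPT-3-sized model
avg_home_mwh_per_year = 10       # assumption: ~10 MWh per household per year

homes_equivalent = training_energy_mwh / avg_home_mwh_per_year
print(f"Training ≈ yearly electricity of {homes_equivalent:.0f} homes")  # prints ~130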
2. Using AI models
After the initial training, the model is ready for users. While a single request might seem small, it has a larger carbon footprint than a regular web search, requiring about 10 times as much energy. If everyone used ChatGPT instead of a regular search engine for their queries, the extra emissions could add up to around 7 200 tonnes of CO2 every single day: the equivalent of 38 passenger planes flying back and forth between London and New York daily, or of flying almost 7 million passengers per year.
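For illustration, here is a similar back-of-envelope sketch of the flight comparison. The per-flight and per-plane figures are assumptions chosen for illustration, not numbers from this article.

daily_extra_co2_tonnes = 7_200       # daily figure cited above
co2_per_round_trip_tonnes = 190      # assumption: CO2 per London-New York round trip
passengers_per_plane = 250           # assumption: passengers carried on each leg

planes_per_day = daily_extra_co2_tonnes / co2_per_round_trip_tonnes
passengers_per_year = planes_per_day * passengers_per_plane * 2 * 365
print(f"~{planes_per_day:.0f} round trips per day")                     # prints ~38
print(f"~{passengers_per_year / 1e6:.1f} million passengers per year")  # prints ~6.9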
3. Storing data
Whether it’s AI-generated content, emails, or videos, all data we produce needs to be stored in data centers. The IEA estimates that in 2022, data centers consumed around 460 TWh of energy, which accounted for roughly 1% of global greenhouse gas emissions. By 2026, this number is expected to more than double, as AI demand continues to grow.
The major players in AI and their energy commitments
Amazon Web Services (AWS), Microsoft, and Google are the leading providers of the cloud infrastructure that powers AI. Before the AI hype started, these companies committed to running their data centers on 100% renewable energy by 2030. With growing power consumption, it’s unclear how they’re going to meet that goal. Firstly, despite their green promises, they are still heavily reliant on fossil fuels to meet their energy needs. What’s more, as AI activity has grown exponentially, these companies’ energy usage has increased at a similar rate. There is simply not enough clean energy to meet demand. So even if these companies were to use only green energy, other sectors would have to make do with much less. The only clear solution, therefore, is that all industries must significantly reduce their energy consumption.
AI’s impact on climate efforts
AI’s growing energy demand poses a real threat to the climate, as it could reverse the energy savings achieved in recent years. In many developed economies, efficiency gains have been essential in lowering energy consumption and emissions.
Since it can take up to 9 years to get renewable energy sources like wind or solar farms connected to the grid, the rapid expansion of AI-driven energy demand is likely to be met by the most readily available energy: fossil fuel sources.
In the US, for instance, coal plants are being kept online to meet the surge in energy demand from data centers that need to operate around the clock. This development could severely undermine global efforts to reduce carbon emissions and to phase out coal and gas.

What can be done to mitigate the impact?
There are several things that can be done to help minimize climate damage from AI-related CO2 emissions.
1. Making reporting mandatory
The National Engineering Policy Centre (NEPC) recommends that companies be required to disclose energy and water consumption, as well as CO2 emissions related to AI development and usage. Greater transparency will help us understand and address the environmental impact of AI.
2. Regulating AI energy use
We need regulation to minimize the impact of AI and data centers on the energy system. In addition to mandatory reporting for companies, which will provide the necessary insights, agencies like the IEA need to make practical recommendations to governments on how to regulate.
3. Using renewable energy sources
Though major cloud providers like AWS, Microsoft, and Google have set renewable energy goals, there’s still a long way to go. Clean energy needs to be used across all sectors, including AI. However, with Trump ordering coal plants to remain open to address the energy demands of data centers, tech companies may lack the incentive to make good on their promises. Governments will have to step in to hold them to it.
4. Improving data center efficiency
Optimizing AI systems and adopting more energy-efficient technologies, like liquid cooling, can significantly reduce energy consumption. These innovations can help limit the environmental toll of AI.
5. Using AI more mindfully
We, as users, can also take responsibility, for example by using traditional search engines for simple queries instead of relying on energy-intensive AI systems. Being more conscious about how we use AI can help limit its environmental impact.
Can AI help the climate transition?
Some believe AI could play a big role in tackling climate change. They say it can help identify vulnerable regions, inform climate policies, and boost sustainability efforts. The World Economic Forum even claims AI can make energy systems, transportation, and urban planning more energy efficient.
While these ideas sound great, it’s still very unclear how AI will make them a reality. It often seems more like part of a marketing narrative than a clear roadmap for action. Meanwhile, fossil fuel companies have already implemented AI systems to pump more oil out of wells that were previously considered depleted.
What has been clear for decades, though, is what causes the climate to change, and what we need to do to address it: cut down on CO2 emissions and absorb more of them. Scientists have agreed on this for a long time, and that should remain the real goal.
Moving forward
The environmental impact of AI needs to be carefully managed. This will require much more regulation, increased energy efficiency, and a shift to renewable energy sources across all industries. If we let AI continue to expand unchecked, we could see its carbon footprint spiral out of control, undoing years of progress in avoiding and mitigating the climate emergency.
