The rapid rise of artificial intelligence has sparked intense debate about its environmental impact. As generative AI tools like ChatGPT become part of our daily routines and tech giants pour billions into AI infrastructure, the question becomes pressing: just how damaging is this technology for the planet?
The numbers are staggering
Apple announced plans to spend $500 billion on manufacturing and data centres in the US over the next four years. Google expects to spend $75 billion on AI infrastructure alone in 2025. This massive investment comes with an enormous energy price tag.
Scientists have estimated that the power requirements of data centres in North America increased from 2,688 megawatts at the end of 2022 to 5,341 megawatts at the end of 2023, partly driven by the demands of generative AI. To put this in perspective, the electricity consumption of data centres globally rose to 460 terawatt-hours in 2022. Had data centres been a country, that would have made them the 11th-largest electricity consumer in the world, between Saudi Arabia (371 terawatt-hours) and France (463 terawatt-hours).
The reality is even more concerning when you consider where this energy comes from. The carbon intensity of electricity used by data centres was 48% higher than the US average, meaning they draw on dirtier sources of power than the US grid as a whole.
What happens when you use AI?
Each time you ask ChatGPT a question or generate an image, you’re triggering a surprisingly energy-intensive process. A single generative AI request can take up to 10 times more energy to complete than a regular Google search, according to the Electric Power Research Institute.
Recent research provides some concrete figures. In June 2025, OpenAI executive Sam Altman stated that the average ChatGPT query used approximately 0.34 Wh (1.2 kJ) of electricity and 8.5 × 10^-5 US gal (0.32 ml) of water. While that might sound minimal for a single query, the scale matters enormously. ChatGPT is now estimated to be the fifth-most visited website in the world, just after Instagram and ahead of X.
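Those per-query figures are easiest to grasp when scaled up. A minimal back-of-envelope sketch, in which the daily query volume is an illustrative assumption rather than a reported figure:

```python
# Scaling Sam Altman's per-query figures to a hypothetical daily volume.
WH_PER_QUERY = 0.34            # watt-hours per ChatGPT query (Altman, June 2025)
ML_WATER_PER_QUERY = 0.32      # millilitres of water per query
ASSUMED_QUERIES_PER_DAY = 1_000_000_000  # assumption: one billion queries/day

daily_mwh = WH_PER_QUERY * ASSUMED_QUERIES_PER_DAY / 1e6             # Wh -> MWh
daily_water_m3 = ML_WATER_PER_QUERY * ASSUMED_QUERIES_PER_DAY / 1e6  # ml -> m^3

print(f"{daily_mwh:.0f} MWh of electricity and {daily_water_m3:.0f} m^3 of water per day")
```

At that assumed volume, the “minimal” per-query cost adds up to hundreds of megawatt-hours and hundreds of cubic metres of water every day.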
For context, generating one image takes as much energy as fully charging your smartphone, according to research from Hugging Face and Carnegie Mellon University. This has significant implications as these tools become integrated into everything from email to social media platforms.
The energy needed to train them up
Creating AI models requires a substantial amount of energy before they can answer their first query. The power needed to train and deploy a model like OpenAI’s GPT-3 is difficult to ascertain. In a 2021 research paper, scientists from Google and the University of California at Berkeley estimated the training process alone consumed 1,287 megawatt hours of electricity (enough to power about 120 average U.S. homes for a year), generating about 552 tons of carbon dioxide.
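The “about 120 homes” comparison can be cross-checked with simple arithmetic. A quick sketch, assuming an average US household uses roughly 10,700 kWh per year (approximately the EIA figure; the article does not state it):

```python
# Cross-checking the "about 120 homes for a year" comparison for GPT-3 training.
TRAINING_MWH = 1_287                  # training estimate from the 2021 paper
AVG_US_HOME_KWH_PER_YEAR = 10_700     # assumed average, roughly the EIA figure

homes_powered_for_a_year = TRAINING_MWH * 1_000 / AVG_US_HOME_KWH_PER_YEAR
print(round(homes_powered_for_a_year))  # ≈ 120
```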
The trend towards larger models makes this worse. Although training is fundamentally just computing, a generative AI training cluster might consume seven or eight times more energy than a typical computing workload, according to MIT research scientist Noman Bashir. As models become more sophisticated, their energy requirements grow with them.
The water crisis
Energy consumption isn’t the only environmental concern. AI systems require massive amounts of water for cooling the servers that power them. In a 2025 research paper, researchers projected that AI would withdraw between 4.2 and 6.6 billion cubic metres of water in 2027, more than half of the total water withdrawal of the United Kingdom.
The scale of individual projects is eye-watering. One data centre that Microsoft had considered building near Phoenix to meet rising AI demand was likely to consume up to 56 million gallons of fresh water each year, equivalent to the water footprints of 670 families. This comes at a time when many regions are experiencing water stress and drought conditions.
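The “670 families” comparison implies a particular per-family water footprint, which is easy to recover. A quick check, where the 365-day year is the only added assumption:

```python
# Per-family water use implied by the Phoenix data centre comparison.
GALLONS_PER_YEAR = 56_000_000   # projected annual fresh-water consumption
FAMILIES = 670                  # equivalent family water footprints

per_family_per_day = GALLONS_PER_YEAR / FAMILIES / 365
print(f"about {per_family_per_day:.0f} gallons per family per day")
```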
Corporate emissions are soaring
The companies building AI systems are seeing their environmental commitments tested. Microsoft said its emissions had grown by 29% since 2020 due to the construction of more data centres that are “designed and optimised to support AI workloads”. Similarly, Google’s 2023 GHG emissions were almost 50% higher than in 2019, largely due to the energy demand tied to data centres.
These increases come despite both companies having made ambitious climate commitments. The pattern suggests that the rapid deployment of AI is outpacing efforts to decarbonise the energy systems that power it.
But wait, there’s a counterargument
Before concluding that AI is an environmental disaster, it’s worth considering some surprising research findings. A study published in Nature found that AI systems emit between 130 and 1500 times less CO2e per page of text generated compared to human writers, while AI illustration systems emit between 310 and 2900 times less CO2e per image than their human counterparts.
This suggests that for specific tasks, AI might actually be more environmentally efficient than human alternatives. The calculation includes the full lifecycle of human activity – the energy used to power homes and transport people, not just the direct work output.
However, this comparison has significant limitations. It doesn’t account for the fact that AI might increase overall consumption rather than simply replacing human activity. If AI makes content creation so cheap and easy that we produce vastly more of it, the net environmental impact could still be negative.
The bigger picture
Looking at global emissions, AI’s current contribution remains relatively small. Recent IEA research states that data centres are among the fastest-growing sources of emissions globally, but also that these emissions will remain below 1.5% of the total for the energy sector between now and 2035.
However, this is changing rapidly. According to new projections published by Lawrence Berkeley National Laboratory in December, by 2028, more than half of the electricity going to data centres will be used for AI. At that point, AI alone could consume as much electricity annually as 22% of all US households.
The International Energy Agency suggests that the widespread adoption of existing AI applications could lead to emissions reductions that are far larger than emissions from data centres, but also far smaller than what is needed to address climate change. In other words, AI might help with climate solutions, but it’s no silver bullet.
What can be done
There are several approaches to reducing AI’s environmental impact:
- Technical solutions: Researchers are working on more efficient algorithms, better hardware, and optimisation techniques. Taking some simple steps can make a significant dent in AI data centre emissions, potentially shaving 10% to 20% off global data centre electricity demand.
- Location matters: Running AI systems in regions with cleaner electricity grids can significantly reduce their carbon footprint. The carbon footprint of AI in places where the power grid is relatively clean, such as France, will be much lower than it is in places with a grid that is heavily reliant on fossil fuels, such as some parts of the US.
- Transparency: Currently, when you query most AI models, whether on your phone within an app like Instagram or on the web interface for ChatGPT, much of what happens after your question is routed to a data centre remains a secret. Greater transparency about energy consumption could help users make more informed choices.
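The “location matters” point can be made concrete with a rough calculation. The grid carbon intensities below are assumed round numbers in grams of CO2e per kWh, not figures from the article; the training energy is the GPT-3 estimate quoted earlier:

```python
# Same training run, different grids: a rough carbon-footprint comparison.
# Intensities are illustrative assumptions in gCO2e/kWh, not article figures.
GRID_INTENSITY_G_PER_KWH = {
    "low-carbon grid (e.g. France)": 55,
    "US average": 370,
    "coal-heavy grid": 800,
}
TRAINING_MWH = 1_287  # GPT-3 training estimate from the 2021 paper

footprints = {
    grid: TRAINING_MWH * 1_000 * g / 1e6  # kWh * g/kWh -> tonnes CO2e
    for grid, g in GRID_INTENSITY_G_PER_KWH.items()
}
for grid, tonnes in footprints.items():
    print(f"{grid}: {tonnes:,.0f} tonnes CO2e")
```

Under these assumed intensities, the same workload emits more than an order of magnitude less carbon on a clean grid than on a coal-heavy one, which is why siting decisions matter so much.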
So, how bad is AI really for the environment?
The answer is nuanced. AI’s current environmental impact is significant and growing rapidly, but it’s not yet catastrophic on a global scale. The real concern lies in the trajectory: we’re in the early stages of what could be an exponential increase in AI usage.
The key issue isn’t whether AI has an environmental cost – it clearly does. The question is whether society will implement the necessary measures to manage that cost while the technology is still in its relative infancy. This includes improving efficiency, transitioning to cleaner energy sources, and being more thoughtful about which applications truly benefit from AI’s capabilities.
If you look at the history of computational advances, I think we’re in the “amazed by what we can do, this is great, let’s do it” phase, as one Columbia University researcher put it. The challenge is moving beyond that phase to one where environmental considerations are built into every decision about AI development and deployment.
The answer to “how bad is AI for the environment?” ultimately depends on the choices we make now. With proper planning, regulation, and investment in clean energy, AI’s environmental impact could be managed. Without these measures, we risk creating a technology that undermines our climate goals at a time when we need to reduce emissions most rapidly.