In recent years, generative AI has made massive leaps in capability, utility, and everyday use. In fact, AI is being adopted more rapidly than earlier technologies such as the PC or the internet. ChatGPT alone has 300 million weekly users, whose daily usage reportedly consumes as much energy as it takes to power 35,000 US homes for a year. This must make you wonder: what is generative AI’s impact on the environment?
The electricity demanded just to train generative AI drives up carbon dioxide emissions and puts pressure on the electric grid. And training is only the start: once these models are deployed for public use and real-world applications, millions of people use them every day, and that everyday use, along with continued fine-tuning, keeps consuming colossal amounts of electricity long after a model is first trained.
Even if you set aside AI’s electricity demands, the amount of water needed to cool the hardware used for training, deploying, and fine-tuning AI is staggering. Google’s single Iowa data centre, for example, consumed 1 billion gallons of water in 2024 alone, while major tech companies collectively used 580 billion gallons of water for AI operations in 2022.
Some people assume that AI’s environmental impact is just the electricity needed to run your computer, but there is much more to it than that. It spans everything from training the models, to the water used to cool the hardware, to the impacts of manufacturing and transporting the high-performance computing hardware needed to run the AI in the first place.
In fact, generative AI can consume seven or eight times more energy than a typical computing workload.
Training OpenAI’s GPT-4 reportedly consumed over 50 gigawatt-hours of electricity, enough to power San Francisco for three whole days, making it the most energy-intensive AI model trained to date. And training is only part of the picture: inference, the everyday running of these models, now accounts for an estimated 60-70% of AI’s total energy consumption.
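To put those 50 gigawatt-hours in perspective, here is a minimal back-of-envelope sketch in Python. It takes the figures above at face value and assumes an average US home uses roughly 10,500 kWh of electricity per year; that household figure is an assumption for illustration (broadly in line with published EIA averages), not a number from this article.

```python
# Back-of-envelope check on the GPT-4 training figure quoted above.
# ASSUMPTION (not from the article): an average US home uses roughly
# 10,500 kWh of electricity per year, broadly in line with EIA averages.

TRAINING_ENERGY_GWH = 50          # reported GPT-4 training consumption
AVG_HOME_KWH_PER_YEAR = 10_500    # assumed average US household usage

training_energy_kwh = TRAINING_ENERGY_GWH * 1_000_000  # 1 GWh = 1,000,000 kWh

# How many homes could run for a full year on that much electricity?
home_years = training_energy_kwh / AVG_HOME_KWH_PER_YEAR

# What daily demand for San Francisco does the "three days" comparison imply?
implied_sf_daily_gwh = TRAINING_ENERGY_GWH / 3

print(f"~{home_years:,.0f} home-years of electricity")                       # ~4,762
print(f"implied San Francisco demand: ~{implied_sf_daily_gwh:.1f} GWh/day")  # ~16.7
```

On those assumptions, a single training run works out to nearly 5,000 home-years of electricity before the model answers its first query.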
As for the carbon footprint, according to allaboutai.com’s analysis, “AI data centers now generate 2.5–3.7% of global greenhouse gas emissions, officially surpassing the aviation industry’s 2% contribution while growing 15% annually.” The same analysis states that in the 12 months leading up to August 2025, AI data centers generated 105 million metric tons of carbon dioxide.
Looking at the projections for AI’s environmental impact doesn’t make one feel any better. Global AI electricity consumption is expected to more than double by 2030, reaching around 3% of total global electricity demand, and to keep growing from there. And as AI workloads demand ever more cooling, US data centres are projected to roughly quadruple their water consumption by 2028, to a whopping 68 billion gallons, up from the 17 billion gallons used in 2023.
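As a rough illustration of what those projections imply, the short sketch below converts them into approximate annual growth rates. The five-year window for water (2023 to 2028) comes from the figures above; the six-year window for electricity is an assumption, since no baseline year is given for the “more than double by 2030” projection.

```python
# Rough sketch of what the projections above imply, taking the quoted
# figures at face value. The six-year window for electricity is an
# ASSUMPTION, since the article gives no baseline year for "by 2030".

water_2023_bgal = 17   # US data centre water use, billions of gallons (2023)
water_2028_bgal = 68   # projected US data centre water use (2028)

growth_factor = water_2028_bgal / water_2023_bgal        # 4.0x over five years
annual_water_growth = growth_factor ** (1 / 5) - 1       # implied yearly growth

# Electricity "more than doubling by 2030": implied yearly growth if that
# doubling happens over roughly six years.
annual_electricity_growth = 2 ** (1 / 6) - 1

print(f"water: {growth_factor:.1f}x overall, ~{annual_water_growth:.0%} per year")
print(f"electricity: ~{annual_electricity_growth:.0%} per year to double by 2030")
```

In other words, the water projection implies consumption growing by roughly a third every year, on top of electricity demand compounding by double digits.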
To conclude, while AI is undoubtedly a useful tool, you can’t look past the immense impact it has on the environment. In our current age, it would be foolish to dismiss AI, which is clearly a tool of the future, but it isn’t unreasonable to demand that the companies developing it take steps to consider and preserve our environment.
I hope that anyone reading this will keep these impacts in mind the next time they open ChatGPT or any other AI tool, and think carefully about whether what they are using it for is truly worth it.