The 'real cost' of AI's energy footprint revealed - and it's having a shocking impact on the planet

All that clever tech comes at a heavy cost for the environment

The growing carbon footprint of AI is starting to raise big questions about its environmental cost.

AI is undoubtedly integrated into our everyday lives. If it's not powering the latest iOS update or tailoring personal Netflix recommendations, it's writing job applications and predicting the future. For some, it's creating pictures and videos for content; for others, it's acting as a romantic companion.

Hundreds of millions of people are using chatbots and AI tools more than ever - but at what cost?

We’ve already heard from OpenAI’s Sam Altman that even being polite with ChatGPT prompts costs the company 'tens of millions of dollars'. Meanwhile, Professor Sajjad Moazeni warned that a ChatGPT enquiry is 'probably ten to 100 times more power hungry' than sending an email.

Now, a new deep-dive analysis by MIT Technology Review has provided one of the most comprehensive views on how much energy AI really uses.

ChatGPT receives one billion messages a day (NurPhoto/Contributor/Getty Images)

The team spoke to two dozen experts measuring AI’s energy demands, reviewed hundreds of pages of technical documents and forecasts, and questioned big-time AI companies about how their models are built and run. It seems the common perception of AI’s energy use - at least the way big tech is selling it - is way off.

The report argued that 'it’s likely that our AI footprint today is the smallest it will ever be'.

If we look at the numbers, projections from the Lawrence Berkeley National Laboratory suggest that more than half of the electricity going to data centres will be used for AI by 2028.

More specifically, AI could use between 165 and 326 terawatt-hours of electricity annually by 2028. That’s more than all electricity currently used by US data centres for all purposes and enough energy to run 22% of all US households, as per the MIT report.
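For readers who want to check the household comparison themselves, here is a rough back-of-the-envelope calculation in Python. The average household consumption and the household count are assumptions based on typical EIA estimates, not figures taken from the MIT report.

```python
# Rough sanity check of the household comparison - a sketch, not official numbers.
# Assumptions (not from the article): an average US household uses roughly
# 10,500 kWh of electricity per year, and there are about 131 million US households.

AI_DEMAND_TWH_HIGH = 326                 # upper end of the 2028 projection cited in the article
KWH_PER_HOUSEHOLD_PER_YEAR = 10_500      # assumed average household consumption
US_HOUSEHOLDS = 131_000_000              # assumed number of US households

ai_demand_kwh = AI_DEMAND_TWH_HIGH * 1e9                        # 1 TWh = 1 billion kWh
households_powered = ai_demand_kwh / KWH_PER_HOUSEHOLD_PER_YEAR # homes that demand could supply
share = households_powered / US_HOUSEHOLDS                      # fraction of all US households

print(f"Households that 326 TWh could power: {households_powered:,.0f}")
print(f"Share of US households: {share:.0%}")  # roughly 24%, in line with the ~22% cited
```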

To put that into perspective, generating that much electricity could produce the same emissions as driving more than 1,600 round trips between Earth and the Sun.

'It’s likely that our AI footprint today is the smallest it will ever be' (Andriy Onufriyenko/Getty Images)

Chatbots like ChatGPT and Google's Gemini are a big part of this surge. Already, ChatGPT receives one billion messages a day and is now estimated to be the 'fifth-most visited website in the world, just after Instagram and ahead of X.'

When OpenAI launched its image generator, users were soon creating 78 million images a day, from Studio Ghibli-style portraits to predictions of what an AI god might look like.

“AI data centres need constant power, 24-7, 365 days a year,” said Rahul Mewawalla, the CEO of Mawson Infrastructure Group.

Most of that power comes from fossil fuels, and despite the push for solar energy, electrical grids still rely heavily on them.

“It’s not clear to us that the benefits of these data centres outweigh these costs,” explained Eliza Martin, a legal fellow at the Environmental and Energy Law Program at Harvard and a coauthor of the research. “Why should we be paying for this infrastructure? Why should we be paying for their power bills?”

As AI adoption rises, the environmental impact is only becoming harder to ignore. Scientists have already warned that climate change is accelerating at a pace we may no longer be able to control. If the current trend continues, this hidden cost might end up doing more damage than anyone bargained for.

Featured Image Credit: Andriy Onufriyenko/Getty Images