An academic has crunched the numbers to show how the energy demands of artificial intelligence (AI) could be equivalent to those of an entire country.
The recent rise in the capabilities and applications of AI has opened up a number of debates about the technology’s possible drawbacks, from students cheating in exams to machines replacing humans at work, and even the risk of the total destruction of the human race.
Now another potential pitfall of AI could be on the horizon, with experts warning that the energy needed to power the algorithms and machine-learning processes behind the technology could contribute to climate change.
One academic, Alex de Vries, a PhD candidate at Vrije Universiteit Amsterdam, has suggested that if every Google search for a year used AI, it would consume as much electricity as a small country such as Ireland.
Writing a commentary in the journal Joule, a sister publication to Cell, he says: "Looking at the growing demand for AI service, it’s very likely that energy consumption related to AI will significantly increase in the coming years".
De Vries is the founder of Digiconomist, a website exposing the unintended consequences of digital trends.
AI has emerged as a growing and crucial digital trend in 2023, and he warns that when adopted more widely, it could have energy demands that exceed those of some countries.
Generative AI is being used by more of the general public each day, with chatbots like OpenAI’s ChatGPT and Midjourney’s image creation tool among the most popular. To create their outputs, the models they are built on require a process of machine learning, where they are fed vast amounts of data.
In his commentary, de Vries notes that Hugging Face, an AI company in the US, has said its multilingual text-generation AI used around 433 megawatt-hours (MWh) during its training - enough to power 40 average homes in the US for a year.
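As a rough consistency check of that comparison (the household figure below is an assumption based on approximate US averages, not a number from the commentary), the arithmetic can be sketched as:

```python
# Back-of-envelope check: 433 MWh of training energy vs. average US homes.
# Assumption (not from the article): a typical US household uses roughly
# 10,500 kWh of electricity per year, an approximate national average.
training_energy_mwh = 433      # reported training energy, in MWh
home_annual_kwh = 10_500       # assumed annual household consumption, in kWh

homes_powered_for_a_year = training_energy_mwh * 1_000 / home_annual_kwh
print(f"{homes_powered_for_a_year:.0f} homes")  # prints "41 homes"
```

The result lands close to the article's figure of 40 homes, suggesting the comparison rests on a household figure of around 10,000-11,000 kWh per year.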
It’s not just the training that uses a lot of energy either.
Efficient AI means more demand
De Vries’s analysis shows that when a tool like ChatGPT outputs text based on prompts, it uses "a significant amount of computing power and thus energy".
He says running ChatGPT could consume 564 MWh of electricity a day.
While developers are working to make their AI tools more efficient, de Vries says this can result in a phenomenon known as Jevons’ Paradox.
"The result of making these tools more efficient and accessible can be that we just allow more applications of it and more people to use it," he said.
De Vries estimates that, based on the available data on power consumption and AI, if Google were to use AI for its roughly 9 billion daily searches, it would need 29.2 terawatt-hours (TWh) of power each year - the equivalent of the annual electricity consumption of Ireland.
"The potential growth highlights that we need to be very mindful about what we use AI for," said de Vries. "It’s energy intensive, so we don't want to put it in all kinds of things where we don’t actually need it".
While the scenario above is not likely to happen in the short term, capacity to process the demand for AI is set to increase. De Vries estimates that by 2027, worldwide AI-related electricity consumption could increase by 85 to 134 TWh annually, based on the projection of AI server production.
This would be comparable to the annual electricity needs of countries like the Netherlands, Argentina and Sweden.
De Vries has previously covered the energy costs of another buzz-worthy technology, with cryptocurrency transactions consuming huge amounts of electricity.
Some estimates have suggested Bitcoin, the most popular cryptocurrency, could emit the same amount of carbon dioxide as the entirety of New Zealand each year.