The emerging artificial intelligence (AI) industry will have a major effect on energy use. Since 2012, the computing power required to train advanced AI models has doubled roughly every four months.
There is a tug-of-war between the benefits of using AI to reduce energy consumption by increasing efficiency and the cost of the extra computing power needed to do so. Smart grids, for example, can use AI algorithms to analyze real-time data on electricity supply and demand, enabling utilities to balance loads more effectively.
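To make the idea concrete, here is a minimal, purely illustrative sketch in Python of the kind of forecast-and-dispatch loop such a system might run. The function names, the 15-minute intervals, and all the megawatt figures are invented for the example and are not drawn from any real utility system.

```python
# Illustrative sketch (not any utility's actual system): a toy loop that
# forecasts the next interval's demand from recent readings and decides how
# much dispatchable generation to schedule alongside renewables.

from statistics import mean

def forecast_next_demand(recent_demand_mw, window=4):
    """Naive forecast: the average of the last `window` readings."""
    return mean(recent_demand_mw[-window:])

def schedule_generation(forecast_mw, renewable_forecast_mw, reserve_margin=0.1):
    """Dispatchable generation needed to cover forecast demand plus a reserve."""
    shortfall = forecast_mw * (1 + reserve_margin) - renewable_forecast_mw
    return max(shortfall, 0.0)

# Hypothetical demand readings (MW) for successive 15-minute intervals.
demand_history = [320, 340, 365, 390, 410, 405]
renewables_next = 150  # assumed wind + solar forecast for the next interval

demand_next = forecast_next_demand(demand_history)
dispatch = schedule_generation(demand_next, renewables_next)
print(f"Forecast demand: {demand_next:.0f} MW, dispatch needed: {dispatch:.0f} MW")
```

Real systems replace the naive moving average with machine-learning forecasts trained on weather, pricing and historical load data, but the balancing logic follows the same shape.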
According to recent research by the management consultants McKinsey, AI has the potential to deliver energy savings of up to 20% in buildings, 15% in transport systems, and perhaps 20% in utility networks. Additionally, AI-driven technology might enable businesses to reduce CO2 emissions by up to 10% and drive down energy costs by 10-20%, which would be a big step toward reducing carbon emissions on the pathway to net zero.
Unfortunately, some experts now think that the computing power needed to run these AI algorithms and machine learning processes could greatly increase electricity demand, negating any savings. Dutch academic Alex de Vries of Vrije Universiteit Amsterdam in the Netherlands estimates that the energy demands could already be equivalent to those of an entire country such as Ireland. As AI becomes more prevalent in mundane tasks, such as asking Google to find the best price on a new washing machine, consumption will expand.
De Vries’s analysis shows that when an AI tool like ChatGPT generates text from prompts, it uses a “significant amount of computing power and thus energy”. He says that ChatGPT could consume 564 MWh of electricity a day to run.
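Taking that figure at face value, a quick back-of-envelope calculation shows the scale involved. The 564 MWh/day number is de Vries’s; the household figure used for comparison is an assumption of roughly typical annual consumption, included only for illustration.

```python
# Back-of-envelope check on the scale of the 564 MWh/day figure.
# The household figure below is an assumption (roughly a typical household's
# annual use), not a number from de Vries's analysis.

chatgpt_daily_mwh = 564          # figure cited in the article
days_per_year = 365
household_annual_mwh = 10.5      # assumed average annual household consumption

annual_gwh = chatgpt_daily_mwh * days_per_year / 1000
households_equivalent = chatgpt_daily_mwh * days_per_year / household_annual_mwh

print(f"Annualised consumption: ~{annual_gwh:.0f} GWh/year")
print(f"Equivalent to roughly {households_equivalent:,.0f} households")
```

With those assumptions the daily figure works out to around 200 GWh a year, on the order of tens of thousands of households.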
Computing systems designers are working to make their AI software more efficient, but they could fall foul of “Jevons’ Paradox”. This is a phenomenon described by the English economist William Stanley Jevons in 1865. Simply stated, it means that gains in energy efficiency tend to increase total energy use. During the Industrial Revolution, Jevons noticed that as primitive coal-fired steam engines were improved, their greater efficiency spurred new applications, and total coal consumption rose despite the efficiency gains. As AI systems become more widespread, they will need more energy in spite of efficiency gains.
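The rebound effect at the heart of Jevons’ Paradox is easy to show with toy numbers; everything below is invented purely to illustrate the mechanism, not measured data.

```python
# Toy illustration of the rebound effect behind Jevons' Paradox.
# All figures are made up for illustration.

energy_per_task_before = 1.0   # arbitrary energy units per AI task
tasks_before = 100

# Efficiency doubles, but cheaper computation spurs much wider adoption.
energy_per_task_after = energy_per_task_before / 2
tasks_after = tasks_before * 5

total_before = energy_per_task_before * tasks_before
total_after = energy_per_task_after * tasks_after

print(f"Total energy before: {total_before:.0f} units")
print(f"Total energy after:  {total_after:.0f} units")
# Despite a 2x efficiency gain, total energy use rises 2.5x.
```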
Utilities may need to plan for more capacity to stay ahead of the new demand coming from AI systems. De Vries estimates that by 2027 worldwide AI-related electricity consumption could grow to between 85 and 134 TWh annually, based on forecasts of AI server production. This would be comparable to the annual demand of a country such as the Netherlands, Sweden or Argentina.
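A projection like this is essentially a multiplication of expected hardware deployments by their power draw. The sketch below shows the shape of such an estimate; the server count and per-server power figure are hypothetical assumptions chosen for illustration, not de Vries’s actual inputs.

```python
# Shape of a server-based electricity projection. The inputs below are
# hypothetical assumptions for illustration only.

ai_servers = 1_500_000        # assumed installed AI servers by 2027
power_per_server_kw = 10.0    # assumed average draw, including cooling overhead
hours_per_year = 8760

annual_twh = ai_servers * power_per_server_kw * hours_per_year / 1e9  # kWh -> TWh
print(f"Estimated AI electricity demand: ~{annual_twh:.0f} TWh/year")
```

With these made-up inputs the result happens to land near the top of the cited range, around 130 TWh a year.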
He says, “If we decide we’re going to do everything on AI, then every data center is going to experience effectively a ten-fold increase in energy consumption. That would be a massive explosion in global electricity consumption because data centers – not including cryptocurrency mining – are already responsible for consuming about one percent of electricity.”
This is perhaps a pessimistic analysis. Data centers are becoming more efficient all the time, often using microgrids to supply their own electricity and putting their waste heat to use, for example to heat greenhouses. AI could well contribute to energy savings in power and industrial systems that offset the energy it consumes. Many areas of utility operations could potentially be enhanced by AI: smart scheduling of crews and repairs, improved customer satisfaction, predictive rather than planned maintenance, analysis of foliage growth so it can be cut back before it causes outages or wildfires, and optimization of supply and demand across the network.