Why AI and saving energy might be opposing forces

The increasing energy consumption of data centres could be a significant problem, according to a BBC report on the electricity consumption of AI. The report notes that, according to a recent research study from Cornell University, a generative AI system might use around 33 times more energy than machines running software that performs a single, specific task. This increased energy usage could hamper some organisations' efforts to reduce their IT carbon footprint.


► Gen AI systems might use around 33 times more energy than single-task systems

► Electricity consumption of data centres is forecast to double by 2026


The report notes that whenever a query is submitted, the large language model (LLM) handling it activates the entire AI infrastructure within a data centre as it searches through the vast repository of data on which it was trained.

According to the report, data centres are now using massive amounts of electricity. Figures from the International Energy Agency (IEA) estimate that they used 460 terawatt hours in 2022 and that this will double by 2026 to reach almost 1,000 terawatt hours. That is about the same amount of electricity used by Japan, which has a population of 125 million.

The issue has already prompted some countries to take action on the building of new data centres – as we reported on Newsflash.

The story underlines the need for organisations to consider carefully how they use and deploy AI alongside their other IT and business objectives. TD SYNNEX has experts in both these areas. If you'd like to discuss your customers' needs and how they can strike the right balance between embracing new technologies to deliver positive business outcomes and becoming more carbon efficient, please use the link below to contact our teams.

You can find the original BBC report here, and information on the university study here.