New technology could reduce AI energy demands by 1,000 times

Artificial Intelligence Trending News Published 14th August 2024

Artificial intelligence (AI) is becoming increasingly prevalent across many areas of business and everyday life.

AI offers many potential benefits, from digital assistants to medical research and data-driven decision-making, but one major concern is the energy cost of training and operating powerful AI systems.


Recent Goldman Sachs research predicted that data centre power demand will nearly triple by the end of the decade, due largely to AI requirements.

Now, though, a team of researchers has developed a new technology that could reduce the energy demands of AI processing by 1,000 times or more.

The new technique, known as computational random-access memory (CRAM), is detailed in the peer-reviewed journal npj Unconventional Computing.

Essentially, the researchers have created a new computing paradigm that short-circuits the conventional pipeline of data movement and computation involved in large-scale AI processing.

Much of the power used in conventional AI processing goes into constant data transfers between logic and memory modules – in other words, the components that actually process the data and those that store it.

According to the researchers, this constant back-and-forth can consume 200 times more energy than the computation itself.
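As a rough illustration of why that ratio matters, the sketch below compares the two energy budgets. The absolute per-operation figures are invented for scale; only the 200:1 movement-to-compute ratio comes from the researchers' claim.

```python
# Illustrative energy-budget sketch for a conventional (von Neumann) AI
# workload versus an in-memory one. Absolute numbers are assumptions;
# only the 200:1 transfer-to-compute ratio is taken from the article.

COMPUTE_ENERGY_PJ = 1.0     # energy per logic operation (assumed unit)
TRANSFER_ENERGY_PJ = 200.0  # energy to shuttle operands between memory and logic

def conventional_energy_pj(num_ops: int) -> float:
    """Energy when every operation requires a memory round trip."""
    return num_ops * (COMPUTE_ENERGY_PJ + TRANSFER_ENERGY_PJ)

def in_memory_energy_pj(num_ops: int) -> float:
    """Energy if the same logic runs inside the memory array (no transfers)."""
    return num_ops * COMPUTE_ENERGY_PJ

ops = 1_000_000
ratio = conventional_energy_pj(ops) / in_memory_energy_pj(ops)
share = TRANSFER_ENERGY_PJ / (TRANSFER_ENERGY_PJ + COMPUTE_ENERGY_PJ)
print(f"data movement share of budget: {share:.1%}")  # 99.5%
print(f"conventional vs in-memory energy: {ratio:.0f}x")  # 201x
```

Under these toy numbers, eliminating the transfers alone accounts for roughly a 200-fold saving, which is the headroom the CRAM approach is targeting.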

New technique performs logic processes in the memory modules

By contrast, the new CRAM system performs logic operations using the memory cells themselves, without the data ever having to leave the memory.

This is achieved with a high-density, reconfigurable spintronic in-memory compute substrate built directly into the memory array.

The researchers report that CRAM operates in a fully digital fashion, unlike most other reported in-memory computing schemes, which tend to be mostly or partially analogue.
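To make "digital logic inside the memory array" concrete, here is a toy software model. This is my own simplification, not the paper's magnetic-tunnel-junction design: each memory row stores bits, and a logic operation is evaluated column-wise across rows, writing its result into another row so the operands never leave the array. Spintronic in-memory schemes are often built from majority gates, so the sketch uses a 3-input majority function.

```python
# Toy model of fully digital in-memory computing: logic is evaluated
# between memory rows and the result is written back into the array.
# A conceptual sketch only, not the paper's spintronic implementation.

class ToyCRAM:
    def __init__(self, rows: int, cols: int):
        # The "memory array": rows x cols grid of bits.
        self.array = [[0] * cols for _ in range(rows)]

    def write_row(self, r: int, bits: list[int]) -> None:
        self.array[r] = list(bits)

    def majority3(self, a: int, b: int, c: int, out: int) -> None:
        """Column-wise 3-input majority of rows a, b, c, stored into row out.
        The data is processed in place; nothing is copied out of the array."""
        self.array[out] = [
            1 if (x + y + z) >= 2 else 0
            for x, y, z in zip(self.array[a], self.array[b], self.array[c])
        ]

mem = ToyCRAM(rows=4, cols=8)
mem.write_row(0, [1, 1, 0, 0, 1, 0, 1, 0])
mem.write_row(1, [1, 0, 1, 0, 1, 1, 0, 0])
mem.write_row(2, [0, 1, 1, 0, 0, 1, 1, 0])
mem.majority3(0, 1, 2, out=3)
print(mem.array[3])  # [1, 1, 1, 0, 1, 1, 1, 0]
```

In real CRAM hardware the equivalent of `majority3` is carried out by the physics of the memory cells themselves, which is where the energy saving comes from.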

They said that using CRAM provides an energy improvement of around 1,000 times compared to existing systems running AI computing applications.

In some cases, the energy saving could be even greater. One test of a system used to train AI to recognise handwriting achieved more than 2,500 times the energy efficiency and was 1,700 times as fast as an existing near-memory processing system.

The research team has already applied for a number of patents based on the new technology and now plans to work with partners in the semiconductor industry to bring CRAM to real-world applications.

This will involve further larger-scale demonstrations and the development of new hardware that can reap the benefits of the energy-saving technique.

Today’s news was brought to you by TD SYNNEX – the UK’s number one solutions distributor.