AI system recognises gestures from electrical signals in the arm

Author: TD SYNNEX Newsflash Published: 30th December 2020

Reliable gesture recognition could completely change the way that we interact with our technology.

You could navigate screens without having to touch them, control smart devices without having to speak, and even drive a car without having to use a physical steering wheel.


Most systems under development use a combination of cameras and electronic gloves, but the former rely on a good line of sight, while the latter can be bulky and restrictive.

Now, a team of researchers has developed a system that reads the electrical impulses the brain sends to the arm muscles, effectively recognising the gesture you are going to make milliseconds before you make it.

Because the system reads these nerve signals directly, it could potentially also be used to control prosthetics for people who have lost a limb, as well as to act as a remote interface for various electronic devices.

The team, from the University of California, Berkeley, developed the system by combining wearable biosensors that detect the electrical signals with artificial intelligence to interpret the results.

The sensor takes the form of a flexible, relatively lightweight armband worn on the forearm, which picks up signals at 64 different points.
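The article doesn't describe the signal processing, but a common first step with multi-channel EMG-style data is to slice the stream into short windows and compute a simple per-channel feature. The sample rate, window length, and mean-absolute-value feature below are illustrative assumptions, not details from the Berkeley system:

```python
import numpy as np

N_CHANNELS = 64     # one channel per electrode on the armband
SAMPLE_RATE = 1000  # Hz, assumed for illustration
WINDOW_MS = 50      # assumed window length

def windowed_features(signal):
    """Split a (samples, 64) recording into fixed windows and take the
    mean absolute value per channel, a common EMG feature.

    Returns an array of shape (n_windows, 64); any trailing samples
    that do not fill a whole window are dropped.
    """
    win = SAMPLE_RATE * WINDOW_MS // 1000
    n = signal.shape[0] // win
    windows = signal[: n * win].reshape(n, win, N_CHANNELS)
    return np.abs(windows).mean(axis=1)
```

Each row of the result is a 64-value snapshot of muscle activity that a downstream classifier could consume.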

Sensor picks up signals that are sent from the brain

When a person wants to move, their brain sends an electrical signal with instructions for the relevant muscles to contract.

These signals are sent via neurons in the neck and shoulders to the muscle fibres – in this case located in the hands, wrists and arms.

Ali Moin of UC Berkeley explained: “Essentially, what the electrodes in the cuff are sensing is this electrical field.

“It's not that precise, in the sense that we can’t pinpoint which exact fibres were triggered, but with the high density of electrodes, it can still learn to recognize certain patterns.”

This is where the AI comes in: the algorithm learns to associate these electrical patterns with a set of pre-defined hand gestures.

The AI used was an advanced type known as a hyperdimensional computing algorithm, which is able to update itself as it gains new information.

In laboratory tests, the team was able to teach the AI to recognise 21 distinct hand gestures, including the likes of a thumbs-up, a flat hand, a fist, and the holding up of individual fingers.
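The team's exact model isn't reproduced here, but the general hyperdimensional-computing idea can be sketched: each gesture class is represented by a very high-dimensional "prototype" hypervector built by bundling encoded training samples, and new samples are classified by similarity to those prototypes, with prototypes cheaply updated as new data arrives. The encoding scheme, dimensionality, and gesture labels below are illustrative assumptions, not the Berkeley team's code:

```python
import numpy as np

D = 10_000        # hypervector dimensionality (typical for HD computing)
N_CHANNELS = 64   # one random bipolar hypervector per electrode channel
rng = np.random.default_rng(0)
channel_hvs = rng.choice([-1, 1], size=(N_CHANNELS, D))

def encode(sample):
    """Encode a 64-value sample into one bipolar hypervector.

    Each channel contributes its random hypervector, sign-flipped when
    the channel reading is negative; the contributions are bundled
    (summed) and binarised back to {-1, +1}.
    """
    signs = np.where(sample[:, None] >= 0, 1, -1)
    bundled = (signs * channel_hvs).sum(axis=0)
    return np.where(bundled >= 0, 1, -1)

class HDClassifier:
    def __init__(self):
        self.prototypes = {}  # gesture label -> accumulated hypervector

    def train(self, sample, label):
        # Online update: bundle the new sample into the class prototype.
        hv = encode(sample)
        if label not in self.prototypes:
            self.prototypes[label] = np.zeros(D)
        self.prototypes[label] += hv

    def predict(self, sample):
        # Classify by cosine similarity to each class prototype.
        hv = encode(sample)
        return max(
            self.prototypes,
            key=lambda label: np.dot(self.prototypes[label], hv)
            / (np.linalg.norm(self.prototypes[label]) + 1e-9),
        )
```

Because a prototype is just a running sum, adding a training sample is a single vector addition, which is what makes this style of model easy to update on-device as it gains new information.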

All the computing occurs locally on a chip, meaning that biometric information can be kept private.

Today’s news was brought to you by TD SYNNEX – the UK’s number one solutions distributor.
