Technology

Artificial intelligence to detect hand gestures electronically

A new device developed by engineers at the University of California, Berkeley, recognizes hand gestures through electrical signals from the forearm.

In a paper published in the journal Nature Electronics, scientists describe their latest innovation, which couples wearable biosensors with artificial intelligence (AI), with the goal of controlling prosthetics or interacting with almost any type of electronic device. Ali Moin, who helped design the device as a doctoral student in UC Berkeley’s Department of Electrical Engineering and Computer Sciences, said, “Prosthetics are one important application of this technology, but besides that, it also offers a very intuitive way of communicating with computers. Reading hand gestures is one way of improving human-computer interaction. This is a good solution that also maintains an individual’s privacy.”

To build the hand gesture recognition system, the team collaborated with the lab of Ana Arias, a professor of electrical engineering at UC Berkeley, to design a flexible armband that can read the electrical signals at 64 different points on the forearm. These signals are then fed into an electronic chip programmed with an AI algorithm capable of associating the signal patterns in the forearm with specific hand gestures.
The team succeeded in teaching the algorithm to recognize 21 individual hand gestures, including a thumbs-up, a fist, a flat hand, holding up individual fingers and counting numbers.
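As a rough illustration of that signal-to-gesture pipeline, the sketch below reduces a window of 64-channel forearm signals to one feature per channel and matches it against stored per-gesture patterns. The channel and gesture counts come from the article; the window length, feature choice, and nearest-pattern matcher are hypothetical stand-ins for the device’s actual processing.

```python
import numpy as np

N_CHANNELS = 64   # electrode sites on the forearm armband (from the article)
N_GESTURES = 21   # gestures the algorithm was taught to recognize

def extract_features(window: np.ndarray) -> np.ndarray:
    """Reduce a (samples, 64) signal window to one feature per channel.

    Mean absolute value is a common, cheap muscle-signal feature; the
    device's actual feature choice is not described in the article.
    """
    return np.mean(np.abs(window), axis=0)          # shape: (64,)

def classify(features: np.ndarray, prototypes: np.ndarray) -> int:
    """Return the index of the gesture whose stored pattern is closest.

    `prototypes` is a (21, 64) array of per-gesture reference patterns
    learned during calibration (a hypothetical nearest-pattern matcher).
    """
    distances = np.linalg.norm(prototypes - features, axis=1)
    return int(np.argmin(distances))

# Example: one 200-sample window of 64-channel signals -> gesture index
window = np.random.randn(200, N_CHANNELS)
prototypes = np.random.randn(N_GESTURES, N_CHANNELS)
print(classify(extract_features(window), prototypes))
```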

Moin said, “Essentially, what the electrodes in the cuff are sensing is this electrical field. It’s not that precise, in the sense that we can’t pinpoint which exact fibers were triggered, but with the high density of electrodes, it can still learn to recognize certain patterns.”


Like other AI software, the algorithm has to first ‘learn’ how electrical signals in the arm correspond with individual hand gestures. To do this, each user has to wear the cuff while making the hand gestures one by one.
However, the new device uses a type of advanced AI called a hyperdimensional computing algorithm, which is capable of updating itself with new information.
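A minimal sketch of how a hyperdimensional classifier of this kind can work is shown below: each calibration example (gathered while the user wears the cuff and makes a gesture, as described above) is encoded into a long ±1 hypervector, the examples for each gesture are bundled into a class prototype, and a new reading is labeled with the most similar prototype. The dimensionality, random-projection encoder, and similarity measure here are illustrative assumptions, not the exact scheme used in the paper.

```python
import numpy as np

D = 10_000          # hypervector dimensionality (illustrative choice)
N_CHANNELS = 64     # feature vector length, one feature per electrode
rng = np.random.default_rng(0)

# Fixed random projection used to encode a 64-value feature vector into a
# +/-1 hypervector (one common HD encoding; the paper's encoder may differ).
projection = rng.standard_normal((D, N_CHANNELS))

def encode(features: np.ndarray) -> np.ndarray:
    return np.sign(projection @ features)

def train(examples: dict[int, list[np.ndarray]]) -> dict[int, np.ndarray]:
    """Bundle (sum) the encoded examples of each gesture into one prototype."""
    return {g: np.sign(sum(encode(x) for x in xs)) for g, xs in examples.items()}

def predict(features: np.ndarray, prototypes: dict[int, np.ndarray]) -> int:
    hv = encode(features)
    # Pick the gesture whose prototype hypervector is most similar.
    return max(prototypes, key=lambda g: float(hv @ prototypes[g]))

# Example: calibrate from two examples of each of two gestures, then predict.
examples = {0: [np.random.rand(N_CHANNELS) for _ in range(2)],
            1: [np.random.rand(N_CHANNELS) for _ in range(2)]}
prototypes = train(examples)
print(predict(np.random.rand(N_CHANNELS), prototypes))
```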

“In gesture recognition, your signals are going to change over time, and that can affect the performance of your model, hence we improved the classification accuracy by updating the model on the device,” Moin said.
Another advantage of the new device is that all of the computing occurs locally on the chip: No personal data are transmitted to a nearby computer or device.
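One way such on-device updating can look in code is sketched below: the chip keeps a running accumulator per gesture and folds newly confirmed examples into it, so the stored pattern tracks slow changes in the signals without any raw data leaving the device. The accumulate-and-rebinarize update rule is an assumption for illustration; the article does not detail the device’s exact adaptation method.

```python
import numpy as np

def update_prototype(accumulator: np.ndarray, new_hv: np.ndarray) -> np.ndarray:
    """Fold a freshly encoded gesture example into its class accumulator.

    `accumulator` holds the running (integer) sum of all hypervectors seen
    for one gesture; re-binarizing it yields the updated +/-1 prototype.
    This is an illustrative update rule, not necessarily the device's own.
    """
    accumulator += new_hv            # adapt in place as signals drift over time
    return np.sign(accumulator)      # updated prototype used for matching
```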

Jan Rabaey, the Donald O. Pedersen Distinguished Professor of Electrical Engineering at UC Berkeley and senior author of the paper said, “In our approach, we implemented a process where the learning is done on the device itself. And it is extremely quick: You only have to do it one time, and it starts doing the job. But if you do it more times, it can get better.”
While the device is not ready to be a commercial product yet, Rabaey said that it could likely get there with a few tweaks.

“Most of these technologies already exist elsewhere, but what’s unique about this device is that it integrates the biosensing, signal processing and interpretation, and artificial intelligence into one system that is relatively small and flexible and has a low power budget,” Rabaey said.


Digpu News Staff
