
University of Minnesota technology allows amputees to control a robotic arm with their mind

Research team makes mind-reading possible with electronics and AI

MINNEAPOLIS / ST. PAUL (05/17/2022)—University of Minnesota Twin Cities researchers have developed a more accurate, less invasive technology that allows amputees to move a robotic arm using their brain signals instead of their muscles. 

Many current commercial prosthetic limbs use a cable and harness system controlled by the shoulders or chest, while more advanced limbs use sensors to pick up subtle muscle movements in a patient’s existing limb above the device. But both options can be cumbersome and unintuitive, and can take months of practice for amputees to learn to use.

Researchers in the University’s Department of Biomedical Engineering, with the help of industry collaborators, have created a small, implantable device that attaches to the peripheral nerve in a person’s arm. When combined with an artificial intelligence computer and a robotic arm, the device can read and interpret brain signals, allowing upper limb amputees to control the arm using only their thoughts. 

The researchers’ most recent paper is published in the Journal of Neural Engineering, a peer-reviewed scientific journal for the interdisciplinary field of neural engineering. 

The University of Minnesota-led team’s technology allows research participant Cameron Slavens to move a robotic arm using only his thoughts. Video by Eve Daniels.

“It’s a lot more intuitive than any commercial system out there,” said Jules Anh Tuan Nguyen, a postdoctoral researcher and University of Minnesota Twin Cities biomedical engineering Ph.D. graduate. “With other commercial prosthetic systems, when amputees want to move a finger, they don’t actually think about moving a finger. They’re trying to activate the muscles in their arm, since that’s what the system reads. Because of that, these systems require a lot of learning and practice. For our technology, because we interpret the nerve signal directly, it knows the patient’s intention. If they want to move a finger, all they have to do is think about moving that finger.”

Nguyen has been working on this research for about 10 years with University of Minnesota Department of Biomedical Engineering Associate Professor Zhi Yang and was one of the key developers of the neural chip technology. 

The project began in 2012 when Edward Keefer, an industry neuroscientist and CEO of Nerves, Incorporated, approached Yang about creating a nerve implant that could benefit amputees. The pair received funding from the U.S. government’s Defense Advanced Research Projects Agency (DARPA) and have since conducted several successful clinical trials with human amputees.

The researchers also worked with the University of Minnesota Technology Commercialization office to form a startup called Fasikl—a play on the word “fascicle” which refers to a bundle of nerve fibers—to commercialize the technology.

“The fact that we can impact real people and one day improve the lives of human patients is really important,” Nguyen said. “It’s fun getting to develop new technologies, but if you’re just doing experiments in a lab, it doesn’t directly impact anyone. That’s why we want to be at the University of Minnesota, involving ourselves in clinical trials. For the past three or four years, I’ve had the privilege of working with several human patients. I can get really emotional when I can help them move their finger or help them do something that they didn’t think was possible before.”

A big part of what makes the system work so well compared to similar technologies is the incorporation of artificial intelligence, which uses machine learning to help interpret the signals from the nerve. 

“Artificial intelligence has the tremendous capability to help explain a lot of relationships,” Yang said. “This technology allows us to record human data, nerve data, accurately. With that kind of nerve data, the AI system can fill in the gaps and determine what’s going on. That’s a really big thing, to be able to combine this new chip technology with AI. It can help answer a lot of questions we couldn’t answer before.”
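To make the decoding step concrete: the published paper describes deep learning-based finger control, but this release does not spell out the model itself. The sketch below is only a rough, hypothetical illustration of the general idea, a small neural network that maps windows of multi-channel nerve recordings to per-finger movement commands. The channel count, window length, and architecture are assumptions for demonstration, not the team’s actual design.

```python
# Illustrative sketch only: a generic deep-learning decoder mapping windows of
# multi-channel peripheral-nerve recordings to per-finger movement commands.
# Channel count, window length, and architecture are assumptions, not the
# design reported in the Journal of Neural Engineering paper.
import torch
import torch.nn as nn

class NerveSignalDecoder(nn.Module):
    def __init__(self, n_channels: int = 16, window_len: int = 512, n_fingers: int = 5):
        super().__init__()
        # 1-D convolutions extract temporal features from each recording window
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=7, stride=2, padding=3),
            nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=5, stride=2, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        # Fully connected head predicts a continuous movement value per finger
        self.head = nn.Linear(64, n_fingers)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, n_channels, window_len) preprocessed nerve data
        return self.head(self.features(x).squeeze(-1))

# Example: decode one 16-channel window into five finger commands
decoder = NerveSignalDecoder()
window = torch.randn(1, 16, 512)      # stand-in for recorded nerve data
finger_commands = decoder(window)     # shape (1, 5)
print(finger_commands)
```

In a real system of this kind, such a decoder would be trained on recorded nerve data paired with intended finger movements, which is the role the AI plays in filling in the gaps Yang describes.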

When combined with an artificial intelligence computer and a robotic arm (pictured above), the University of Minnesota researchers' neural chip can read and interpret brain signals, allowing upper limb amputees to control the arm using only their thoughts. Photo credit: Neuroelectronics Lab, University of Minnesota

The technology benefits not only amputees but also other patients who suffer from neurological disorders and chronic pain. Yang sees a future where invasive brain surgeries will no longer be needed and brain signals can be accessed through the peripheral nerve instead.

Plus, the implantable chip has applications that go beyond medicine. 

Right now, the system requires wires that pass through the skin to connect to the external AI interface and robotic arm. But if the chip could connect remotely to any computer, it would give people the ability to control their personal devices, such as a car or phone, with their minds.

“Some of these things are actually happening. A lot of research is moving from what’s in the so-called ‘fantasy’ category into the scientific category,” Yang said. “This technology was designed for amputees for sure, but if you talk about its true potential, this could be applicable to all of us.”

In addition to Nguyen, Yang, and Keefer, other collaborators on this project include Associate Professor Catherine Qi Zhao and researcher Ming Jiang from the University of Minnesota Department of Computer Science and Engineering; Professor Jonathan Cheng from the University of Texas Southwestern Medical Center; and all group members of Yang’s Neuroelectronics Lab in the University of Minnesota’s Department of Biomedical Engineering.

Read the full paper, “A portable, self-contained neuroprosthetic hand with deep learning-based finger control,” on the IOP Science website.
