Neural engineering

Novel decoder helps people with paralysis click-and-drag a computer cursor using just their thoughts

08 Oct 2021 Arjan Singh Puniani 
Connect the dots: A digital painting created by a study participant with paralysis. The Pittsburgh-based team developed a new algorithm for deciphering brain signals in brain–computer interfaces, enabling the person to not just select and apply different colours (point-and-click), but also trace contours and erase numbers (click-and-drag). (Courtesy: J. Neural Eng. 10.1088/1741-2552/ac16b2)

Disability can affect anyone: directly, over the natural course of our lives, or indirectly, through knowing or caring for a person with a disability. It’s difficult to fathom how profoundly different (and challenging) daily life becomes after a stroke or spinal cord injury, or for those with cerebral palsy or amyotrophic lateral sclerosis (ALS).

Computers can significantly improve the quality of life for people living with motor impairments, for example by providing access to social media, games and messaging. But after a catastrophic injury or disease, actions that most of us take for granted, such as clicking or scrolling, become ferociously difficult for those with impaired hand function.

A brain–computer interface (BCI) is a promising and increasingly popular avenue for assisting and improving control following motor paralysis. People with paraplegia or quadriplegia, for example, have used BCIs to move computer cursors with their thoughts for decades; yet they have been unable to click-and-drag.

In a paper published in the Journal of Neural Engineering, a team of neural engineers at the University of Pittsburgh describe a new algorithm for deciphering brain signals in BCIs. By applying machine learning techniques to data recorded from an implanted BCI, the researchers improved cursor control and computer accessibility for people who are unable to move a mouse physically.

Decoding brain signals

Our thoughts and behaviours arise from trains of action potentials flashing across vast, interconnected networks of neurons. A BCI measures those brain signals, analyses them for certain features, and then translates the extracted features into commands to execute the desired action. Thus, BCIs circumvent the normal output pathways of muscles, which illness or infirmity may compromise.
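The measure–analyse–translate loop described above can be sketched in a few lines of code. Everything below (the bin width, the mean-rate feature and the click threshold) is an illustrative placeholder, not the actual pipeline used in the study:

```python
# Schematic sketch of a BCI loop: measure neural activity, extract a
# feature, translate it into a command. All names and numbers here are
# invented for illustration only.

def measure(raw_samples, bin_width=10):
    """Stand-in for signal acquisition: return binned spike counts."""
    return [sum(raw_samples[i:i + bin_width])
            for i in range(0, len(raw_samples), bin_width)]

def extract_feature(binned_counts):
    """Stand-in feature extraction: mean firing rate across bins."""
    return sum(binned_counts) / len(binned_counts)

def translate(rate, threshold=5.0):
    """Map the extracted feature to a cursor command, bypassing
    the normal muscular output pathway entirely."""
    return "click" if rate > threshold else "idle"

raw = [0, 1, 1, 0, 2, 1, 0, 1, 1, 2] * 4   # toy spike train
command = translate(extract_feature(measure(raw)))
print(command)  # "click"
```

A real decoder replaces each stage with far richer models fitted during the training periods mentioned above, but the three-stage structure is the same.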

BCIs are not mind-reading devices; they do not extract information from unsuspecting or unwilling users. Instead, the user “works with” the BCI via their brain signals, so that they can actively participate in the world without using their muscles. Making sense of the massive datasets generated by these brain signals requires training periods, which also help us understand some of the brain’s mysterious operations.

First author Brian Dekleva, from the university’s Rehab Neural Engineering Labs (RNEL), used surgically implanted BCIs to decode movement intent in two people with quadriplegia, with the aim of improving the functionality of BCIs for computer access.

Dekleva and colleagues began their investigation by deconstructing a hand grasp. They employed a well-established machine-learning technique, hidden Markov models, to rigorously characterize the three sub-routines that make up a grasp action: deciding to grasp (onset), holding the grasp (sustained) and deciding to release (offset).

What sets the researchers’ BCI apart is that it doesn’t just examine the persistent neural signals generated when we want to move or click a cursor. Instead, their decoder looks at transitions between states, which are detected more reliably than a sustained response. The team’s technique is also “generalizable”, because it is suitable for a variety of computer applications that require point-and-click or click-and-drag functionality.
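The three-state grasp model can be sketched as a toy hidden Markov model decoded with the standard Viterbi algorithm. This is not the team's actual decoder: the transition and emission probabilities below are invented, and a real system would compute emission likelihoods from recorded neural features.

```python
import math

STATES = ["onset", "sustained", "offset"]

# Left-to-right structure: a grasp moves from onset through a sustained
# hold to offset, with self-loops. All probabilities are illustrative.
TRANS = {
    "onset":     {"onset": 0.6, "sustained": 0.4, "offset": 0.0},
    "sustained": {"onset": 0.0, "sustained": 0.9, "offset": 0.1},
    "offset":    {"onset": 0.0, "sustained": 0.0, "offset": 1.0},
}

def viterbi(emissions):
    """Most likely state sequence, given per-bin emission likelihoods
    (a list of {state: P(observation | state)} dicts)."""
    EPS = 1e-300  # guard against log(0)
    # A grasp begins at onset, so all initial probability mass sits there.
    best = {s: float("-inf") for s in STATES}
    best["onset"] = math.log(emissions[0]["onset"] + EPS)
    backpointers = []
    for em in emissions[1:]:
        new, ptr = {}, {}
        for s in STATES:
            score, prev = max(
                (best[p] + math.log(TRANS[p][s] + EPS), p) for p in STATES
            )
            new[s] = score + math.log(em[s] + EPS)
            ptr[s] = prev
        best = new
        backpointers.append(ptr)
    # Trace the best path backwards from the most likely final state.
    state = max(best, key=best.get)
    path = [state]
    for ptr in reversed(backpointers):
        state = ptr[state]
        path.append(state)
    return path[::-1]

# Toy emission likelihoods: a burst (onset), a plateau (hold), a release.
obs = [
    {"onset": 0.80, "sustained": 0.10, "offset": 0.10},
    {"onset": 0.30, "sustained": 0.60, "offset": 0.10},
    {"onset": 0.10, "sustained": 0.70, "offset": 0.20},
    {"onset": 0.05, "sustained": 0.05, "offset": 0.85},
]
print(viterbi(obs))  # ['onset', 'sustained', 'sustained', 'offset']
```

The decoded sequence makes the state *transitions* (onset → sustained, sustained → offset) explicit, which is what the team's approach keys on: a crossing between states is a sharper, more reliable event to detect than the level of a sustained response.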

Using the BCI with the new decoding algorithm, the study participants could smoothly sweep their cursors across a monitor, be it for a creative outlet (like painting a digital work of art) or something more routine (like simply dragging a file to the trash).

A promising track record

Most research in neural engineering pursues clinically useful systems for people with significant, disabling impairments. In May 2021 the RNEL team published a proof-of-principle for a bidirectional BCI – a type of BCI that can not only read data out of the brain but also write data into it. In other words, a bidirectional BCI enables patients with paralysis to control a robotic arm with their thoughts and also feel how hard that robotic arm is clutching an object.


BCI technology promises to enhance the quality of life for people with paralysis by improving their autonomy and mobility. Jennifer Collinger, the senior author on this latest study and one of the lead architects of the bidirectional BCI, hopes that these results can inform the development of clinical BCI technology – an area experiencing rapid growth within the biotech industry.

The team’s latest experiment was also proof-of-concept for remote clinical trial participation with BCI tech in the home, with one of the participants performing most of the study’s training sessions at home without assistance from the researchers. This is a critical step towards clinical translation.

The study’s success in enabling a natural and generalizable control scheme for computer access provides increasing evidence that BCI studies no longer need to be restricted to an on-site lab. “The pandemic accelerated our plans for in-home testing, but this has been a goal for a long time,” explains Collinger. “We need to get the technology into real-world environments… We just want study participants to be able to do the things they want to do with a BCI.”

Copyright © 2024 by IOP Publishing Ltd and individual contributors