It’s been over a decade since artificial retinas first began helping the blind see. But for many people whose blindness originates beyond the retina, the technology falls short. Which is why new research out of Spain skips the eye entirely, instead sending signals straight to the brain’s visual cortex.

Amazingly, 15 years after losing her sight, Bernardeta Gómez, who suffers from toxic optic neuropathy, used the experimental technology to recognize lights, letters, shapes, people—and even to play a basic video game sent directly to her brain via an implant.

According to MIT Technology Review, Gómez first began working with researchers in late 2018. Over the next six months, she spent four days a week dialing in the technology’s settings and testing its limits.

The system, developed by Eduardo Fernandez, director of neuroengineering at the University of Miguel Hernandez, works like this:

A camera embedded in a pair of thick, black-rimmed glasses records Gómez’s field of view and sends it to a computer. The computer translates the data into electrical impulses the brain can read and forwards them to a brain implant by way of a cable plugged into a port in the skull. The implant stimulates neurons in Gómez’s visual cortex, which her brain interprets as incoming sensory information. Gómez perceives a low-resolution depiction of her surroundings in the form of yellow dots and shapes called phosphenes, which she’s learned to interpret as objects in the world around her.
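
For readers who like to see the idea in code, here is a minimal sketch of that glasses-to-implant pipeline, assuming a simple brightness-threshold encoding and a 10-by-10 phosphene grid (matching the 100-electrode array and resolution described later in this article). The function name and all the numbers are illustrative; Fernandez’s actual encoding software is far more sophisticated and not public.

```python
import numpy as np

GRID = 10  # phosphene grid per implant (10 x 10 "pixels"), per the resolution cited below

def frame_to_phosphene_pattern(frame, threshold=0.5):
    """Downsample a grayscale camera frame (2D array, values 0-1) into a
    GRID x GRID map of which electrodes to pulse. Purely illustrative."""
    h, w = frame.shape
    # Trim so the frame divides evenly, then average each block into one coarse "pixel."
    coarse = frame[: h - h % GRID, : w - w % GRID]
    coarse = coarse.reshape(GRID, h // GRID, GRID, w // GRID).mean(axis=(1, 3))
    # Bright regions become active phosphenes (electrodes switched on).
    return coarse > threshold

# Example: a bright vertical bar in an otherwise dark 480x640 frame.
frame = np.zeros((480, 640))
frame[:, 256:320] = 1.0
pattern = frame_to_phosphene_pattern(frame)
print(pattern.astype(int))  # 10x10 grid of 0s and 1s forming a vertical stripe
```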

The technology itself is still very much in the early stages—Gómez is the first to test it—but the team aims to work with five more patients in the next few years. Eventually, Fernandez hopes their efforts can help return sight to many more of the world’s blind people.

A Brief History of Artificial Eyes


This isn’t the first time researchers have used technology to help the blind see again.

Roughly two decades ago, the Artificial Retina Project brought together a number of research institutions to develop a device for those suffering retina-destroying diseases. The work resulted in the Argus systems, which, like Fernandez’s system, use a camera mounted on glasses, a computer to translate sensory data, and an implant with an array of electrodes embedded in the retina (instead of the brain).

Over the course of about a decade, researchers developed the Argus I and Argus II systems, ran them through human trials, and gained approval in Europe (2011) and the US (2013) to sell their bionic eyes to eligible individuals.

According to MIT Technology Review, around 350 people use Argus II today, but the company marketing the devices, Second Sight, has pivoted from artificial retinas to the brain itself because far more people, like Gómez, suffer from damage to the neural pathways between eyes and brain.

Just last year, Second Sight was involved in research with UCLA and Baylor testing a system that also skips the retina and sends visual information straight to the brain.

The system, called Orion, is similar to Argus II. A feed from a video camera mounted on dark glasses is converted to electric pulses sent to an implant that stimulates the brain. The device is wireless and includes a belt with a button to amplify dark objects in the sun or light objects in the dark. Like Fernandez’s system, the user sees a low-resolution pattern of phosphenes they interpret as objects.

"I'll see little white dots on a black background, like looking up at the stars at night," said Jason Esterhuizen, who was the second research subject to receive the device. "As a person walks toward me, I might see three little dots. As they move closer to me, more and more dots light up."

Though the research is promising—it's designated an FDA Breakthrough Device and is being trialed with six patients—Dr. Daniel Yoshor, study leader and neurosurgeon, cautioned the Guardian last year that it's "still a long way from what we hope to achieve."

The Road Ahead


Brain implants are far riskier than eye implants, and if the original Argus system is any indication, it may be years before these new devices are used widely beyond research.

Still, brain-machine interfaces (BMIs) are quickly advancing on a number of fronts.

The implant used in Fernandez’s research is a fairly common device called a Utah array. The square array is a few millimeters wide and contains 100 electrode spikes which are inserted into the brain. Each spike stimulates a few neurons. Similar implants have helped paralyzed folks control robotic arms and type messages with just their thoughts.

Though they’ve been the source of several BMI breakthroughs, the arrays aren’t perfect.

The electrodes damage surrounding brain tissue, scarring renders them useless all too quickly, and they only interact with a handful of neurons. The ideal device would be wireless, last decades in the brain—limiting the number of surgeries needed—and offer greater precision and resolution.

Fernandez believes his implant can be modified to last decades, and while the current maximum resolution is 10 by 10 pixels, he envisions one day implanting as many as six on each side of the brain to deliver a resolution of at least 60 by 60 pixels.

In addition, new technologies are in the works. Famously, Elon Musk’s company Neuralink is developing soft, thread-like electrodes that are deftly laced into brain tissue by a robot. Neuralink is aiming to include 3,000 electrodes on its device to chat up far more neurons than is currently possible (though it’s not clear how many more neurons would actually add value). Still other approaches, likely further out, do away with electrodes altogether, using light or chemicals to control gene-edited neurons.

Fernandez’s process also relies on more than just the hardware. The team used machine learning, for example, to write the software that translates visual information into neural code. This can be further refined, and in the coming years, as they work on the system as a whole, the components will no doubt improve in parallel.
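
The article doesn’t say what form that machine learning takes, but the general shape of the problem—learning which stimulation produces which percept, then inverting that mapping—can be sketched with a toy example. Everything below, including the calibration numbers and the current_for_brightness helper, is invented for illustration and is not the team’s software.

```python
import numpy as np

# Toy calibration for a single electrode: hypothetical stimulation currents
# and the brightness a patient might report for each (scaled 0-1).
currents = np.array([20.0, 40.0, 60.0, 80.0, 100.0])   # microamps (invented)
reported = np.array([0.0, 0.2, 0.45, 0.7, 0.9])        # reported brightness (invented)

# Fit a simple linear model: brightness ~ a * current + b.
a, b = np.polyfit(currents, reported, deg=1)

def current_for_brightness(target):
    """Invert the fitted model to pick a stimulation current for a target brightness."""
    return float(np.clip((target - b) / a, currents.min(), currents.max()))

print(round(current_for_brightness(0.5), 1))  # current predicted to evoke mid brightness
```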

But how quickly it all comes together in a product for wider use isn't clear.

Fernandez is quick to dial back expectations—pointing out that these are still early experiments, and he doesn’t want to get anyone’s hopes up. Still, given the choice, Gómez said she’d have elected to keep the implant and wouldn’t think twice about installing version two.

“This is an exciting time in neuroscience and neurotechnology, and I feel that within my lifetime we can restore functional sight to the blind,” Yoshor said last year.

Image Credit: Harry Quan / Unsplash. By Jason Dorrier. This article originally appeared on Singularity Hub, a publication of Singularity University.
