Three Life-Changing Areas of Neural Technology

Engineers are starting to work much more closely with neuroscientists and other researchers to develop and perfect technologies that are truly brain-like: more biological and multidimensional.

This collaboration promises breakthroughs in at least three broad areas, described below. Mechanical engineers are an important part of these teams because they understand how things work and how structures interact in an environment full of complex mechanical movement.

‘Smarter’ Medical Devices

The most developed area is work that results in “smarter medical devices,” says Bradley Greger, principal investigator at the Neural Engineering Lab at Arizona State University. From collaboration among experts in various fields, a consensus is emerging that neural interfaces need to integrate sensory and motor components. “For decades they were talked about individually. But people are saying, ‘We can’t think about it that way,’” and work on interfaces that handle motor and sensory control simultaneously is underway. The goal is to replicate how the brain works, producing devices that can read electrical and chemical signals from the nervous system and respond much more the way the human body does.

A synergy-based prosthetic hand, called the SoftHand Pro. Image: Jessica Hochreiter / ASU

Sensory and motor are tightly interwoven at the neural level, Greger says. “I can’t move if I don’t have good sensation. And I have to move to get good sensation. If I want to move my hand, I can’t do that without sensory input as a guide. It’s the neural interfaces, the physical connections to the brain, that are going to let us do that.”

He says one straightforward example is the control of a robotic arm for someone paralyzed. Current technology is guided by vision. The person has to look and pay careful attention to what they are doing because a sense of touch or arm position is not built into the device. The person thinks of what they want to do, but has to look and see to accomplish it. “That’s not how we really move,” he points out. “The arm has a massively complex sensory system that we unconsciously have access to that helps control our movements without thinking about it.”

There now are groups, including Greger’s own lab, working on neurally controlled prostheses that incorporate sensors in the arm feeding back into the brain to provide a sense of touch and arm position.

Another project at his lab involves a vision-restoration prosthesis for people who are blind. The user is fitted with a camera that connects directly into the visual-processing areas of the brain. “That seems like simple sensory access,” Greger says. “But it’s very complex. Your sense of vision is controlled by and intimately linked with how you move your eyes. The visual system has to know what your eyes are doing and how you move them in order for you to process that information.”

Another device being tested is an implant to control epileptic seizures. “It’s a breakthrough in the sense that they are helping people right now. It looks very promising,” Greger says.

Finally, another potentially life-enhancing and promising technology is an implant for people with chronic pain not relieved by medications. The device delivers electrical impulses to the spinal cord to mask pain signals before they reach the brain.

Better Artificial Intelligence

So much of the terminology used in artificial intelligence comes from neural technology—“neural nets,” “brain-like,” “cognitive”—Greger says. Even so, current artificial intelligence work captures only one level of the processing that goes on in the brain: individual groups of neurons, cells that talk to each other through pulses of electricity.

There are other interactions, mediated by electrical fields, that are part of the computational process, and the complicated architecture of neural circuits also needs to be considered, he says. He likens the work to building a model of a Ferrari and expecting it to behave like a real Ferrari. “That’s not going to happen. You are trying to interface with the brain without taking into account how the structure really functions, that it’s multidimensional, multi-scale and squishy, in terms of both its physical structure and function. We keep trying to put it in non-biological form. Once we start taking biological factors into account, it will give people a lot more knowledge about problem solving for intelligent control of machines,” Greger says.

Building Protoplasmic Circuits

The third big area—neurotechnology that is truly biological—is “where it gets really crazy,” Greger says. “We are seeing the beginning of this with stem cells and artificial tissues. As the fields of cellular neurobiology and systems biology mature, we can start building complex cellular structures and cellular-like tissues, replications of areas of the brain for replacement or for controlling circuits.”

Researchers have started building very simple protoplasmic circuits and programming them to perform certain functions, pointing toward autonomous cars driven by protoplasm, a little brain instead of an electrical circuit. “It would be much more powerful in terms of its computational ability than a digital circuit,” he says. “We are decades away from seeing your Amazon delivery drone controlled by protoplasm, but it’s not total science fiction.”

Nancy S. Giges is an independent writer.

