AI Helps Soldiers Bark Orders Telepathically to Robot Dogs


The latest robotic controls don’t use implants like other brain-machine interface technologies. Instead, the system, aided by machine learning, reads the soldier’s brain signals to control robot dogs.
Artificial intelligence is helping soldiers in the Australian Army tell robot dogs where to move and how to act without saying a word or making any motions.  
 
The latest in robotic controls doesn’t use implants like other brain-machine interface (BMI) technologies; instead, the system reads minds. 
 
The advantages of such a scheme are clear. In combat, fighters and robots can move together silently and coordinate without gestures, leaving soldiers free to handle their weapons and stay aware of everything moving on the battlefield.  
 

Brain-machine interface

 
The system seeks to replace how soldiers interact with tactical mission robots. Until now, a soldier had to focus on a screen and manually instruct the machine using a hand-held controller. On a battlefield where situations are fluid, a robot operator tied to a screen cannot look up, interact with teammates, or respond intuitively. Moreover, that soldier already has to carry plenty of heavy equipment.

Researchers command a robot dog through concentration alone, using wearable neural sensors. Photo: University of Technology Sydney
“The precision that the soldier can operate the quadruped robot is determined by the quality of the signals collected by the sensors, the system noise, and the AI classification accuracy,” explained Francesca Iacopi, a University of Technology Sydney (UTS) nanotechnology researcher in the Faculty of Engineering and IT, of the new BMI system that, so far, is 94 percent accurate.  

She explained that trained soldiers use the brain-robot interface to send commands “telepathically” to a robotic quadruped from Ghost Robotics. The “dog” is told, via brain signals, for example, to cross an open field toward a series of destinations or to work with its team to “clear” a succession of buildings.  
 
The soldier is able to command the robot dog using only the power of concentration, explained Iacopi, who initially outlined the work on wearable neural sensors for BMIs in the journal Progress in Biomedical Engineering, in the paper “A perspective on electroencephalography sensors for brain-computer interfaces.”
 

Essentially, the user is prompted with flickering squares, corresponding to real-world waypoints or other commands, that appear on the soldier’s AR lens. These “flickers” vary in frequency. The biosensors, located at the back of the head, detect biopotentials from the visual cortex that arise when the soldier gazes at a particular flicker. The decoder platform identifies, classifies, and ultimately translates that signal into electronic commands that the robot dog can receive.  
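
The article doesn’t disclose the decoding algorithm, but flickers at distinct frequencies read from the visual cortex is the classic steady-state visually evoked potential (SSVEP) setup. The Python sketch below illustrates the idea under that assumption; the frequencies, sampling rate, and command names are invented for the example, not taken from the UTS system.

```python
import numpy as np

# Hypothetical flicker frequencies (Hz), one per AR command square.
COMMAND_FREQS = {7.0: "forward", 9.0: "turn_left", 11.0: "turn_right", 13.0: "stop"}
FS = 256  # assumed EEG sampling rate, samples per second

def decode_gazed_command(eeg_window: np.ndarray) -> str:
    """Guess which flicker the soldier is gazing at from one EEG window.

    eeg_window: 1-D array of visual-cortex samples (length = FS * seconds).
    Returns the command whose flicker frequency carries the most power.
    """
    windowed = eeg_window * np.hanning(len(eeg_window))
    spectrum = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(len(eeg_window), d=1.0 / FS)
    best = max(
        COMMAND_FREQS,
        # total power in a narrow band around each candidate frequency
        key=lambda f: spectrum[(freqs > f - 0.5) & (freqs < f + 0.5)].sum(),
    )
    return COMMAND_FREQS[best]

# Quick check with a synthetic 2-second window dominated by the 9 Hz flicker.
t = np.arange(2 * FS) / FS
fake_eeg = np.sin(2 * np.pi * 9.0 * t) + 0.5 * np.random.randn(t.size)
print(decode_gazed_command(fake_eeg))  # expected: "turn_left"
```

A deployed system would weigh relative power across multiple electrodes and harmonics, but the band-power comparison captures why distinct flicker frequencies make the gazed-at target separable.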
 

Biosensors and decoder

 
The wearable BMI system is made of two fundamental parts, biosensor hardware and a decoder, and is very different from Elon Musk’s implantable technology, Neuralink.

The biosensors, worn on the head to detect electrical signals from the brain, are made of epitaxial graphene (multiple layers of very thin, strong carbon) grown directly onto a silicon-carbide-on-silicon substrate.  
 
The challenges to overcome here were corrosion, durability, and sustaining optimal contact between the sensor and the skin so that the tiny electrical signals from the brain can be detected.  
 

“We’ve been able to combine the best of graphene, which is very biocompatible and very conductive, with the best of silicon technology, which makes our biosensor very resilient and robust to use,” Iacopi said. 
 
Her colleague Chin-Teng Lin, a distinguished professor in the same faculty at UTS, worked on the AI brain-decoding technology. It is the decoder that translates the brain’s information into instructions, such as stop, turn right, and turn left, that the machine can understand. 
 
He took on the task of developing the interface that translates the brain’s electrical signals into a format the robot can accept and respond to; the fidelity of that translation largely determines the accuracy of the system. 
 
He and his team have achieved two major breakthroughs so far. First, they figured out how to minimize the noise that every human body creates, along with the noise of the surrounding environment. This is essential if the system is to succeed in real-world scenarios.  
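
The article doesn’t specify how that noise is removed. A common first pass for wearable EEG, sketched hypothetically below with SciPy, is a notch filter at the mains frequency (50 Hz in Australia) followed by a band-pass that keeps the range where visually evoked activity lives; the cutoff choices are illustrative assumptions, not the UTS team’s published parameters.

```python
import numpy as np
from scipy.signal import butter, filtfilt, iirnotch

FS = 256  # assumed EEG sampling rate (Hz)

def clean_eeg(raw: np.ndarray) -> np.ndarray:
    """Suppress common noise sources in a raw 1-D EEG signal.

    Applies a 50 Hz notch (mains interference) and a 1-40 Hz band-pass
    that rejects slow electrode drift and much of the high-frequency
    muscle artifact, while keeping typical visually evoked activity.
    """
    b_notch, a_notch = iirnotch(w0=50.0, Q=30.0, fs=FS)
    x = filtfilt(b_notch, a_notch, raw)
    b_band, a_band = butter(N=4, Wn=[1.0, 40.0], btype="bandpass", fs=FS)
    return filtfilt(b_band, a_band, x)
```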
 
The second challenge was to increase the number of commands that the decoder can deliver within a set period of time. Current brain-computer technology issues only two or three commands, such as turn left, turn right, or go forward. This technology can issue at least nine commands every two seconds.   
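
One reading of that claim is a loop that picks one of nine possible commands each two-second decoding window. The sketch below shows that structure; the `decoder.classify()` and `robot_link.send()` calls are assumed interfaces, since the real UTS and Ghost Robotics APIs are not public, and the command names are invented for illustration.

```python
import time

# Illustrative set of nine command classes; the article confirms the
# count, not the names.
COMMANDS = ["stop", "forward", "back", "turn_left", "turn_right",
            "crouch", "stand", "speed_up", "slow_down"]

WINDOW_S = 2.0  # one decoding decision per two-second window

def run_control_loop(decoder, robot_link):
    """Poll the decoder once per window and forward the result."""
    while True:
        start = time.monotonic()
        class_idx = decoder.classify()        # assumed: returns 0..8
        robot_link.send(COMMANDS[class_idx])  # assumed transport call
        # Hold a fixed cadence of one command per window.
        time.sleep(max(0.0, WINDOW_S - (time.monotonic() - start)))
```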
 

AI is key 

 
Voice commands and hand signals are well-established alternative technologies, but BMIs go a step further and directly translate human intent into zeros and ones. According to Iacopi, the biopotentials detected in this wearable system come from “collective oscillations of neurons in the brain, up to millions.” Further, the brain is constantly active, and it is the system’s job to identify, recognize, and classify the elicited event, which makes machine learning key to the success of the project.  
 
“Both sensors and AI are both key parts of the work,” Iacopi explained. “Particularly, without proper algorithms to read-out the biopotentials/brain waves and recognize the targeted/elicited event, none of this work would be possible, and without accurate sensor obviously the same would be true.” 
 
The potential of the system on the battlefield seems limitless, since the BMI could also direct aerial drones, UAV swarms, autonomous ground weaponry, and armies of tiny advanced robots. But the prototype also has huge potential for applications across other industries, such as medicine and disability assistance. 
 
It doesn’t take any special skill to tell the robot dogs what to do. “Anyone can learn to use the BMI system,” Iacopi explained. Its use of an off-the-shelf HoloLens headset and a hybrid Raspberry Pi-based AI decoder makes it accessible to young soldiers, who can get up to speed quickly and easily.
 
“Time will tell” whether the system will translate well to uses outside the military, but a “fully wearable system could mean that the scope could be extended to consumer electronics of various types” and could change how we interact with all electronics, she explained. 
 
The collaboration will continue to be funded through the Defence Innovation Hub of the Commonwealth of Australia. Overall, the research is a collaboration between the UTS researchers, the Robotic and Autonomous Systems Implementation and Coordination Office (RICO) of the Australian Army’s Future Land Warfare Branch, and the Defence Science and Technology Group.  
 
Cathy Cecere is Membership Content Program Manager. 
 
