Game Theory Helps Robot Design

New research could create applications for robots working in contact with humans.

An unsupervised robot must have two fundamental skills to work closely and safely with a human on a complex task in real time. It must be able to recognize human behaviors and respond to them automatically and appropriately.

Now researchers at Imperial College London, the University of Sussex, and Nanyang Technological University in Singapore are the first to exploit game theory in designing a physically interactive robot that adapts to changing human behavior.

A human subject adopts different roles during arm-reaching movements while interacting with the robotic interface. Image: Courtesy of the researchers

“We have developed an algorithm based on game theory that enables a robot to identify the human’s behaviors and then automatically adjust its own behavior to complete a task,” said lead author Yanan Li of the University of Sussex, who conducted the work while at Imperial College London’s Department of Bioengineering. The research was funded by the European Commission.

This game-theory framework could be used someday for robotics in sports training, injury rehabilitation, or assisted driving.

Existing robots can provide rehabilitation assistance by making a task easier for people to complete. But when robots only provide assistance, some patients start to slack off and let the robot do all the work, so their rehabilitation doesn’t progress.

Robots are also used to make a rehabilitation task more difficult by providing a challenge. But if the task is too hard, patients may not be able to perform it at all, and again they don’t improve.

A single robot can be programmed to provide either assistance or a challenge, but until now no robot has been able to transition smoothly between assisting and challenging without being reprogrammed.

Li’s paper, recently published in Nature Machine Intelligence, shows how a robot controller can transition between these rehabilitation strategies by exploiting game theory to identify a human’s strategy. In game theory, multiple players compete or collaborate to complete a task. Each player tries to optimize their performance, while assuming their opponents will also play optimally.
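
To make that idea concrete, here is a minimal sketch, assuming a single shared handle and simple quadratic costs, of how two players who each optimize their own effort settle on a joint strategy. The weights, function names, and iteration scheme below are illustrative assumptions, not the controller described in the paper.

```python
# Illustrative two-player quadratic game (not the authors' formulation):
# robot and human each push a shared handle toward a target displacement,
# and each minimizes its own tracking error plus its own control effort.

def best_response(other_input, target, q, r):
    """Optimal input given the other player's input, minimizing
    q*(u_self + u_other - target)**2 + r*u_self**2 analytically."""
    return q * (target - other_input) / (q + r)

def nash_inputs(target, q_robot, r_robot, q_human, r_human, iters=50):
    """Iterate best responses until the pair of inputs settles (a Nash equilibrium)."""
    u_robot, u_human = 0.0, 0.0
    for _ in range(iters):
        u_robot = best_response(u_human, target, q_robot, r_robot)
        u_human = best_response(u_robot, target, q_human, r_human)
    return u_robot, u_human

# Example: a "weak" human (high effort penalty r_human) leaves most of the
# work to the robot at equilibrium.
print(nash_inputs(target=1.0, q_robot=10.0, r_robot=1.0, q_human=10.0, r_human=5.0))
```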

First, the robot controller is programmed to perform a reaching task with a handle.

“The robot’s motor predicts its reaching motion—how far the handle will move—because it knows how much input to the motor will create that motion,” Li said.

Next, the robot controller is programmed to track how much force a human applies on the handle to move it.

“The robot recognizes that the motion of the handle, when the human is trying to move it, is different from what the robot does alone,” Li said. “Based on this difference, the robot will know how much of the input is from the human. The robot uses the difference between its own motion and the actual motion during the human-robot interaction to estimate the human’s strategy.”
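
A rough sketch of that estimation step, assuming simple point-mass dynamics for the handle; the dynamics, names, and numbers here are illustrative assumptions, not the published controller.

```python
# Minimal sketch: attribute the difference between predicted and observed
# handle motion to the human's input. MASS and DT are assumed constants.

MASS, DT = 1.0, 0.01  # assumed handle mass (kg) and control period (s)

def predict_velocity(v, u_robot):
    """Velocity the robot expects from its own motor command alone."""
    return v + DT * u_robot / MASS

def estimate_human_force(v, v_observed, u_robot):
    """Infer the human's force from the part of the motion the robot did not cause."""
    v_expected = predict_velocity(v, u_robot)
    return MASS * (v_observed - v_expected) / DT

# Example: the handle sped up more than the robot's command explains,
# so the extra acceleration is credited to the human.
print(estimate_human_force(v=0.0, v_observed=0.03, u_robot=2.0))  # ~1.0 N
```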

Researchers tested the robot controller in physical-rehabilitation simulations and in experiments with human-robot interactions. In simulations, the robot could adapt when a human’s capability changed slowly or when the human made erratic progress. In human experiments, the robot aided healthy individuals by increasing assistance when the user was not strong enough to complete the task. The robot could also automatically switch from an assistance to a challenge strategy as the human’s strength improved.

The game-theory-based system allows the robot to assess where a human’s needs fall along the spectrum from assistance to resistance and to tune the controller automatically. The controller gathers data on how effectively the human-robot interaction is achieving its goals.

As the robot determines the appropriate balance between the two strategies, assistance and resistance, it can track the patient’s progress and estimate how much to increase either one.
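
One way such a tuning rule could look, as a minimal sketch rather than the authors’ actual update law: a single assistance gain is nudged up when the estimated human contribution is low and down, eventually into resistance, as the human takes over more of the work. All names and constants below are assumptions.

```python
# Illustrative adaptation of a single assistance gain from the estimated
# share of the work the human is doing; a negative gain means the robot
# resists (challenges) instead of assisting.

def update_assistance(gain, human_share, target_share=0.7, rate=0.5,
                      min_gain=-1.0, max_gain=1.0):
    """Nudge the gain toward whatever keeps the human doing roughly
    `target_share` of the work, clipped to a safe range."""
    gain += rate * (target_share - human_share)
    return max(min_gain, min(max_gain, gain))

gain = 0.5
for human_share in [0.2, 0.5, 0.8, 0.95, 1.0]:  # human gradually improving
    gain = update_assistance(gain, human_share)
    print(f"human share {human_share:.2f} -> assistance gain {gain:+.2f}")
```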

“This is an important paper,” said Lena H. Ting, who specializes in the neural control and biomechanics of human movement at the Georgia Institute of Technology’s Institute for Robotics and Intelligent Machines. She and Luke Drnach, a graduate student at Georgia Tech, published a companion explanatory article about Li’s study in the same issue of Nature Machine Intelligence. They did not participate in the Li team’s work.

The game-theory framework yields theoretical insights that could help the field of physical human-robotic interactions move forward.

“In our research, we are studying the principles of how humans move and interact physically with each other in order to understand how assistive robots should best interact with people,” Ting said. “We want to understand conscious and unconscious physical cues that occur between people, so that robots can also have this natural, intuitive physical interaction with people. We want robots to get accurate information from people that allows them to modify their own behaviors.”

Future studies, Ting and Drnach noted in their article, could extend this game-theory framework to teams of robots helping humans with dangerous or difficult tasks, or to robots that interact with multiple joints of a human, such as robotic gait trainers and exoskeletons.

Next, Li’s team will apply the interactive control behavior to robot-assisted neurorehabilitation and to driving in semi-autonomous vehicles.

John Tibbetts is a freelance writer based in Charleston, S.C.
