Are You Ready to Trust Robotic Systems?

In the future, we’ll depend on robots for everything from driving to construction. Researchers are looking at ways to program robots so that we trust them with these duties.
Self-driving vehicles, small robots on production lines, drones flying rescue missions, even robots that keep older people company: these are some of the proposed, not-so-futuristic ways robots will aid us.
But none of those scenarios will be possible without trust. Humans need to feel secure enough around robots and robotic systems to rely on them.
 
“Building human-robot trust into autonomous robotic systems like self-driving vehicles is key to the systems’ success,” said Ryan Williams, a Virginia Tech assistant professor of electrical and computer engineering.
 
“As we readily observe in human teams, collaboration without trust is often ineffective or even counterproductive,” he said.
 
Other research institutions worldwide are also looking at how humans can build confidence in robotic systems, whether the robot drives the bus or greets you warmly at your local bank.
 
“Our primary focus is on investigating how we can create processes that will build trust in these systems, rather than just building the technologies themselves,” said Xi Jessie Yang, University of Michigan assistant professor of industrial and operations engineering. She has spent her career studying the subject.
 
“We need to be confident that we have considered all aspects of what is needed to make them work effectively in the real world, and doing this requires we start at the beginning of the development process,” she said.
 
Below, we look at five types of robots that scientists and engineers hope we can accept without question.

 

Search and Rescue Drones

 
Tracking a hiker lost amid miles of wilderness isn’t a job for one person or even for teams of people. Drones can cover much more ground and get an aerial view down into the trees.
 
Williams’ team at Virginia Tech is developing autonomous drones that work alongside human searchers. In the inherently tense, ever-changing atmosphere of a search-and-rescue mission, humans need to know they can rely on drone feedback. If a drone captures an image of what could be a lost hiker, searchers who trust the image respond quickly, while a team that is suspicious of it may dismiss it as an errant ray of light.

 
That trust comes from how the software is programmed. To develop trust algorithms, the researchers ran a mock missing-person search and compared how human searchers and drones behaved in the same situations. The data from that comparison lets them further fine-tune the trust algorithms the drones use.
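
To make the idea concrete, one simple way to formalize that kind of trust is as a running estimate that rises when human searchers confirm a drone’s sightings and falls when they reject them. The Python sketch below is purely illustrative and is not the Virginia Tech team’s algorithm; the class, update rule, and prior counts are assumptions.

```python
# Illustrative sketch only: a minimal trust estimator for drone detections,
# not the Virginia Tech team's algorithm. Names and update rule are assumed.

class DroneTrustEstimator:
    """Tracks trust in a drone's detections with confirm/reject pseudo-counts.

    Each time a human searcher confirms or rejects a drone's sighting,
    the estimate is updated, so trust reflects the drone's track record.
    """

    def __init__(self, prior_confirms: float = 1.0, prior_rejects: float = 1.0):
        self.confirms = prior_confirms   # pseudo-count of confirmed sightings
        self.rejects = prior_rejects     # pseudo-count of rejected sightings

    def update(self, human_confirmed: bool) -> None:
        """Record whether a human searcher confirmed the drone's report."""
        if human_confirmed:
            self.confirms += 1
        else:
            self.rejects += 1

    @property
    def trust(self) -> float:
        """Estimated probability that the next drone report is correct."""
        return self.confirms / (self.confirms + self.rejects)


if __name__ == "__main__":
    estimator = DroneTrustEstimator()
    # Mock mission: drone flags four sightings; searchers confirm three of them.
    for confirmed in (True, True, False, True):
        estimator.update(confirmed)
    print(f"Estimated trust in drone reports: {estimator.trust:.2f}")
```

In this toy version, trust is simply the fraction of drone reports that searchers have confirmed, smoothed by a small prior; a field system would weigh many more factors.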
 

Manufacturing: A Face with a Name

 
“The human workforce quickly accepts Sawyer, thanks to his friendly design,” according to Rethink Robotics, maker of the well-known Sawyer cobot, a collaborative robot designed to work alongside humans on manufacturing lines.
 
Rethink Robotics’ Sawyer Black Edition is equipped with a human-like face (eyes and eyebrows) that changes expression (bewilderment, for example, if it doesn’t understand a task). When Sawyer is about to move an arm toward an object, it first looks toward the object, just as a human would. The company also gave Sawyer and its predecessor, Baxter, human-like names.
 
Endowing Sawyer with human-like qualities helps fellow workers feel safe and secure around a cobot that may be standing just feet away, said Daniel Bunse, Rethink Robotics’ chief executive officer.
 

Aware of Autonomous Vehicles

 
Passengers in an autonomous vehicle (AV) need to know what the AV senses, including bright sun, dusk, or rain. This gives passengers confidence that the AV is constantly aware of its surroundings, said Jack Weast, senior principal engineer at Intel and vice president of Automated Vehicle Standards at Mobileye. He and two other Intel executives have studied trust-building in AVs and written papers on the subject.

 
Offering a variety of communication methods is important, Weast said. Voice interactions, large displays, smaller touchscreens, and the passengers’ own mobile devices can all work in different ways to help passengers notice and understand information.
 
“This is particularly meaningful because passenger attention is likely to be focused on other activities when driving is no longer necessary,” Weast said.
 

Social Services

 
Robots with social skills would make for more positive human-robot interactions. For instance, a robot in an assisted living facility could become a companion of sorts for an elderly person.
 
Researchers at the Massachusetts Institute of Technology are programming robots to perform some of the instinctive niceties of human interaction in a quest to endow machines with a measure of social skill. Their study set up a simulated environment in which one robot infers what a companion robot is trying to accomplish and then helps it.
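
As a rough illustration of that setup, and not CSAIL’s actual model, the sketch below has a helper agent infer which of two candidate goals a companion is moving toward by watching its positions; the goal names, locations, and voting rule are invented for the example.

```python
# Toy illustration of the idea behind the MIT experiment, not CSAIL's model:
# one simulated agent infers a companion's goal from its moves, then helps.

from collections import Counter

GOALS = {"water_plant": (2, 3), "fetch_cup": (5, 1)}  # assumed goal locations

def infer_goal(companion_positions: list[tuple[int, int]]) -> str:
    """Guess which goal the companion is heading toward by counting which
    goal location its successive positions move closer to most often."""
    votes = Counter()
    for (x0, y0), (x1, y1) in zip(companion_positions, companion_positions[1:]):
        for goal, (gx, gy) in GOALS.items():
            before = abs(gx - x0) + abs(gy - y0)   # Manhattan distance before the move
            after = abs(gx - x1) + abs(gy - y1)    # Manhattan distance after the move
            if after < before:
                votes[goal] += 1
    return votes.most_common(1)[0][0]

if __name__ == "__main__":
    observed = [(0, 0), (1, 1), (2, 1), (2, 2)]    # positions drifting toward (2, 3)
    goal = infer_goal(observed)
    print(f"Helper robot infers companion's goal: {goal}, and moves to assist there.")
```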
 
“I feel like this is the first very serious attempt for understanding what it means for humans and machines to interact socially,” said Boris Katz, principal research scientist and head of the InfoLab Group at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL).
 
The researchers also showed that their model produces realistic and predictable social interactions. When they played videos of the simulated robots interacting for human viewers, the viewers mostly agreed with the model about what type of social behavior was occurring, Katz said.
 

Safe Pedestrian Crossing

 
Pedestrians following a walk sign to cross the street may be flustered to see a stopped vehicle with no driver. An AV doesn’t wave you across an unmarked intersection, for instance.

 
Researchers at the University of Michigan are studying how different driving behaviors affect pedestrians’ trust in autonomous vehicle systems, said Lionel Robert, Ph.D., a core scientist in the university’s Robotics Institute, of which Yang is also a member.
 
They want pedestrians to feel safe—but not too safe—around driverless vehicles.
 
The researchers wanted to pinpoint an optimum level of pedestrian trust, one that makes pedestrians feel comfortable crossing roads but not so comfortable as to encourage risky behavior. Future automated cars could adjust their driving behavior to strike that balance.
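
As a loose illustration of that calibration idea, and not the Michigan study’s model, the sketch below adjusts a simulated vehicle’s approach speed to keep an estimated pedestrian-trust score inside a target band; the trust dynamics, band, and speed values are invented.

```python
# Illustrative sketch of trust calibration, not the Michigan study's model.
# The trust dynamics, target band, and speeds below are invented values.

TARGET_TRUST_LOW, TARGET_TRUST_HIGH = 0.6, 0.8   # assumed comfort band

def update_trust(trust: float, approach_speed_mps: float) -> float:
    """Toy model: slow, defensive approaches raise trust; aggressive ones lower it."""
    if approach_speed_mps <= 4.0:        # defensive crawl toward the crosswalk
        trust += 0.05
    elif approach_speed_mps >= 8.0:      # aggressive approach
        trust -= 0.10
    return min(max(trust, 0.0), 1.0)

def choose_speed(trust: float) -> float:
    """Pick an approach speed that nudges estimated trust back into the band."""
    if trust < TARGET_TRUST_LOW:
        return 3.0    # drive more defensively to rebuild trust
    if trust > TARGET_TRUST_HIGH:
        return 6.5    # drive a bit more assertively to discourage risky crossing
    return 5.0        # hold a neutral approach speed

if __name__ == "__main__":
    trust = 0.5
    for step in range(5):
        speed = choose_speed(trust)
        trust = update_trust(trust, speed)
        print(f"step {step}: approach speed {speed:.1f} m/s, estimated trust {trust:.2f}")
```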
 
To find that level, study participants walked through a virtual-reality environment on an omnidirectional treadmill, a device typically used for gaming that allows walking in any direction, Robert said.
 
“What we found, unsurprisingly, is that pedestrians trusted vehicles that adopted the slower, more defensive driving style,” he said.
 
Aggressive driving, on the other hand, lowered trust ratings.
 
Bottom line: researchers are working to update an old axiom. If you can’t trust a robot, who, or what, can you trust?
 
Jean Thilmany is a science and technology writer based in Saint Paul, Minn.
