When Drones Display Emotions

Drones may gain wider acceptance if fitted with human-like facial features.
You may not feel drones really “get” you. And why should they? Their role is to take aerial photographs, help locate wildfires, or, one day in the future, deliver you a pizza.

But will the drone seem happy to hover at your door while you take the box? People who think so are more likely to reorder and less likely to jump back in amazement when a delivery drone passes them on the street, say researchers at Ben-Gurion University of the Negev (BGU) in Beersheba, Israel.

That pizza-delivery day isn’t so far in the future, and drones will be doing much more than flying in with dinner. They’ll also be swooping in on construction sites with just the right tool and gathering geological data for mine location, among many other jobs.


Drones are rapidly populating human spaces, yet little is known about how these flying robots are perceived and understood by humans.

Recent work suggests that their acceptance is predicated on their sociability, said Jessica Cauchard, an assistant professor of human-computer interaction in the BGU Department of Industrial Engineering and Management and one of the project's main researchers.

With that in mind, the researchers wanted to discover how people around drones would come to accept them as commonplace. Acceptance would bring with it human-drone interaction that happens easily and with confidence, Cauchard added.

But the researchers also knew that humans likely perceive drones differently from rolling or walking robots, which stay on the ground and have had a longer time to become familiar.

“There’s a lack of research on how drones are perceived and understood by humans,” Cauchard said.
 

Second that Emotion


She and fellow researchers in the BGU Magic Lab, who study computer-human interaction, recently showed that people can recognize different emotions depicted on the face of a drone and can tell just how strongly the drone “feels” that emotion, Cauchard said.

The researchers conducted two studies using a set of facial expressions intended to convey the drones' basic emotions; that is, the emotions the researchers hoped they had depicted on the drone's face.


For the study, they applied four facial features to each drone: eyes, eyebrows, pupils, and mouth. The facial images looked like those we might attribute to human-looking robots: a simplistic face with large eyes and a mouth depicted mostly as a line, a smile, or a frown.

In fact, the researchers said they leveraged design practices from ground robotics to create a set of rendered robotic faces that convey basic emotions.

The results showed that the five different emotions of joy, sadness, fear, anger, and surprise can be recognized with high accuracy in a picture of a drone with a particular arrangement of its features. In videos, the four emotions of joy, surprise, sadness, and anger were recognized. Disgust was the only emotion that was poorly recognized, Cauchard said.
 

Ranking Feelings


What’s more, participants showed some accuracy in ranking the intensity of an emotion. For example, was the scared-looking drone showing apprehension, fear, or terror?

Research subjects, for instance, associated wide-open, rounded eyes and a smiling mouth with what they considered a happy drone. They felt an angry drone had a downturned mouth and eyebrows that sloped downward toward the nose.
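As a rough illustration only, and not the BGU team's actual software, that kind of emotion-to-feature mapping could be sketched in a few lines of Python. Every name below is hypothetical, and only the joy and anger entries draw on details reported in this article; the remaining emotions would need their own configurations.

# Hypothetical sketch, not the researchers' code: a minimal mapping from
# emotion labels to the four facial features the study manipulated
# (eyes, eyebrows, pupils, mouth).
from dataclasses import dataclass

@dataclass
class DroneFace:
    eyes: str      # eye shape, e.g., "wide-open and rounded"
    eyebrows: str  # eyebrow slope, e.g., "sloped down toward the nose"
    pupils: str    # pupil placement, e.g., "centered"
    mouth: str     # mouth shape, e.g., "smile", "downturned", "flat line"

EMOTION_FACES = {
    # Eyes and mouth for joy, and eyebrows and mouth for anger, follow the
    # article's descriptions; the other fields are assumed placeholders.
    "joy": DroneFace(eyes="wide-open and rounded", eyebrows="neutral",
                     pupils="centered", mouth="smile"),
    "anger": DroneFace(eyes="narrowed", eyebrows="sloped down toward the nose",
                       pupils="centered", mouth="downturned"),
}

def describe(emotion: str) -> str:
    """Summarize the rendered face associated with an emotion label."""
    face = EMOTION_FACES[emotion]
    return (f"{emotion}: eyes {face.eyes}; eyebrows {face.eyebrows}; "
            f"pupils {face.pupils}; mouth {face.mouth}")

if __name__ == "__main__":
    for label in EMOTION_FACES:
        print(describe(label))

A working system would go further, rendering these feature combinations graphically on the drone and animating transitions between them, which this sketch does not attempt.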


“Participants were further affected by the drone and presented different responses, including empathy, depending on the drone's emotion,” Cauchard said. “Surprisingly, participants created narratives around the drone's emotional states and included themselves in these scenarios.”

For instance, one participant said, “I feel it looks like it recognizes me and wants to say hi.” This person may feel warm and friendly toward the drone and be willing to share their surroundings with it.

Another participant thought, “I feel kind of bad for the drone since it looks so sad. That makes me want to help it.”

Cauchard became part of the research group because she’s long been interested in how people will interact with technologies in the future. Though her main research areas have been human-computer and human-robot interaction, she is “particularly interested in how autonomous devices, such as cars, robots, and drones, will affect our lives and how we can make sure they are built in ways that are acceptable to us all.”

As her research shows, humans see drones with faces as drones with feelings. And our feelings toward drones go a long way toward making us comfortable with them, perhaps even comfortable enough to share ever more of our world with them.

Jean Thilmany writes about engineering and technology in Saint Paul, Minn.
 
