Insight from Human Sight
Researchers have developed an apparatus that records full-body motion-capture and eye-tracking data from people walking over rocky terrain. Image: Michelle Chiou/UT Austin

Machines struggle when faced with unpredictable conditions, and improving their image-processing capabilities could help them navigate complex environments. One research group is leveraging human vision to sharpen machine sensing by mimicking how the human eye and brain process images.

When hiking a rocky trail, humans maintain a sight line ahead of each step. Machines don't have those instincts. To improve robotic trajectories, a team at the University of Texas at Austin is studying how humans use vision to traverse rough terrain, using a full-body suit that carries eye trackers and 17 motion-capture sensors.

Jonathan Matthis’s research combines new motion-capture and eye-tracking technologies to determine what is going on in the brain while we walk. Image: UT Austin

“If we could understand how humans move with the kind of precision and grace that we do through natural environments, that would help us design artificial systems that can approximate that type of control,” said postdoctoral scholar Jonathan Matthis, who developed the system.

Before working on this project, Matthis studied locomotion by using multiple cameras to track reflective dots on the bodies of volunteers. When mobile technology brought a wave of smaller, cheaper sensors, Matthis saw an opportunity to do experiments in a natural outdoor environment.


To measure full-body kinematics and eye motion, Matthis wove together off-the-shelf sensors. The motion-capture sensors combine an accelerometer, gyroscope, and magnetometer to collect three-axis data on the suit-wearer's movement. An infrared-illuminated eye-tracking device uses two cameras to follow pupil motion.
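
The article doesn't say how the suit turns those raw signals into orientation; commercial motion-capture sensors typically run a proprietary Kalman or complementary filter onboard. As a rough illustration of the idea, here is a minimal complementary-filter sketch in Python (a hypothetical example, not the team's code), blending smooth-but-drifting gyroscope integration with noisy-but-absolute accelerometer tilt:

```python
import numpy as np

def complementary_filter(accel, gyro, dt, alpha=0.98):
    """Estimate pitch and roll by fusing accelerometer and gyroscope data.

    Assumes x-forward, y-right, z-down body axes.
    accel: (N, 3) accelerations in g
    gyro:  (N, 3) angular rates in rad/s
    dt:    sample interval in seconds
    alpha: weight on the gyro path (high-pass) vs. the accel path (low-pass)
    """
    pitch = roll = 0.0
    estimates = []
    for a, w in zip(accel, gyro):
        # Accelerometer tilt: an absolute reference, but corrupted by motion.
        accel_pitch = np.arctan2(-a[0], np.hypot(a[1], a[2]))
        accel_roll = np.arctan2(a[1], a[2])
        # Gyro integration is smooth but drifts over time; blend the two.
        pitch = alpha * (pitch + w[1] * dt) + (1 - alpha) * accel_pitch
        roll = alpha * (roll + w[0] * dt) + (1 - alpha) * accel_roll
        estimates.append((pitch, roll))
    return np.array(estimates)
```

The magnetometer plays the analogous role for heading, supplying an absolute yaw reference that gyro integration alone cannot provide.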

It took some engineering to get the system to work. Eye trackers typically use infrared light to follow pupil motion because it works with both dark and light irises. That is fine indoors, but outdoors the sun's infrared rays overwhelm the sensors.

To let in visible light but keep out IR wavelengths, Matthis settled on a welding screen, a full-face green plastic visor that shields the eye-tracking sensor without restricting a subject’s field of view.

Calibrating 2D eye-tracking data for a 3D experiment was a challenge that took Matthis into uncharted territory. To do it, he leveraged a human reflex called the vestibulo-ocular reflex, which works something like Newton's third law: when a person moves their head while focusing on a given object, their eyes rotate in the opposite direction, compensating to keep the object in view.
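
In geometric terms (a toy sketch of the reflex, not the study's code): if the eyes stay locked on a target, the eye-in-head direction is simply the target direction re-expressed in the rotated head frame, so the eye angle mirrors the head angle:

```python
import numpy as np

def rotation_z(theta):
    """Rotation matrix for a head turn of theta radians about the vertical."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# A fixation target 3 m straight ahead of the walker.
target_world = np.array([3.0, 0.0, 0.0])

for head_yaw_deg in (0, 10, 20):
    R_head = rotation_z(np.radians(head_yaw_deg))
    # To keep fixating, the eye-in-head direction must be the target
    # re-expressed in head coordinates: the eye counter-rotates.
    target_in_head = R_head.T @ target_world
    eye_yaw_deg = np.degrees(np.arctan2(target_in_head[1], target_in_head[0]))
    print(f"head {head_yaw_deg:+3d} deg -> eye {eye_yaw_deg:+5.1f} deg")
```

A +10-degree head turn produces a -10-degree eye rotation, which is exactly the compensation the reflex supplies.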

By having a volunteer focus on a fixed point while moving their head, Matthis could map 2D eye movements onto the 3D environment using the combined eye-tracking and head-motion data.
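
The article doesn't give the math, so the following is an assumed but common approach rather than Matthis's actual method: regress gaze angles onto 2D pupil coordinates, using the known fixation point and the motion-capture head pose to compute ground-truth angles for each sample. A minimal least-squares sketch (all names hypothetical):

```python
import numpy as np

def fit_gaze_map(pupil_xy, gaze_angles):
    """Fit a quadratic polynomial map from 2D pupil position to
    (azimuth, elevation) gaze angles in the head frame.

    pupil_xy:    (N, 2) pupil centers from the eye camera
    gaze_angles: (N, 2) ground-truth angles, computed from the known
                 fixation point and the motion-capture head pose
    """
    x, y = pupil_xy[:, 0], pupil_xy[:, 1]
    # Quadratic feature basis, a common choice for eye-tracker calibration.
    A = np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])
    coeffs, *_ = np.linalg.lstsq(A, gaze_angles, rcond=None)
    return coeffs

def predict_gaze(coeffs, pupil_xy):
    """Apply a fitted calibration map to new pupil positions."""
    x, y = pupil_xy[:, 0], pupil_xy[:, 1]
    A = np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])
    return A @ coeffs

# Sanity check with synthetic data: a purely linear eye response.
rng = np.random.default_rng(0)
pupil = rng.uniform(-1, 1, size=(50, 2))
truth = pupil * [30.0, 20.0]  # 30 deg/unit azimuth, 20 deg/unit elevation
c = fit_gaze_map(pupil, truth)
print(np.allclose(predict_gaze(c, pupil), truth))  # True
```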

Among his findings: humans look two strides ahead on medium terrain and look at the ground more than 90 percent of the time on rugged paths. In both cases, they consistently look about 1.5 seconds ahead of their current position.
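
One simple way to arrive at such a look-ahead time (a sketch under stated assumptions: flat ground at z = 0, walking along the x-axis; not the study's analysis code) is to intersect the gaze ray with the ground and divide the distance ahead by walking speed:

```python
import numpy as np

def lookahead_time(eye_pos, gaze_dir, walk_speed):
    """Time until the walker reaches the point they are looking at.

    eye_pos:    (3,) eye position in meters, z up
    gaze_dir:   (3,) unit gaze direction in world coordinates
    walk_speed: forward walking speed in m/s
    Returns None if the gaze ray never hits the ground plane (z = 0).
    """
    if gaze_dir[2] >= 0:           # looking at or above the horizon
        return None
    t = -eye_pos[2] / gaze_dir[2]  # ray-plane intersection parameter
    ground_point = eye_pos + t * gaze_dir
    distance_ahead = ground_point[0] - eye_pos[0]  # walking along +x
    return distance_ahead / walk_speed

# Example: eyes 1.6 m up, gaze pitched down 30 degrees, walking 1.3 m/s.
gaze = np.array([np.cos(np.radians(-30)), 0.0, np.sin(np.radians(-30))])
print(lookahead_time(np.array([0.0, 0.0, 1.6]), gaze, 1.3))  # ~2.1 s
```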

Next, Matthis plans to study how visual deficits affect motion. He hopes to work with new computer algorithms to elicit more granular vision data, watching exactly what cues subjects use to decide where to step next.

