When Robots Interact with Humans

In the "old days" humans directed robots. Today, however, robots can direct humans on the production line, boosting production by as much as 900%.

The key to these impressive gains is reducing set-up time, especially for new products and small-batch production, which often require a disproportionate amount of time for setting up automated assembly tasks.

Set-up for assembly automation involves assembly planning, fixture tool selection and positioning, and part loading. Even in highly automated manufacturing systems, setting these up is typically expensive and time-consuming, with skilled operators performing laborious manual tasks.

Now a research team at the Robotics Institute at Carnegie Mellon University in Pittsburgh and the Department of Mechanical and Manufacturing Engineering at Aalborg University in Denmark has designed and implemented an automation tool that radically reduces this time.

"Using augmented reality, our automation tool provides the robot with the ability to provide a human operator with precise and convenient instructions, while at the same time allowing the robot to read information supplied by the human operator's actions," says principal systems scientist David Bourne of the Robotics Institute. "In this way, a complex set-up task can be collaboratively executed, while allowing both the robot and the human to do what each does best."

Human-Robot Interaction

The demand for customer product diversity, along with shorter product life-cycles, has led to an increased focus on bringing the skills of human operators (flexibility, adaptability) together with the skills of robots (efficiency, repeatability, speed). This emerging area of hybrid manufacturing systems has especially focused on human-robot interaction (HRI) and collaboration.

The robotic hardware consists of a six-axis robotic arm equipped with an electric gripper. Image: Carnegie Mellon University

"Everyone knows robots handle parts," says Bourne. "But in small-batch runs we actually have the robots setting up the process. So many objects have to be grasped, even though they are easy for people to pick up, they must be placed in a very specific place and position for the robots to grip them. Augmented reality allows the robots to show the operator where to put parts and fixtures, with the positions outlined in laser light."

The robot must have the ability to understand and respond to environmental cues in the workspace. By reading the environment and human intent by means of active vision, the robot can send information to the human through augmented reality (a real-time view of the local work environment that is enhanced by computer-generated sensory input).
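As a rough illustration of the "reading" half of this loop, the sketch below shows one simple way a fixed camera could confirm that an operator has placed a part inside a projected outline, by differencing frames taken before and after the action. It is written in Python with OpenCV; the function name, region format, and thresholds are assumptions for illustration, not details of the CMU system.

```python
import numpy as np
import cv2

def part_placed(before, after, outline_px, min_changed_fraction=0.3):
    """Rough check that something now occupies the projected outline.

    before/after: grayscale camera frames captured before and after the
    operator acts; outline_px: the target region's corners in camera pixels.
    All thresholds are illustrative and would need tuning on a real cell.
    """
    # Rasterize the outlined region into a binary mask.
    mask = np.zeros(before.shape, dtype=np.uint8)
    cv2.fillPoly(mask, [np.asarray(outline_px, dtype=np.int32)], 255)

    # Count pixels inside the region that changed appreciably.
    diff = cv2.absdiff(before, after)
    changed = (diff > 25) & (mask > 0)
    region_pixels = max(int(mask.sum()) // 255, 1)
    return changed.sum() / region_pixels >= min_changed_fraction
```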

The automation tool that Bourne and colleague Dan Gadensgaard developed both displays and "sees" visual information by augmenting reality with complex laser displays and, at the same time, capturing these visual images. "Our system allows laser displays to be registered in the real world so that the projective displays can provide precise 'pointing data' as well as embedded information," says Bourne.
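A standard way to register a planar projected display to the real world is a homography between the work surface and the projector image. The minimal sketch below, in Python with OpenCV, shows the idea; the calibration correspondences, table dimensions, and projector resolution are illustrative assumptions, not the team's actual calibration.

```python
import numpy as np
import cv2

# Hypothetical calibration: projector pixels that were observed (via the
# camera) to land on four known points of the work table.
# Units: projector pixels <-> table millimeters. All values are illustrative.
proj_px = np.array([[100, 100], [1180, 120], [1160, 700], [90, 680]], dtype=np.float32)
table_mm = np.array([[0, 0], [900, 0], [900, 500], [0, 500]], dtype=np.float32)

# Homography mapping table coordinates (mm) to projector pixels, valid for a
# planar surface. A laser projector stays in focus at any working depth.
H, _ = cv2.findHomography(table_mm, proj_px)

def point_at(xy_mm):
    """Return the projector pixel that illuminates a given table location."""
    pt = np.array([[xy_mm]], dtype=np.float32)   # shape (1, 1, 2)
    return cv2.perspectiveTransform(pt, H)[0, 0]

# Draw a fixture outline into the projector frame buffer at a target location.
frame = np.zeros((720, 1280, 3), dtype=np.uint8)
corners_mm = [(400, 200), (500, 200), (500, 280), (400, 280)]
corners_px = np.array([point_at(c) for c in corners_mm], dtype=np.int32)
cv2.polylines(frame, [corners_px], isClosed=True, color=(0, 255, 0), thickness=2)
```

Once the homography is known, any table location can be annotated, so the same machinery that outlines a fixture position can also project arrows or text for the operator.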

The augmented reality tool is composed of a small laser-projector and a smartphone with a camera in a special fixture. Image: Carnegie Mellon University

The robotic hardware consists of a six-axis robotic arm equipped with an electric gripper and a steel table as a foundation for varied configurations of fixtures, sensors, and tools. Two identical augmented reality tools (smartphone, laser-based projector-camera system, and special fixture mount) are mounted at two different locations.

"The smartphone requires several special features, including programmable graphics, video output, wireless connectivity, and a camera with wireless video streaming," says Bourne. "The projector needs to be laser-based so that it is focus-free and only directly illuminates the graphics. The smartphone in turn streams data to a high-end PC suitable for doing real time computer vision and robot control."

The projector-camera system displays information directly onto objects in an unconstrained environment. Integrating this kind of functionality into a robotic work cell has the advantage that the operator is able to maintain focus on the working environment while receiving information and instructions visually.

Future Possibilities

Bourne tested his automation tool by making a space frame for a U.S. Department of Defense vehicle. Normally it takes 89 hours to weld the space frame together; with one graduate student and the automation tool, Bourne was able to weld the space frame in only 10 hours.

"This is a huge savings in time," says Bourne. "Now, for example, the military has the capability to make a customized vehicle in about a tenth of the time."

Continued miniaturization of the key components in Bourne's automation tool, especially the full-color laser projection, wireless computing, and video streaming, will expand the range of augmented reality tasks that can be accomplished. Adding these functions to automated robotic systems will enhance the ways humans and robots can collaborate on complex tasks without requiring excessive programming. This is especially important in manufacturing, where the time to set up machines is a major production cost, particularly for ever-smaller batches of products with shorter life-cycles.

Perhaps the biggest surprise for Bourne was how excited his graduate student was with the results.

"He was incredibly happy," says Bourne. "He had a big smile on his face. That's when I realized that this technology can make people happy. More contentment in the workplace translates into higher productivity and lower turnover."

Mark Crawford is an independent writer.

"Using augmented reality, robots can direct humans to set up the production line, boosting efficiency by as much as 900 percent."
- David Bourne, principal systems scientist, Robotics Institute, Carnegie Mellon University
