Robot Turns Artist's Commands to Graffiti

Graduate students at Georgia Tech devised a bot that mimics and magnifies human hand motions to produce art.
Gerry Chen has focused on getting robots to do some of the most challenging human tasks. From that perspective, the doctoral candidate at Georgia Institute of Technology believes artistic endeavors, with their complex hand and arm movements, are an obvious target for robots to tackle.
 
Chen is part of a team of researchers at the school that developed GTGraffiti, a cable-driven robot system that translates a human artist's drawing motions into painted graffiti.
 
The bubble letters that form graffiti are a good starting point for robots to mimic because, unlike complex paintings, there is only a finite number of shapes (26) that the robot needs to learn to reproduce. “A mural might be made of a thousand different unique strokes,” Chen said. “If you’re painting a portrait there might be shading and other advanced techniques involved, so it gets complicated quickly.” Bubble letters, on the other hand, require only outlines; the solid-colored insides can be painted in any pattern.

 
While a number of robot candidates might fit the bill, Chen and his team chose a cable-driven system, which runs on a set of cables, motors, and pulleys, because of its ability to scale to large sizes. “There’s not that many robots that can realistically paint the side of a building,” Chen said. He points out that drones would work too, but they are harder to control. “I wanted to work on the artistic part, not the control part,” Chen said.
 

Translating Motions at Scale

 
The first go-around of the robot involved asynchronous mimicry between the painter and the robot. Data generated from two artists painting the alphabet in graffiti style were analyzed for speed, size, and direction. The researchers converted this data into electrical signals and created a library for each letter.
 
Georgia Tech grad students Michael Qian and Gerry Chen with artwork completed by the GTGraffiti robot. Image: Georgia Tech
Depending on the word the team needs painted, the necessary letter configurations can be retrieved from the library and translated into executable motor commands.
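The letter library described above can be pictured as a simple lookup table from letters to recorded strokes. The sketch below is purely illustrative; the function names and stroke format are assumptions, not the team's actual code.

```python
# Hypothetical per-letter stroke library: each letter maps to one or
# more recorded stroke sets (here, lists of (x, y, t) samples).
library = {}

def record(letter, strokes):
    """Store one artist's recorded strokes for a letter."""
    library.setdefault(letter.upper(), []).append(strokes)

def plan_word(word):
    """Look up one recorded variant per letter of the word to paint."""
    return [library[ch][0] for ch in word.upper()]
```

Planning a word then reduces to concatenating the stored stroke sets for each of its letters, which can in turn be converted to motor commands.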
 
The newest version of GTGraffiti is more synchronous with an artist: An artist draws the graffiti letters on an iPad, which the robot reproduces in (near) real-time. Such a process is challenging, Chen says, because a human artist can traverse the entire length of the mobile device very quickly but the robot cannot.

“The artist gets confused because he makes a circle and then it might take the robot five seconds to catch up,” Chen said.
 
Scale is also a problem when translating art from a mobile device. “Because the iPad is small and the cable robot is big, you have to scale everything by [a factor of] fifty,” Chen said, “but if you scale the length you’re also scaling the velocity so you have to do something to slow it down.”
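One common way to handle that velocity problem is to scale the positions but stretch each segment's duration so the end-effector speed stays within what the motors can track. A minimal sketch, with the speed limit and function names as assumptions:

```python
# Hypothetical sketch: scale an iPad stroke up by the article's
# factor of fifty while capping the resulting speed.
SCALE = 50.0   # iPad-to-wall scale factor mentioned in the article
V_MAX = 0.5    # assumed maximum end-effector speed, m/s

def rescale_stroke(points, timestamps):
    """Scale positions by SCALE; stretch time so speed stays under V_MAX."""
    scaled = [(x * SCALE, y * SCALE) for x, y in points]
    new_times = [timestamps[0]]
    for i in range(1, len(scaled)):
        dx = scaled[i][0] - scaled[i - 1][0]
        dy = scaled[i][1] - scaled[i - 1][1]
        dist = (dx * dx + dy * dy) ** 0.5
        dt = timestamps[i] - timestamps[i - 1]
        # Lengthen this segment's duration if the scaled speed is too high.
        new_times.append(new_times[-1] + max(dt, dist / V_MAX))
    return scaled, new_times
```

Stretching time this way is exactly the lag Chen describes: a stroke the artist finishes in a fraction of a second can take the robot several seconds to trace.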

 
The stylus on the mobile device can also produce jitter, which has to be filtered out. An additional challenge: The cable robot has four cables but only two degrees of freedom, leading to over-actuation. “So if we just command the cable lengths or the motor positions then they might pull against each other and tear each other apart,” Chen said. The problem is resolved with high-frequency force control.
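The over-actuation can be seen from the geometry: all four cable lengths are functions of just two coordinates, so they cannot be commanded independently. A sketch of that inverse kinematics, with the anchor layout and dimensions as assumptions:

```python
import math

# Hypothetical anchor layout: four pulleys at the corners of a
# width x height painting area (dimensions are assumptions).
WIDTH, HEIGHT = 9.0, 3.0
ANCHORS = [(0.0, 0.0), (WIDTH, 0.0), (0.0, HEIGHT), (WIDTH, HEIGHT)]

def cable_lengths(x, y):
    """Return the four cable lengths for one (x, y) end-effector pose.

    Because all four lengths depend on only two coordinates, a small
    inconsistency between commanded lengths makes the cables fight
    each other -- which is why the system closes the loop on force
    rather than on raw cable length.
    """
    return [math.hypot(x - ax, y - ay) for ax, ay in ANCHORS]
```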
 
Wires support and move the robot across the surface while it paints. Image: Georgia Tech
A computer interprets commands from the mobile device and forwards them to a low-level microcontroller that relays position and velocity commands to the robot. The computer runs a Python server that receives pen strokes from the mobile device and performs necessary adjustments, such as smoothing, tracking velocity, and factoring in the past trajectory.
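The kind of smoothing such a server might apply to jittery stylus samples can be sketched as a short moving-average filter; the class name and window size here are assumptions, not the team's implementation.

```python
from collections import deque

# Hypothetical smoothing stage: average the last few stylus samples
# to suppress jitter before commanding the robot.
class StrokeSmoother:
    def __init__(self, window=5):
        self.buf = deque(maxlen=window)

    def add(self, x, y):
        """Push a raw stylus sample; return the moving-average point."""
        self.buf.append((x, y))
        n = len(self.buf)
        return (sum(p[0] for p in self.buf) / n,
                sum(p[1] for p in self.buf) / n)
```

A wider window filters more jitter but adds lag, which compounds the catch-up delay the artist already experiences.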
 
Chen expects the team to keep working on adjusting the competing forces and “transforming the motion” of the robotic system so robot and artist can have a more seamless interaction. “We don’t want the artist to draw something and then have the robot paint it after the fact,” Chen said.

“We want them to be working together. We want the artist to draw something, then see what happens on the robot,” he added. “Watching this play out might give the artist new ideas as they’re completing the painting; their vision evolves.”
 

Not Just for Tagging Buildings

 
Robot hand and arm movement is also a promising avenue of research in many operations, including warehouse sorting and manufacturing. The kind of spray painting that GTGraffiti delivers doesn’t need much tactile feedback, but brush painting, which is also a topic of research at the Georgia Tech lab, does.

 
The graffiti art project has other significance for the field of robotics, Chen says. It seeks to answer the question: How can you work with motions that might be coarse and full of noise and transform them into motions that are smooth and suitable for the task you’re trying to solve?
 
“Trying to tackle these art challenges is one of the ways that we can really try to push the boundaries of robotics and see what it can do,” Chen said.

Poornima Apte is an independent technology writer in Boston, Mass.
