iPhone Apps for Robot Control
Sep 19, 2011
by Chitra Sethi, Managing Editor, ASME.org
We have seen thousands of iPhone applications and can expect to see even more as Apple marches closer to its fall release of the iPhone 5. There are multiple apps available for the iPhone, iPad, or iPod to remote-control gadgets such as your computer, DVR, digital camera, or even your car. But have you ever imagined controlling a robot in a manufacturing plant using an iPhone?
Dr. Vikram Kapila, professor of mechanical engineering at the Polytechnic Institute of New York University, and his students are developing apps that allow you to control robots with just a tap or flick of a finger.
Currently in the usability testing stage, the iLabArm and iLabBot are part of a project started over two years ago to monitor, command, and control a variety of laboratory testbeds using an iPod, iPhone, or iPad over an ad hoc Wi-Fi network. "Our goal was to take advantage of an iPhone's or iPod's touchscreen to create an intuitive interface for interacting with physical devices such as robotic manipulators, mobile robots, and also our own control lab experiments," says Dr. Kapila.
The iLabArm uses not only the touch features of the iPhone but also its embedded sensors—the accelerometer and gyroscope—to move the robotic arm. As the hand is tilted forward and backward, the iPhone's accelerometer data is used to command the wrist joint to tilt forward and backward, respectively (see videos). The iLabBot controls a mobile robot (Qbot, based on iRobot's Create platform). As the user touches a specific area on the iPod screen, the Qbot in the real world is commanded to move to the corresponding location.
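The tilt-to-joint mapping described above can be sketched in a few lines. This is an illustrative Python sketch, not the team's actual code: it assumes the wrist command is simply the phone's pitch angle, estimated from the gravity components of the accelerometer reading and clamped to the joint's travel. The function name and the ±45° limit are assumptions.

```python
import math

def tilt_to_wrist_command(accel_x, accel_y, accel_z, max_deg=45.0):
    """Map the phone's tilt (pitch) to a wrist-joint angle in degrees.

    The accelerometer reports gravity components when the phone is held
    roughly still; pitch is recovered from those components, then clamped
    to the joint's assumed +/-45 degree travel.
    """
    pitch = math.degrees(math.atan2(-accel_x,
                                    math.sqrt(accel_y**2 + accel_z**2)))
    return max(-max_deg, min(max_deg, pitch))
```

In a real app the raw accelerometer samples would also be low-pass filtered before this mapping, so that hand tremor does not jitter the joint.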
Command, Control, and Monitor
When Dr. Kapila and his team started the project in the summer of 2009, the hardware wasn't available in a small footprint and the software features on the iPhone and iPod were not openly available. One of his research students, Jared Alan Frank, addressed the problem by using the Open Sound Control (OSC) protocol. OSC allows a user to interface computers with electronic musical devices such as synthesizers. "The idea was to send the OSC command from the iPhone or iPod to a MAKE microcontroller, which would then interpret it to command motors [instead of playing music] and other activities of the robot to control its behavior," says Frank.
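OSC made this approach practical because its wire format is simple enough for a small microcontroller to parse: an address pattern string, a type-tag string, and big-endian binary arguments, with each string null-terminated and padded to a 4-byte boundary. A minimal Python encoder illustrates the packet layout; the `/motor/left` address and its argument are hypothetical, not taken from the team's protocol:

```python
import struct

def osc_string(s: str) -> bytes:
    """Encode an OSC string: ASCII, null-terminated, padded to 4 bytes."""
    b = s.encode("ascii") + b"\x00"
    while len(b) % 4:
        b += b"\x00"
    return b

def osc_message(address: str, *args) -> bytes:
    """Build a single OSC message with int32 and float32 arguments."""
    tags = ","          # type-tag string always starts with a comma
    payload = b""
    for a in args:
        if isinstance(a, int):
            tags += "i"
            payload += struct.pack(">i", a)   # big-endian int32
        elif isinstance(a, float):
            tags += "f"
            payload += struct.pack(">f", a)   # big-endian float32
        else:
            raise TypeError(f"unsupported OSC argument: {a!r}")
    return osc_string(address) + osc_string(tags) + payload

# Hypothetical command: set the left motor's speed to 128.
packet = osc_message("/motor/left", 128)
```

On the phone, such a packet would be sent as a UDP datagram over the ad hoc Wi-Fi link; on the receiving side, the microcontroller splits the packet at the padded strings, checks the address, and unpacks the arguments with the same big-endian layout.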
The experiments took several months and attempts during which Frank worked with high-school teachers under NYU Poly's Research Experience for Teachers (RET) program, funded by the National Science Foundation. Part of the program required teachers to work in the lab over a six-week period to learn mechatronics and do research. "We first tried this with an off-the-shelf monster truck that you normally control through radio control and then we used iRobot's Create platform and controlled it through the iPhone," says Frank.
One of the key challenges of the project was that they weren't able to receive any messages from the robotic platform back on their iPhone app. "You don't just want to control the hardware, but also be able to monitor it," says Dr. Kapila. In the spring of 2011, they found hardware solutions that were more amenable to this development. Since then, the team has moved its development to Arduino microcontrollers interfaced with WiFly shields to create a local ad hoc wireless network. Both of these are smaller than a palm and are mounted one on top of the other. "That allowed us to make a more acceptable footprint of the hardware that one could easily connect to the robotic platform with a total cost of less than $100," says Dr. Kapila.
Using hardware and software that were already available, it took the team a short time to work through their software protocols and iPhone apps to not only command their robotic projects but also get information back from the projects onto the iPhone or iPod.
Robots in Real Life
The next step was to create an easy-to-use graphical user interface for the apps. "We made an interface that allows a user to control a robotic arm. As the user holding an iPhone tilts or turns his or her wrist, the iPhone's accelerometer data is used such that the robot's wrist mimics the wrist motion of the user (see videos). Moreover, as the user enacts a pinching action on the iPhone screen, the robot gripper is opened and closed," explained Frank. These apps can be used in the medical device field or in a manufacturing plant for a lab automation exercise to control robotic arms of any size.
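The pinch gesture maps naturally to a gripper command: the distance between the two touch points scales to how far the gripper is open. A sketch of that mapping, with illustrative pixel thresholds that are assumptions rather than values from the app:

```python
import math

def pinch_to_gripper(touch1, touch2, closed_px=40.0, open_px=200.0):
    """Map the distance between two touch points (in pixels) to a
    gripper opening fraction in [0, 1]. Thresholds are illustrative:
    at or below closed_px the gripper is fully closed, at or above
    open_px it is fully open.
    """
    d = math.hypot(touch1[0] - touch2[0], touch1[1] - touch2[1])
    frac = (d - closed_px) / (open_px - closed_px)
    return max(0.0, min(1.0, frac))  # clamp to [closed, open]
```

The resulting fraction could then be scaled to the gripper servo's command range and sent to the robot in the same way as the joint commands.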
Besides the engineering field, there's plenty of potential for such mobile apps to be used in the military, says Dr. Kapila. He says the U.S. military has a mandate that by 2025, 25% of its vehicular systems (air, water, and ground) must be robotic. "Imagine a soldier in a desert with a robot and heavy gear to control it in his/her backpack," he says. "The same soldier probably has an iPhone for keeping in touch or for listening to music. What if he/she could use the same device to control the robots?"
"Such tactile and intuitive interfaces are needed if we want robotics to become common in our society," says Dr. Kapila, who believes that one day robots will work alongside humans in workplaces, homes, grocery stores, museums, parks, etc. "If a housewife or a grocery store employee, who doesn't have a scientific or engineering background, is going to use robots every day, they have to be able to interact with these devices almost intuitively, just like the two- to three-year-old kids using iPhones and iPads these days."