Robots Modeled on Bees Sense Rather than Think
May 11, 2018
by Eric Butterman ASME.org
Many people run in fear from the common honeybee, but a joint project between Cornell University and Harvard University is embracing the insect. In particular, Silvia Ferrari, a professor of mechanical and aerospace engineering at Cornell and a member of the project, sees a great deal of robotic potential in the insects.
Ferrari says the project, which originated at Harvard and has the “RoboBee” as its centerpiece, comes from the work of people like Robert Wood, a professor of engineering and applied science at Harvard.
“He has been a part of a new way to fabricate robots through folding and unfolding technology, similar to origami but allowing for the ability to fabricate everything at once: the chips, the sensors, the actuators, and more, by basically stamping this device into flat-like panels,” she says. “Once it’s all fabricated, [the robot] unfolds all the pieces like origami and pops up with all the pieces in place.”
The computing power needed to truly mimic a bee would be immense, requiring far too large a system to fit into a small robot. So Ferrari's group has tried to add to the project by delving into neuromorphic systems built from CMOS and memristor circuits. The idea behind their efforts is to build chips that function more like biological brains: they need much less power, can be built at a smaller scale, and let sensors do their processing on far less data, she says.
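The event-driven processing behind such chips can be illustrated with a leaky integrate-and-fire neuron, the basic building block of many neuromorphic designs. The sketch below is a minimal illustration of that principle; the parameter values are invented for the example and are not from the RoboBee project.

```python
# Minimal sketch of a leaky integrate-and-fire (LIF) neuron. It only
# "spikes" (emits an event) when enough input accumulates, so downstream
# circuits work on sparse events instead of a continuous data stream --
# one reason neuromorphic hardware can use so little power.

def lif_neuron(inputs, leak=0.9, threshold=1.0):
    """Return a list of 0/1 spikes for a stream of input currents."""
    v = 0.0          # membrane potential
    spikes = []
    for current in inputs:
        v = v * leak + current   # integrate new input, leak old charge
        if v >= threshold:       # fire only when the threshold is crossed
            spikes.append(1)
            v = 0.0              # reset after the spike
        else:
            spikes.append(0)
    return spikes

# A weak steady input never fires; a brief gust-like burst does.
print(lif_neuron([0.2, 0.2, 0.2, 1.5, 0.2]))
```

Most of the time the neuron stays silent, which is the point: computation happens only when something changes in the input.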
“You just have too much power being used when it doesn’t have to be,” she says. “Robots right now have to ‘think’ about what to do, using all that processing. This way can allow them to avoid that. Look at the cockroach. It has hardly any brain and it’s very good at avoiding predators.”
Hair-like airflow sensors, similar to an insect's, detect the direction of airflow over the wings; one aim of the project is to make the RoboBee robust enough to fly through strong wind gusts, for example. The robot has two independently actuated wings, and onboard sensors also detect changes in luminosity.
For sensorimotor processing, the RoboBee carries not only conventional control systems but also sensor-processing algorithms, using both proprioceptive (responding to internal stimuli) and exteroceptive (responding to external stimuli) sensory feedback to control the robot. The group wants to focus on safe navigation and on tasks such as searching for a target, pursuing it, and following it, she says, which gives the robot numerous potential applications.
“Also, onboard data has to be processed and you see these colorful algorithms but they are very slow in producing the answers that can be used to actually move around,” she says. “Because of this, robots are very far off from the sensory motor skills and sensory motor learning that you see in humans and in animals.”
The group's next step is to further develop the control algorithms and work on acrobatic maneuvers, for example, flying through a moving opening such as a swinging window or door.
“Even for conventional robots, underwater vehicles, and drones, they have more power being used than should really be required for their missions,” she says. “Less computational power means possibly smaller batteries, smaller robots, and less onboard computing hardware—also letting more computation onboard be devoted to other tasks. There is the potential to have entirely different robots in the future.”
Eric Butterman is an independent writer.