With the Help of AI, Bipedal Robot Learns to Run

With a focus on using machine learning to optimize a robot’s gait, Oregon State University engineers set a Guinness World Record with a bipedal machine completing a 100-meter dash.
The skies, or maybe the stairs, are the limit for the team at Oregon State University’s (OSU) Dynamic Robotics Lab that took a systematic, machine learning approach to robotic gait optimization. Their work, published by the Institute of Electrical and Electronics Engineers as “Optimizing Bipedal Locomotion for The 100m Dash With Comparison to Human Running,” aimed to improve the speed of a bipedal robot by using AI to train it to be a better runner.

Initially, OSU professor Jonathan Hurst and his team used a 16-month, $1-million grant from the Defense Advanced Research Projects Agency (DARPA) to develop the robot Cassie. Several years later, Cassie broke the record at OSU’s Whyte Track and Field Center, completing the dash in 24.73 seconds, starting from and ending in a standing position, with no falls and no cameras or external sensors.

Robots, a mix of hardware and software, are limited by both, said Alan Fern, professor in the Collaborative Robotics and Intelligent Systems Institute at OSU’s Department of Electrical Engineering and Computer Science. He explained that the team had tried to control its robots with traditional analytical, numerical optimization techniques, and they got things to work, to some degree.

Agility Robotics launches Digit, a bipedal robot with a human form factor made for work. Photo: Agility Robotics
“But then, if you say, OK, I want this robot to skip, to hop, to run, to run fast, and to go up and down stairs, you will end up spending a lot of time working out the equations, which is really tedious,” he said. The team decided instead on a new approach: using AI to train Cassie.

Using reinforcement learning to get a machine to perform the way you want involves a neural network controller that takes input from the system itself, with no external sensory information; in this case, the inputs are the joint angles and the velocities of those joints. “We call it proprioceptive control input,” Fern explained.
 
Instead of working out all the equations and solving optimization problems for this controller, the team gave the neural network “an environment where we can give it a reward signal that tells it when it's doing something that we want, or closer to what we want, or it's not working at all. So if it falls down, we give it a negative reward, for example,” he said.
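In rough terms, a reward signal like the one Fern describes might look like the short Python sketch below. It is purely illustrative: the state fields, target speed, fall penalty, and effort weighting are assumptions made here for clarity, not details of the OSU team’s actual reward design.

```python
from dataclasses import dataclass, field
import numpy as np

@dataclass
class RobotState:
    """Hypothetical proprioceptive state: joint-derived quantities only, no cameras."""
    forward_velocity: float                 # m/s, estimated from joint encoders
    has_fallen: bool                        # whether the robot has tipped over
    joint_torques: np.ndarray = field(default_factory=lambda: np.zeros(10))

def reward(state: RobotState, target_speed: float = 2.0) -> float:
    """Toy reward: penalize falls, reward running near a target speed."""
    if state.has_fallen:
        return -10.0                        # falling earns a negative reward
    speed_error = abs(state.forward_velocity - target_speed)
    progress = float(np.exp(-speed_error))  # 1.0 when exactly on target speed
    effort = 0.01 * float(np.sum(np.square(state.joint_torques)))  # discourage wasted effort
    return progress - effort

print(reward(RobotState(forward_velocity=1.8, has_fallen=False)))   # near-target running
print(reward(RobotState(forward_velocity=0.0, has_fallen=True)))    # a fall
```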
 
The neural network trains for what can be millions of steps and eventually converges to a good controller. The team did the training in simulation, because a physical robot that falls thousands of times would break before it learned what it needed to do. The system can also run many simulations in parallel, which saves training time.
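The idea of stepping many simulator copies at once and feeding the results back into the controller can be sketched as in the hypothetical Python outline below. `ToySim` and `RandomPolicy` are stand-ins invented for this example; a real pipeline would use a physics simulator and an established reinforcement-learning update such as PPO in place of the empty `update` method.

```python
import numpy as np

class ToySim:
    """Stand-in for a physics simulator; real training would use a full dynamics engine."""
    def reset(self):
        self.t = 0
        return np.zeros(4)                       # fake proprioceptive observation
    def step(self, action):
        self.t += 1
        obs = np.random.randn(4)
        rew = -abs(float(action))                # toy reward, not the real one
        done = self.t >= 100                     # episode ends after a fixed horizon
        return obs, rew, done

class RandomPolicy:
    """Placeholder for the neural-network controller."""
    def act(self, obs):
        return float(np.random.uniform(-1.0, 1.0))
    def update(self, transitions):
        pass                                     # a real update would run PPO or similar

def train(policy, num_envs=8, updates=10, steps=50):
    envs = [ToySim() for _ in range(num_envs)]   # parallel simulator copies
    observations = [env.reset() for env in envs]
    for _ in range(updates):
        transitions = []
        for _ in range(steps):
            actions = [policy.act(o) for o in observations]
            next_observations = []
            for env, obs, act in zip(envs, observations, actions):
                nxt, rew, done = env.step(act)
                transitions.append((obs, act, rew, done))
                next_observations.append(env.reset() if done else nxt)
            observations = next_observations
        policy.update(transitions)               # one policy update per collected batch
    return policy

train(RandomPolicy())
```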
 
Because a robot trained in a fixed simulator with fixed physics parameters would not be robust in the real world, the team also uses randomization. Randomized factors include the robot’s center of mass and the friction coefficients. And, “we randomize the terrain as well,” Fern said. By responding to these randomized physics parameters during training, the robot learns to navigate them, so that out in the real world it can respond robustly to whatever conditions it encounters.
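Domain randomization of this kind is often implemented by resampling physics parameters at the start of each training episode, roughly as in the hypothetical snippet below. The parameter names and ranges are illustrative guesses, not the values the OSU team used.

```python
import random

def randomize_physics():
    """Sample fresh physics parameters for one training episode (illustrative ranges only)."""
    return {
        "com_offset_m": [random.uniform(-0.05, 0.05) for _ in range(3)],     # shift center of mass
        "friction": random.uniform(0.4, 1.2),                                # ground friction coefficient
        "terrain_height_m": [random.uniform(0.0, 0.03) for _ in range(20)],  # uneven ground profile
    }

# Reconfiguring the simulator with a new sample each episode keeps the
# learned controller from overfitting to one fixed set of physics.
print(randomize_physics())
```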
 

Cassie mechanics

 
Of course, the mechanics of the robot also need to perform, and that was the work of Hurst, a mechanical engineer and co-founder of Agility Robotics. Inspired by animal locomotion, he worked to design the best pair of robot legs he could to maximize mobility. Most robot legs are fully actuated: there is very little passive dynamics involved, so the motors are always engaged. Animal locomotion, by contrast, relies heavily on passive dynamics; there is a lot of springiness.
 
The team took this idea, a system that allows energy to be stored and released, and used it to its advantage. The design, which took years, ended up looking a lot like bird legs. Coupled with controllers, these mechanical legs can handle basic walking on flat terrain, much like a brain and body working together to move. The team recognizes that good software cannot overcome a bad physical design, so both are essential to overcoming current and future limitations.
 
Fern describes the robot’s progression as a series of challenges faced and overcome. Before this challenge of running fast, the robot ran a 5K. “Once you start going at faster speeds, this gap between simulation and reality grows larger. It's just harder because you're pushing the machines closer to their limits, and that's where the gap shows up more. We also didn’t appreciate just how difficult it is to actually stop,” he said.
 
The team is now focused on moving Cassie from blind locomotion to navigating the world as it senses it. “If you think of a human, if you close your eyes, you can do pretty well at locomotion,” he explained. “Like you can go up and down stairs if you're brave enough.” But there are limits to what you can do with your eyes closed. Outfitted with a camera, Cassie is now working on visual locomotion. “We are working on things like navigating steppingstones,” Fern said. “You have to precisely decide where you put your feet.”
 
Also, Cassie can climb stairs but sometimes stumbles. So the robot that started without external sensors will begin to learn from the outside world, building on what it learned in simulation and adding the new sensory modality of vision. Even just traversing from curbside to front door is not easy: there is different terrain to navigate and there are objects to avoid. “There has to be a basic level of common sense, for example, avoid walking across flower beds,” he said.
 
The work a robot like this can eventually take on includes the repetitive jobs a robot can do easily, such as moving boxes in a warehouse, removing junk on a military base, or carrying another robot (think vacuum) from one level to the next. Such a robot also needs to adapt to a world made for and by humans; tires on a robot, for example, won’t fly in a world full of stairs. And it needs to respond to the ways humans normally communicate, such as verbal commands.
 
“We basically want Cassie to be able to go anywhere in a building or outside,” Fern said. “It used to be that people thought humanoid robots are kind of just some novelty. But if you think about what it is, it can help with current and future labor shortages, and much more.”

Cathy Cecere is Membership Content Program Manager.


 
