Wearable Exoskeleton Uses AI to Assist Human Movement

An AI-controlled exoskeleton developed at Georgia Tech can reduce biological joint effort by providing assistive torque at the hip and knee.
Seniors in the future may be able to move more easily without relying on the help of bulky assistive devices. People recovering from stroke or living with mobility impairments could regain some lost function. These are among the applications a research team at the Georgia Institute of Technology is targeting with their work on AI-enabled exoskeletons integrated directly into clothing.

Aaron Young, associate professor of mechanical engineering at Georgia Tech, said this type of system is very different from traditional medical exoskeletons, which are “very large, usually used with either a walker or crutches.” He added, “The unique niche we’re really trying to target is people who have some capability but want additional movement, so the [exoskeletons] are much lighter weight, and this device is integrated with clothing.”

This would mean that instead of strapping on a bulky exoskeleton, users could one day simply pull on a pair of pants with a soft, seamless robot built in.


Training the exoskeleton brain to move like a human

Using AI to train an exoskeleton to move like a human requires a working “brain.” For exoskeleton companies, building such a brain takes many hours, specialized equipment, and lab participants. That’s the work Georgia Tech’s team has done, aiming to expedite the process for companies looking to bring lightweight “smart” exoskeletons to market.

Aaron Young leads the research team developing AI-controlled wearable exoskeletons integrated into clothing. Photo: Georgia Tech
“On the engineering or technical front, our big goal is to develop something we would call task-agnostic control—a system capable of delivering useful support across all the various sectors,” Young said. The same controller supports walking, standing, and stair climbing without having to switch modes. “It’s always estimating what are these internal efforts of the joints and wrapping the control around that, even if they're doing transitions between tasks.” 

The result is a 20 percent boost to the user's movement capability: a person climbing a stair or rising from a couch would feel added power and mobility at the joints the device assists.


The technology behind it

In November 2025, Young and his team published a research paper on the topic in Science Robotics, explaining their approach to building the AI-based “brain” that drives assistance at the hip and knee.

“The unique form of AI, and that’s what this [paper] is focused on, is that we've trained a deep learning system that's end-to-end. It takes all the sensors on the robot and basically learns the state of the human joints. Then it scales that by 20 percent,” Young said. The end-to-end learning system fully encapsulating the state of human joints is the innovative part.
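The control idea Young describes can be sketched in a few lines: an end-to-end estimator maps onboard sensor readings to the biological joint torque, and the controller commands 20 percent of that estimate as assistance. The function names and the linear stand-in "estimator" below are illustrative placeholders, not the team's actual deep-learning model.

```python
import numpy as np

ASSIST_FRACTION = 0.20  # assistive torque as a fraction of estimated biological torque

def estimate_joint_torque(sensor_readings: np.ndarray, weights: np.ndarray) -> float:
    """Stand-in for the learned end-to-end estimator (a deep network in practice)."""
    return float(weights @ sensor_readings)

def assistive_torque(sensor_readings: np.ndarray, weights: np.ndarray) -> float:
    """Command a fixed fraction (20%) of the estimated joint torque as assistance."""
    return ASSIST_FRACTION * estimate_joint_torque(sensor_readings, weights)
```

Because the scaling is applied to a continuously updated estimate of internal joint effort, the same loop can run across walking, standing, or stair climbing without explicit mode switching.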


What makes the device stand out is its integration into clothing, he said, unlike traditional exoskeletons, “which have large, rigid metal components.” While the pants are mostly a stretchy yoga pant-style material, they incorporate one carbon fiber strut along the lateral side. The specialized fabrics transmit assistive forces while maintaining comfort. “It's still a softer material, but it's not like standard pants that would deform a lot,” Young said. He highlighted that the telescoping degree of freedom between the knee and hip allows for a greater range of motion than other conventional exoskeletons.


Building the brain

Young explained that the system is built on a CycleGAN neural network architecture, which was originally developed for image-to-image translation tasks. The model can translate sensor data into estimates of internal joint torque. By analyzing how people move without an exoskeleton, it learns patterns of joint effort and uses that knowledge to determine how much assistance to provide a human wearing the exoskeleton. This helps the system’s “brain” control movement in real time.

“Think of the translator as an autoencoder that goes from one language to the other,” Young said. “But then you also have this thing called an adversarial classifier that tries to determine, ‘is this from the original language, or is it from the translated language?’ Once that's translated and I have all these simulated sensors, I have linked internal biomechanical states. I can then just train a joint moment estimator, or basically any estimator that goes to the internal state of the human.”
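The pipeline Young outlines has two steps: translate existing lab data into "simulated" exoskeleton sensor signals, then train a joint-moment estimator on those translated samples, which stay linked to their known biomechanical states. The sketch below illustrates only that data flow; the placeholder translator and least-squares estimator are assumptions, not the published CycleGAN-based model.

```python
import numpy as np

def translate_to_exo_sensors(mocap_features: np.ndarray) -> np.ndarray:
    """Stand-in for the learned translator (CycleGAN-style in the paper)."""
    return mocap_features * 0.9  # placeholder mapping for illustration

def fit_moment_estimator(sim_sensors: np.ndarray, joint_moments: np.ndarray) -> np.ndarray:
    """Least-squares stand-in for the learned joint-moment estimator."""
    return np.linalg.lstsq(sim_sensors, joint_moments, rcond=None)[0]

# Existing lab data: motion features paired with known joint moments (synthetic here).
rng = np.random.default_rng(1)
mocap = rng.normal(size=(200, 4))
true_w = np.array([1.0, -0.5, 2.0, 0.3])
moments = mocap @ true_w

sim_sensors = translate_to_exo_sensors(mocap)   # step 1: translate to the sensor domain
w = fit_moment_estimator(sim_sensors, moments)  # step 2: train on the linked pairs
```

The key point is that no new device-specific, time-synchronized lab data is needed: the translation preserves each sample's link to its internal biomechanical state.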

The team tested an AI method that uses existing movement data to build exoskeleton controllers without lengthy device-specific training. Photo: Georgia Tech
The team first explored the use of translators, controllers, and sensors through a phased approach to evaluating human movement. In research published in November 2024, participants were asked to complete a variety of movements, including lunges, jumping in place, and using stairs. That work gave the team the ability to detect and estimate internal joint efforts across different activities.

“The idea is to make it so that it can't tell the difference between the translated data and the original data,” Young said. “And if you train these well enough and for long enough, you can largely trick the translator or trick the classifier, and then you have a very good translator. Essentially, that's the mark of a very good translator—that it can no longer tell the difference between the original and the converted language.”
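The training goal Young describes can be made concrete with a toy check: a classifier tries to label samples as original or translated, and the translator is "good" when that accuracy drops to chance (about 50 percent). The one-dimensional data and naive threshold classifier below are illustrative only, not the adversarial network used in the research.

```python
import numpy as np

rng = np.random.default_rng(0)

def classifier_accuracy(original: np.ndarray, translated: np.ndarray, threshold: float) -> float:
    """Naive classifier that guesses 'original' whenever a sample exceeds the threshold."""
    correct = np.sum(original > threshold) + np.sum(translated <= threshold)
    return correct / (len(original) + len(translated))

# A poor translator leaves a distribution shift the classifier can exploit;
# a good one matches the original distribution, pushing accuracy toward chance.
original = rng.normal(1.0, 1.0, 1000)
bad_translation = rng.normal(-1.0, 1.0, 1000)   # clearly distinguishable from original
good_translation = rng.normal(1.0, 1.0, 1000)   # matches the original distribution

print(classifier_accuracy(original, bad_translation, 0.0))   # well above 0.5
print(classifier_accuracy(original, good_translation, 0.0))  # near 0.5 (chance)
```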

With ongoing research, the team has been able to further expand the technology guiding their work, including adapting how they train the sensors. But Young acknowledged that further guidance is still needed.

“What we really worked on was a method of training these systems that does not require device-specific data time synchronized with all the motion capture and force plates,” Young said. “Can we fine-tune the model to the subject as they're walking? And the answer so far has really been a resounding no, mostly because we can't seem to estimate their internal state any more accurately than what the original model would do.”


The future of AI and exoskeletons

Next steps for Young and his team include more research and development, cost analysis, and approvals.

“The goal is to eventually get the device FDA approved for medical use, which would require extensive clinical trials and regulatory paperwork,” he said, noting that FDA approvals would still be years out. Other future hurdles include cost as it moves from a prototype to a product available to consumers. 


There are also plenty of next steps in the product development process. “What’s awesome is we have a good, universal controller that works for everything, but it’s not optimal for anything,” Young said. “We absolutely have lots of room to continue to improve, continue to provide value by really optimizing.”

He remains optimistic that its unique design and features will prove helpful and worthwhile for patients who need the technology the most. “Certainly the idea that you could easily put it on, hopefully in less than a minute, and with just one hand, that’s really what we are funded to do,” he said. “That’s kind of our mandate—to make this realizable.”

Alexandra Frost is a freelance writer and content strategist in Cincinnati.