Engineering Movement: How Aaron Young and Georgia Tech’s EPIC Lab Use Vicon Motion Capture to Shape the Future of Wearable Robotics

Georgia Tech’s Exoskeleton & Prosthetic Intelligent Controls (EPIC) Lab, under the leadership of Associate Professor Aaron Young, is revolutionizing wearable robotics research by merging biomechanics, advanced control systems, and artificial intelligence. This article explores how EPIC Lab leverages Vicon’s sophisticated motion capture technology to design and test innovative assistive devices such as lower-limb exoskeletons and prosthetics. Vicon’s precision allows the lab to perform detailed, real-time analyses crucial for developing adaptive control systems tailored for stroke survivors, amputees, the elderly, and industrial workers. Supported by NSF, NIH, DoD, and clinical partnerships, EPIC Lab’s ultimate aim is creating intuitive, energy-efficient devices that significantly improve gait biomechanics, reduce metabolic effort, and enhance overall quality of life.

What Aaron Does at Georgia Tech

“I’m an associate professor in mechanical engineering at Georgia Tech and run the Exoskeleton Prosthetic Intelligent Control Lab.” This succinct introduction belies the depth and breadth of Aaron Young’s remit. As director of the Exoskeleton & Prosthetic Intelligent Controls (EPIC) Lab in Georgia Tech’s Woodruff School of Mechanical Engineering, he oversees a multidisciplinary team of engineers, biomechanists, and clinicians who are united by a single aim: restoring and augmenting human movement. “We primarily focus on lower limb movement disability, emphasizing interventions through wearable robotic systems.” In practice, that focus stretches from early‑stage concept sketches to human trials with participants who have complex mobility challenges. “It’s looking at how robots can help people move out into the community,” he says. “We focus a lot on community ambulation and helping people to be able to do daily tasks.”

His lab’s research zeros in on the nuance of human–robot interaction, translating those findings into smarter, more intuitive assistance. Young elaborates, “We pay close attention to how humans interact with these devices, aiming to create improved systems that support mobility, locomotion, and enhanced capability for individuals dealing with diverse lower limb movement disorders.” The team’s signature contribution is an AI‑driven, task‑agnostic control architecture.

“And what we’ve pioneered is this idea of task-agnostic control enabled through AI, where basically we can train systems to understand underlying human biomechanics.” Rather than asking the wearer to change modes for every task, the controller watches what the body is doing and reacts in kind. “We estimate internal joint torques and utilize the user’s own internal torque signals to inform these devices.”
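
To make that idea concrete, here is a minimal sketch of what torque-proportional, task-agnostic assistance can look like in code. It is an illustration only, not the EPIC Lab’s actual controller: the `model`, `sensors` and `device` interfaces and the gain and limit values are hypothetical stand-ins.

```python
import numpy as np

ASSIST_RATIO = 0.3      # fraction of estimated biological torque to supply (example value)
TORQUE_LIMIT_NM = 40.0  # hardware safety clamp in newton-metres (example value)

def control_step(model, sensors, device):
    """One tick of a task-agnostic assistance loop (illustrative only)."""
    x = sensors.read()           # e.g. IMU angles, velocities, insole forces
    tau_bio = model.predict(x)   # estimated internal joint torque, Nm
    # Command a bounded share of the user's own effort, whatever the task.
    tau_cmd = np.clip(ASSIST_RATIO * tau_bio, -TORQUE_LIMIT_NM, TORQUE_LIMIT_NM)
    device.command_torque(tau_cmd)
```

Because the command is derived from the wearer’s estimated torque rather than a task label, the same loop applies whether the user is walking, climbing stairs or standing up.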

The EPIC Lab captures that philosophy as: “We characterize robotic devices and their controllers thoroughly, from fabrication and bench‑top tests to optimization and detailed biomechanical and performance analysis, powered by advanced machine learning algorithms.” Together, these principles add up to a simple idea with profound implications: technology that gets out of the way and lets people move naturally.

“Researchers using the lab want to know how humans can control these wearable devices so that they can enable better mobility outcomes as measured through biomechanics and other clinical measures of human mobility,” says Dr Young.


Educational Background and Path to Vicon Mocap

Aaron’s route to motion capture and wearable robotics wasn’t linear. “I first did more like cellular tissue kind of engineering work and then as part of my senior design, I did a project on an electrically controlled hand prosthesis for amputees.” That single project pivoted his interests from the microscopic world of cells to the macro challenge of human movement and control.

“And so I went to grad school at Northwestern University … in the Center for Bionic Medicine, which at the time was run by Todd Kuiken and Levi Hargrove.” Immersed in that pioneering environment, he sharpened his skills in pattern recognition, artificial intelligence and real‑time controller design. “At the Center for Bionic Medicine at Northwestern, my grad school, I learned a lot on machine learning techniques and control systems for these kinds of advanced bionic systems.” Those years taught him how to let data drive decisions, and how to translate that data into motors that act in lockstep with human intent.

“From there, I went to the University of Michigan for a postdoc with Dan Ferris in the Human Neuromechanics Lab. That’s where I got a lot more of the biomechanics training in terms of motion capture and force plates and using that to understand how individuals interact with these devices.” By 2016, when he founded the EPIC Lab at Georgia Tech, Young had woven together biomechanics, AI, clinical insight and systems engineering, a multidisciplinary skill set that still defines the lab today.

How He Uses Motion Capture in His Work & Vicon Volume Setup

To understand and improve movement, you first have to measure it well. Aaron’s lab is built around that truth. “We collect lots of motion capture, force plate, EMG data in the lab through systems that are not mobile, but we time sync that with our device sensor data,” explains Young. The result is a single, time‑aligned dataset where joint angles, ground reaction forces, muscle activity and exoskeleton torques can be examined frame by frame. The EPIC Lab facility boasts two full Vicon systems, a host of force plates and modular equipment that can be reconfigured to simulate real‑world conditions, including a 40‑camera mix of short‑ and long‑range devices in the overground gait facility.

At the heart of this ecosystem is a custom Vicon volume: the 40-camera setup meticulously arranged to see around stairs, rails and people without sacrificing resolution. Diverse terrains (ramps, irregular surfaces, adjustable stairs) all sit inside the capture space, allowing the team to probe how devices behave outside the simplicity of flat laboratory floors. “For us we have 15 overground force plates plus a Bertec treadmill system that need to be synced as well as instrumented handrails… we have an external sync system in Vicon that does all that.” That external sync ensures everything, right down to a handrail load cell, is stamped with the same clock.
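
Once every stream shares that hardware clock, software alignment reduces to resampling each signal onto one master timeline. The sketch below shows the basic idea with NumPy; the rates and placeholder signals are assumptions for illustration, not the lab’s actual pipeline.

```python
import numpy as np

def resample(t_src, y_src, t_master):
    """Linearly interpolate a signal onto the master (mocap) timeline."""
    return np.interp(t_master, t_src, y_src)

# Example rates: 200 Hz mocap as master, 1 kHz force plates, 2 kHz EMG.
duration = 5.0
t_mocap = np.arange(0, duration, 1 / 200.0)
t_fp    = np.arange(0, duration, 1 / 1000.0)
t_emg   = np.arange(0, duration, 1 / 2000.0)

grf = resample(t_fp,  np.random.randn(t_fp.size),  t_mocap)  # placeholder force data
emg = resample(t_emg, np.random.randn(t_emg.size), t_mocap)  # placeholder EMG data
# grf and emg are now frame-aligned with every mocap sample.
```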

Set‑up is deliberate and thorough. “You go through these procedures and basically… don that motion capture suit to an individual… that process probably takes 45 minutes, I would estimate, to get it all correct and done, done well.” Each marker cluster, EMG electrode and IMU is placed with surgical care because a misplaced marker today becomes a mislabelled joint tomorrow. Nexus software then takes over for calibration, masking, labelling and post‑processing: tasks the team has refined into an efficient, repeatable pipeline.
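
Gap filling is a good example of what that post‑processing stage does: when a marker drops out behind a rail or stair, the missing frames are reconstructed from the visible ones. The toy function below shows the simplest spline-based version of the idea; Nexus offers more sophisticated fills (pattern-based, rigid-body), so treat this as a conceptual sketch rather than its algorithm.

```python
import numpy as np
from scipy.interpolate import CubicSpline

def fill_gaps(traj):
    """traj: (n_frames, 3) marker trajectory with NaN rows where occluded."""
    frames = np.arange(traj.shape[0])
    filled = traj.copy()
    for axis in range(3):
        valid = ~np.isnan(traj[:, axis])
        # Fit a spline through the visible samples, evaluate at the gaps.
        spline = CubicSpline(frames[valid], traj[valid, axis])
        filled[~valid, axis] = spline(frames[~valid])
    return filled
```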

The physical infrastructure amplifies what the cameras can see. A split‑belt Bertec treadmill with incline/decline allows the team to simulate uphill and downhill walking indoors. A 40‑foot walkway hides 18 force plates under its surface, while a turning path adds 14 more to capture complex manoeuvres. A HuMoTech Bowden‑cable emulator lets researchers prototype new exoskeleton behaviours before building hardware. 

Safety is woven in: a ceiling‑mounted Biodex harness provides fall protection without obscuring markers. And when the team needs a fully immersive environment, they step into the Motek CAREN suite next door, where 10 Vicon cameras, a 6‑DOF Stewart‑platform treadmill, a 180° projection screen and synchronized Delsys EMG make it perfect for perturbed gait studies where the floor can shift and tilt underfoot. “We’ll perturb someone using this treadmill and then look at how they recover, and what kind of strategy they use. Whether it was, for example, a stepping strategy versus just using their joint torques to counter the perturbation,” says Dr Young.
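
Classifying those recovery strategies can start from something very simple in the captured data. The sketch below labels a trial as a stepping response if a heel marker travels farther than a threshold shortly after the perturbation; the threshold and window are invented example values, not the lab’s criteria.

```python
import numpy as np

STEP_THRESHOLD_M = 0.15  # assumed minimum displacement to count as a step
WINDOW_S = 1.0           # assumed analysis window after perturbation onset

def classify_response(heel_xyz, frame_rate, onset_frame):
    """heel_xyz: (n_frames, 3) heel-marker positions in metres."""
    end = onset_frame + int(WINDOW_S * frame_rate)
    window = heel_xyz[onset_frame:end]
    # Maximum excursion of the heel from its position at perturbation onset.
    travel = np.linalg.norm(window - window[0], axis=1).max()
    return "stepping" if travel > STEP_THRESHOLD_M else "in-place"
```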

Current Projects Using Mocap

The EPIC Lab’s project portfolio spans the clinical, industrial and recreational worlds, united by a reliance on high‑quality movement data. “We collect lots of motion capture, force plate, EMG data… run a big AI algorithm that does all the biomechanics in real time and can close the loop around the control system.” In other words, mocap isn’t just for validation after the fact; it’s part of the controller loop itself.
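
Closing a loop around live mocap means streaming frames out of the system as they are captured. Vicon’s DataStream SDK provides Python bindings for exactly that; the skeleton below follows the pattern of Vicon’s published examples, with `run_controller` as a hypothetical hook standing in for the real-time biomechanics stage.

```python
from vicon_dssdk import ViconDataStream

def run_controller(subject, marker, pos_mm):
    """Hypothetical stand-in for the lab's real-time biomechanics/AI stage."""
    print(subject, marker, pos_mm)

client = ViconDataStream.Client()
client.Connect('localhost:801')   # machine running Nexus, default port
client.EnableMarkerData()

while True:
    try:
        client.GetFrame()         # request the latest frame
    except ViconDataStream.DataStreamException:
        continue                  # no new frame yet; try again
    for subject in client.GetSubjectNames():
        for marker, _segment in client.GetMarkerNames(subject):
            pos_mm, occluded = client.GetMarkerGlobalTranslation(subject, marker)
            if not occluded:
                run_controller(subject, marker, pos_mm)
```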

Clinically, the team is refining powered knee–ankle exoskeletons that help stroke survivors recover coordinated gait patterns. By measuring joint torques and ground forces, they can dial in assistance that supports movement without overriding the user’s own effort. They also design adaptive prostheses aimed at reducing metabolic cost for amputees, an outcome measure that matters when every watt of energy saved can extend the user’s range and comfort.

On the industrial side, EPIC collaborates with partners who see exosuits as preventative medicine. “People might be apt to injuries from heavy manual material handling… using these exo suits to offload joints that would be prone to injury.” Motion capture quantifies the load reduction and verifies that assistance doesn’t inadvertently introduce risky compensations elsewhere in the body.

And then there are able‑bodied users. “Some might argue that the most successful thing to date actually has been deploying in able-bodied individuals,” Young observes. Athletes and labourers alike benefit from suits that stave off fatigue, support optimal technique, or allow longer training sessions. The same logic extends to older adults who want to keep doing the things they love. “Older adults could use this to extend their mobility later in life or be able to do hikes and other things that might be pretty difficult.” Whether the goal is rehabilitation, injury prevention or performance enhancement, the common denominator is a controller that listens to the body in real time.


His Vision for the Future of Robotics

Looking ahead, Young imagines a richer ecosystem of assistive technologies that blur the lines between medical device and lifestyle tool. “I would hope 50 years from now we have a lot more options at our hands in terms of what we could use both for clinical and non-clinical problems compared to what we have today.” In that future, wearable robots would be as commonplace, and as customizable, as today’s running shoes or smartwatches.

He also sees synergy between wearable devices and the broader field of robotics. “There’s certainly a lot of work in standard robotics and humanoids and, you know, different form factor robots that could be used more ubiquitously to help us with living, with working, with caregiving.” Imagine a continuum where an exosuit helps you lift, a mobile robot carries the load, and a home assistant monitors safety and health. “Hopefully there are more general purpose ones that are well-known and available to people if they want to adopt those kinds of technologies.” Still, the road to ubiquity runs through hard problems. “It’s still an emerging technology in a lot of areas – cost effectiveness, reliability, and defining market use cases remain to be determined.” Solving those problems will take not only engineering excellence but also thoughtful policy, user‑centred design, and sustainable business models.

Why Vicon Is the System of Choice

Ask Young why Vicon underpins EPIC’s workflow and he points first to interoperability. “The ability to integrate all the different research things is certainly something that I think partially sets Vicon apart.” Force plates, treadmills, EMG amps, exoskeleton load cells, IMUs, each with its own data format and sampling rate, are funnelled into one coherent timeline. “We just have lots of different systems that we pull into Vicon that all then need to be time synced with all the motion capture.” 

That synchronization means no guessing later about which spike in EMG corresponds to which change in joint angle. Vicon data can be easily integrated with Bertec force plates and the lab’s Delsys electromyography (EMG) system, streamlining multi‑modal studies. Dr Young points, in particular, to the Nexus Plug-in Gait model, which allows researchers to make fast progress during the early stages of projects before they move on to building models of their own.
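
When those synced trials leave the lab as C3D files, the Nexus export already carries marker and analog (force plate and EMG) data on a shared clock. As one illustration of downstream analysis, the open-source ezc3d library can read such a file in Python; the file name and printed summary here are assumptions for the example.

```python
import ezc3d

c3d = ezc3d.c3d("walking_trial.c3d")  # hypothetical Nexus export

point_rate  = c3d["parameters"]["POINT"]["RATE"]["value"][0]    # e.g. 200 Hz
analog_rate = c3d["parameters"]["ANALOG"]["RATE"]["value"][0]   # e.g. 2000 Hz

markers = c3d["data"]["points"]    # shape: (4, n_markers, n_frames)
analogs = c3d["data"]["analogs"]   # shape: (1, n_channels, n_frames * ratio)

labels = c3d["parameters"]["POINT"]["LABELS"]["value"]
print(f"{len(labels)} markers at {point_rate} Hz, "
      f"{analogs.shape[1]} analog channels at {analog_rate} Hz")
```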

Software is the second pillar. “Gap filling, biomechanics processing, and plugin pipelines are of higher function and quality than some of the competitors’ software.” Nexus doesn’t just record; it computes, cleans and packages data so researchers can move straight to interpretation. On the hardware side, EPIC’s sprawling, obstacle‑rich lab needed a nonstandard camera layout. “We had this very large, very complicated space with different terrain features and stairs that are moving. Vicon engineers helped us optimize camera layouts.” That level of partnership, designing around the science rather than forcing the science into a rigid template, “is a huge aspect or a major consideration of the technology.”

Advice for Newcomers to Motion Capture

For researchers taking their first steps into motion capture, Young’s advice is refreshingly practical. “If you just want to use ‘Plug-in Gait,’ for example, you could do almost your entire workflow within Vicon.” Start with what works out of the box, without additional software, and build confidence before diving into bespoke models or code.

He stresses the value of an approachable interface. “You don’t need to work through external software like Visual3D.” For many labs, the most precious resource isn’t hardware—it’s time. “If you’re a new user, you can do all of the basic analysis within Vicon. I think that’s pretty unique.” As he adds, “There’s a lot more tools available to help you get through your data easily with a nice GUI… You don’t have to learn a bunch of extra things that you would in other systems.” And when the day comes that your questions outgrow the defaults, Vicon’s pipelines still let you export clean data to MATLAB, Python or custom toolchains.


Where Technology, Science and Humanity Converge

Ultimately, the EPIC Lab’s work is profoundly human‑centric, aiming to enhance dignity, independence and quality of life through intelligent robotic assistance. Motion capture is the enabling technology, the lens that makes invisible mechanics visible, while task‑agnostic AI is the interpreter that turns those mechanics into help. By combining precise kinematics and kinetics with responsive control, Aaron Young’s team is nudging wearable robotics from clever prototypes toward everyday tools.

The journey from laboratory innovation to real‑world adoption is seldom straightforward. Yet with meticulous calibration, integrated data streams and a relentless focus on user need, that journey becomes less daunting. More options, more capability, more inclusion, built on data you can trust. EPIC’s story is still being written, but its direction is clear: engineering empathy into mobility, one carefully captured step at a time.

Ready to unlock deeper movement insights? Talk to Vicon about building a motion capture ecosystem that scales with your research.