by Murat Berme, CEO of Bertec
The robotics revolution in human mobility isn’t being led by the robots. It is being led by the scientists quietly capturing how humans move. Prosthetic legs that adapt to uneven ground. Exoskeletons that stabilize balance before a fall. These technologies continue to advance, but their breakthroughs are not primarily about better hardware. What matters most is data. High-fidelity, carefully measured data on how real people walk, climb, stumble, and recover.
“In their daily lives, runway models do not walk as they do at a fashion show,” shares Dr. Necip Berme, Founder of Bertec. “Similarly, test subjects in a lab setting do not walk as they walk outside the laboratory: a lesson I learned in the 1970s when observing test subject amputees as they left the Bioengineering Unit in Scotland where I worked.”
Dr. Berme further explains, “My physiologist colleague suggested that people optimize their walk, normally to minimize energy expenditure; but when observed, they try to improve their appearance. He even had data to support this observation.
“It is a great idea to collect force and motion data to promote the development of better biomechanical models. However, one needs to make sure that data is collected during natural movements.”
For all the attention placed on sleek devices and new materials, the truth remains simple: before a robot can assist human movement, it must first understand human movement. And that understanding is being built in biomechanics labs around the world.
The Work Happening Inside Biomechanics Labs
Across research institutions globally, scientists are capturing how the human body moves in both controlled and adaptive environments. Laboratories are studying everything from healthy gait to post-stroke rehabilitation, from lower-limb prosthetic adaptation to full-body exoskeleton assistance.
For example, researchers at the University of Pittsburgh’s Sensorimotor Learning Lab are exploring how people adjust their walking patterns when each leg is challenged differently. Using synchronized motion capture, force plates, and muscle activity measurements, they examine how individuals adapt their coordination, balance, and step timing in response to split-belt treadmill conditions. These findings inform how robotic devices might adjust dynamically as a person’s movement changes.
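To make the kind of measure at stake concrete, here is a minimal illustrative sketch, in Python, of computing a step-time asymmetry index from heel-strike event times. The function names, the specific index, and the sample values are assumptions for illustration, not the Pittsburgh lab's actual analysis pipeline.

```python
import numpy as np

def step_times(lead_strikes, trail_strikes):
    """Step times for the leading leg: the interval from each trailing-leg
    heel strike to the next leading-leg heel strike (seconds)."""
    lead = np.asarray(lead_strikes)
    times = []
    for t in np.asarray(trail_strikes):
        later = lead[lead > t]
        if later.size:
            times.append(later[0] - t)
    return np.asarray(times)

def asymmetry_index(left_strikes, right_strikes):
    """Signed index in roughly [-1, 1]; near zero when left and right
    step times are equal, growing as one side lags the other."""
    left = step_times(left_strikes, right_strikes).mean()
    right = step_times(right_strikes, left_strikes).mean()
    return (left - right) / (left + right)

# Heel-strike times (s), e.g. detected from force plates or motion capture
left = [0.60, 1.72, 2.84]
right = [0.05, 1.17, 2.29]
print(asymmetry_index(left, right))  # small value for near-symmetric gait
```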
At the University of Utah’s Human Gait and Neurorehabilitation Lab, researchers are studying how powered knee exoskeletons assist joint motion in everyday tasks. With precise measurements of joint angles, ground forces, and muscle activation, they assess how devices can support walking without increasing fatigue or disrupting natural movement. One participant described the experience as feeling like their muscles were “fused” with the device, highlighting how subtle the interaction between human and robot can become when tuned properly.
In both cases, and in many similar labs worldwide, researchers rely on advanced motion capture systems and instrumented treadmills to gather synchronized data. Tools such as multi-camera optical tracking systems and high-resolution force platforms allow for submillimeter tracking of limb motion and highly sensitive measurements of how the body interacts with the ground. These technologies serve not as the breakthroughs themselves, but as the instruments that make deeper understanding possible. Felix Tsui, Product Manager at Vicon Motion Systems, emphasizes this point: “The beauty of motion capture is that it allows us to map the nuances of human movement with precision, providing the raw data that robotic systems need to mimic and support natural biomechanics.”
From Observation to Intelligent Control
The value of these datasets extends far beyond observation. Increasingly, they form the foundation for adaptive control algorithms. Early robotic prosthetics and exoskeletons followed fixed, rule-based patterns, but newer systems are moving toward real-time adjustment based on how users actually move. “This is where movement capture is at its most powerful,” notes Felix Tsui of Vicon, “in its ability to feed algorithms with detailed kinematic data on demand, whether as part of larger datasets or in real time to examine a person’s or a device’s response. This capability is redefining how machines interact with the human body, making movement not just measurable, but meaningfully interpretable.”
Some research groups are using neural networks trained on joint kinematic data to predict transitions in gait. Others are developing controllers that adjust joint torque output in response to measured ground forces, allowing robotic systems to respond to user intent rather than pre-programmed sequences. This shift from rigid programming to adaptive, data-driven control is essential for assistive devices to feel natural and intuitive.
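As a toy illustration of the second approach, here is a minimal Python sketch of a stance-phase assistance law that scales joint torque by measured vertical ground reaction force and a phase-dependent profile. The parameter names, values, and shaping choice are all assumptions made for the example, not any particular device's controller.

```python
import numpy as np

def assistive_torque(vertical_grf, gait_phase, body_weight,
                     peak_torque=20.0, stance_threshold=0.05):
    """Toy stance-phase assistance law.

    vertical_grf : measured vertical ground reaction force, newtons
    gait_phase   : fraction of the current stride, 0..1 (estimated elsewhere)
    body_weight  : subject weight in newtons, used to normalize loading
    Returns an assistive joint torque in newton-meters.
    """
    load = vertical_grf / body_weight
    if load < stance_threshold:
        return 0.0  # swing phase: no measurable loading, no assistance
    # Simple shaping profile that peaks around mid-stance (~30% of stride)
    shaping = np.sin(np.pi * np.clip(gait_phase / 0.6, 0.0, 1.0))
    return float(peak_torque * min(load, 1.2) * shaping)

# Example: mid-stance reading from an instrumented treadmill
print(assistive_torque(vertical_grf=650.0, gait_phase=0.3, body_weight=700.0))
```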
Crucially, these models are only as good as the data they are trained on. High-fidelity biomechanics data allows systems to learn not only steady-state walking, but also the complex transitions, recoveries, and micro-adjustments that define real-world movement.
Simulation as a Powerful Complement to Real-World Data
As biomechanics research expands, simulation has emerged as a valuable and necessary complement to measured human data. The ability to create virtual environments allows researchers to model countless variations in gait, balance recovery, and environmental complexity that would be difficult, or in some cases unsafe, to repeatedly test with human participants. These simulated datasets accelerate controller development by allowing systems to encounter millions of scenarios at scale.
Simulation thrives when built on strong real-world data. The precision measurements captured in biomechanics labs, from motion capture systems tracking joint movements to force platforms capturing ground interaction, establish the biological foundation that makes simulated environments meaningful. These empirical datasets don’t compete with simulation; they enable it. Once grounded in real measurements, simulation can expand the range of conditions far beyond what would be practical or safe to replicate in human testing, allowing controllers to encounter rare or extreme scenarios and refine their performance across a much broader spectrum of movement possibilities.
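One common pattern, sketched below with assumed parameter names and purely illustrative values, is to sample simulated conditions around means measured in the lab, so that the simulation stays anchored to real data while spanning far more trials than could ever be collected with participants.

```python
import numpy as np

def sample_gait_conditions(measured_means, n=10_000, spread=0.15, seed=0):
    """Draw simulated gait conditions by perturbing lab-measured means.

    measured_means : dict of parameter name -> mean value from lab trials
    spread         : relative standard deviation of the perturbation
    Returns a dict of parameter name -> array of n sampled values.
    """
    rng = np.random.default_rng(seed)
    return {
        name: rng.normal(mean, abs(mean) * spread, size=n)
        for name, mean in measured_means.items()
    }

# Means here are illustrative placeholders, not measured results
conditions = sample_gait_conditions({
    "walking_speed_mps": 1.3,
    "step_length_m": 0.70,
    "cadence_steps_per_min": 112.0,
})
```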
What makes this moment particularly promising is the growing effort to expand real-world datasets beyond healthy individuals to include stroke survivors, amputees, Parkinson’s patients, and others whose movement patterns differ from normative baselines. Incorporating these clinical populations strengthens both real-world models and simulations, creating more capable, adaptive systems that can serve a broader and more diverse range of people.
The Bottleneck Is Not Hardware
Motors, batteries, sensors, and materials have all advanced rapidly. What limits progress today is not the physical components, but rather the depth, diversity, and precision of the movement data used to train robotic control systems.
Biomechanics labs have made enormous progress in building these datasets, but access remains limited. Many clinical populations remain underrepresented. Movement patterns involving fatigue, instability, or long-term adaptation are often insufficiently sampled.
The devices we build will only be as good as the variety of human experience reflected in the data. The next generation of mobility devices will not simply require better hardware. It will require better data, data that fully reflects the complexity of how people move, adapt, and recover in real life.
Building the Foundation for What Comes Next
The path forward is clear. If we hope to build assistive devices that truly move with people rather than for them, we need sustained investment in the science of human movement itself.
That means expanding access to biomechanics research infrastructure. It means creating larger, more diverse open datasets that combine real-world measurement with validated simulation. It means fostering collaboration across labs, clinics, and technology developers so that the tools we build reflect the realities of the people who will use them.
The robots will follow. But first, we need to better understand the human body they are meant to assist. Without that foundation, the most advanced exoskeleton or prosthetic will still fall short of its promise. With it, we have the opportunity to fundamentally redefine what mobility assistance can be.