While motion capture is a firmly established medical technology, its use has historically been limited to universities, large research hospitals and major sporting organizations. It was originally used for the clinical assessment of people with severe motion impairments, such as those with cerebral palsy. By analyzing patients’ walking gait patterns and upper limb movements in detail, then combining this data with the results of physical examinations, clinicians can understand why and how a patient is moving in a particular way and make informed recommendations for treatment and therapy.
Capturing motion data at the levels of accuracy required to make informed judgments has traditionally needed a fair amount of physical space, technical expertise and analytical skill. For example, specialist cameras have to be correctly positioned and calibrated. Markers must be properly applied to the subject so the system can pick out and monitor the right points on the body. And the data produced by the system needs to be interpreted by someone who has a detailed understanding of what’s significant – and precisely what it’s telling them. But this is changing.
With lower-cost, easier-to-use systems coming to market and the technology becoming increasingly accessible, the range of applications for motion capture looks set to expand.
One area where the technology has been growing is in the design of medical devices like prosthetics that more realistically replicate natural human movement.
For instance, Orthocare Innovations, a developer of advanced orthotic and prosthetic devices, used motion capture to develop its Magellan prosthesis. Magellan mimics key features of a human ankle essential to normal gait. What makes the prosthesis unique is that it adjusts and reacts to changes in surface and motion (sitting, running, walking uphill, walking downhill, and so forth), giving people better mobility and stability in everyday situations and terrains.
Orthocare conducts product R&D at its gait lab. Here, scientists, engineers, and clinicians conduct gait studies while developing new prosthetic devices, including the Magellan microprocessor-controlled foot. The key to analyzing a person’s gait is to look at how their entire body moves, not only as they walk, but also while they are at rest. To this end, the researchers used a Vicon motion-capture system to quantify the movement and forces related to the prosthesis and the person’s body.
Opening up accessibility
Ten years ago, it would have taken at least a week or two to train somebody to set up and use a system. Now, customers can be up and running in a day or two. Cameras are becoming smaller and more powerful, meaning less space is required to set up a motion analysis lab. And where the cost of a system capable of producing the accuracy needed to design medical devices would previously have been out of reach for most, it’s now approaching a level that puts it within reach of many more companies. These trends are ongoing, and we are producing systems that are ever more accessible and easier to use.
The ultimate goal for motion capture is for it to sink into the background completely. In two to five years, cameras will calibrate themselves, detect when a subject is in view and begin a capture automatically. Systems will be smarter – able to make their own decisions or present options to the operator. Looking a little further out, to the turn of the decade and beyond, it’s quite likely that algorithms will improve to the point where systems can detect motion accurately from video without the need for physical markers to be placed on the subject (the long-time ‘holy grail’ of the motion capture industry).
Today, only three gait labs in the world are operated by prosthetic companies, just one of them in the US, making motion capture the preserve of the most advanced companies developing the most sophisticated devices. But as motion capture systems grow smarter, smaller, simpler and cheaper, they will become far more widespread, with the potential to bring huge benefits to the development of medical devices.