Fluent in the Many Languages of Motion Capture

Texas A&M’s Starlab has a truly cross-discipline mocap volume


The presence of a motion capture lab on a university campus isn’t unusual, but the scale and scope of the work Starlab does is. Where most university mocap volumes focus on one particular field, Starlab has projects ranging across arts and visualization, aerospace, mechanical and electrical engineering, kinesiology, agriculture, robotics and autonomous vehicles.

Texas A&M University’s RELLIS Starlab Facility exists to support and enable students, faculty and external partners in learning and researching motion capture and augmented reality technologies. “My goal is to build a collaborative space where cross-pollination across disciplines can happen to push the boundaries of technology and creativity,” says Professor and Starlab Director Michael Walsh, a veteran of the VFX industry.

To facilitate that goal, Starlab has a cutting-edge, 2,000-sq-ft capture volume equipped with 44 Vantage V5 cameras and additional custom-built 5MP, 350 FPS Vantage cameras, as well as an array of other sensors including LIDAR and RADAR. Crucially, for the variety of disciplines across which lab users work, the space is equipped with three of Vicon’s four processing platforms – Shōgun, Tracker and Nexus.

Simply having the equipment for cross-disciplinary motion capture is not enough, however. “I think there can be a language issue,” says Walsh. “With users coming from so many academic disciplines and industries, we’re speaking different languages, using different terminology, and that can hamper the cross-pollination and collaboration.”


Those language problems have not stopped a diverse range of projects from being realized at Starlab. Beyond the leading-edge technology that the lab boasts, the key to this diversity is in its culture, according to Dr. Zohaib Hasnain, assistant professor and collaborator at Starlab. “We got to this point, more than anything else, by keeping an open mind,” he says.

“A lot of these labs in academic settings are highly focused in nature. You go to them, and they’ll perhaps entertain you if you’re doing something exactly along the lines of what they’re doing, or something that doesn’t require too much modification. One of the things that we pride ourselves on, though, is that our setup is meant to be fluid.

“It’s meant to be, ‘okay, you tell us what you’re doing, and we will tell you how we can make it better.’ And that’s what’s propagated through the campus community. That’s what’s brought a lot of people here.”

The team around Starlab is equipped with such a broad range of finely-honed expertise that, regardless of discipline, they’re able to make a project work. As Walsh puts it, “It’s not just the stage, it’s the team. You get the whole package.”

This approach has led to a wide-ranging slate of projects using the lab, covering everything from emergency services training to robot development for the agricultural sector.


Dr. Hasnain notes that the team was tapped by a large architectural firm involved in city planning on projects that won’t be completed for 20, 40 or even 50 years. “So one of the things that they came to us for was one of those hospitals that’s going to see traffic of about 100,000 to 500,000 people a day,” he says.

The particular problem they needed to solve was accounting for changes in transport. “The consideration is that five or 10 years from now, autonomous vehicles will be commonplace. And they’ll have this huge volume of people that are coming in and out, and how should they design their parking garage to accommodate the fact that vehicles would be able to drive themselves, and perhaps allow for more efficient ingress and egress and alleviate traffic stress?”

The solution, executed at Starlab, was to create a miniature parking garage in the space and model autonomous vehicle traffic flow with remote-controlled vehicles, using motion capture in place of GPS.

Dr. Hasnain also mentions a new project focused on the agricultural sector. Starlab is working with a professor interested in solving a unique problem: in order to test grain, farmers have to lower themselves into huge grain silos, but because there’s no stable footing, it’s not uncommon for people to fall in and effectively drown. To address the problem, the professor and Starlab are using the Vicon system to investigate what sort of robot might work in such an environment.

Yet another problem that researchers are working on at Starlab is the biomechanical challenge presented by Parkinson’s disease, while a further group has used the lab to develop an early-stage powered limb prototype for above-the-knee amputees, built for just $500.

The list of discipline-spanning projects goes on, and Starlab shows no sign of stopping its pioneering work. One avenue that Walsh is interested in pursuing is location-based VR for training. “We’ve been having a lot of discussions with emergency services to help them with their training,” he says. “Both military and police personnel are accustomed to handling weapons, and it is essential that any VR simulation be realistic. The weight and feel of the weapon and the real-time response by the user must match to ensure that those guys will be willing to accept it in the simulation. That’s the kind of thing that you get with the Vicon system, with the latest cameras, because all the computing is done onboard the camera, so it’s all super-fast, the latency is very low. And that’s all-important for training.”

The team is unlikely to stop there, however. “It’s just tip-of-the-iceberg stuff. It could be so much more in-depth than that,” says Walsh. The main stumbling block in applying location-based VR to training is usually, he says, “a lack of understanding of what could be possible by those not accustomed to the capabilities of the technology.”

That’s not a problem you’ll encounter at Starlab. As the wide-ranging use of the facility in other fields shows, Starlab’s bread and butter is using cross-discipline thinking to push the applications of motion capture technology in unexpected directions. Imaginative, open-minded thinking is at the heart of what Starlab does.