SIX GEORGIA TECH MOTION CAPTURE PROJECTS THAT ARE PUSHING THE BOUNDARIES OF AUTONOMOUS FLIGHT
Across the Georgia Tech School of Aerospace Engineering’s three motion capture volumes, faculty and students alike are finding new tracking applications for creative projects that explore flight both on and off Earth.
The newest of the three spaces, the Indoor Flight Laboratory, boasts 56 cameras alongside a projection system and wind simulation equipment across its 2,000-square-foot volume, though all three house their own state-of-the-art technologies. The facilities are home to projects that range from the playful to the potentially groundbreaking, and lab manager Lee Whitcher has a front-row seat for all of them. Whitcher spoke with Vicon about six of the most interesting projects using motion capture in his labs.
“Zhiyuan (Nick) Zhang, a student in Professor Panos Tsiotras’s Dynamics and Control Systems Laboratory, has been working on trajectory control, which is where you define a trajectory in space,” says Whitcher. “It’s a large series of coordinates that define a line, maybe through three-dimensional space, and then you control your aircraft to match that trajectory. You also control not only your position on that line, but your orientation. So at certain points, you could actually be upside down on that trajectory.”
The goal is to improve drone control. Whitcher notes that for 20 years, designers of autonomous drones have been working on basic problems such as keeping their vehicles level and ensuring consistent location tracking, but that developments such as drone racing are making new demands of the technology.
“Trajectory control is one way to do aggressive flying,” he says, “and it’s actually quite a mathematical problem because you can’t define a trajectory that the aircraft can’t follow. So you have to have a good model of the aircraft, so that you can actually come up with a good trajectory that meets your maneuverability requirements, but also one that the aircraft can achieve.”
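The core idea of following a time-parameterized path can be sketched with a toy point-mass model and a simple proportional controller. This is purely illustrative; the reference path, gain, and numbers below are invented for the sketch and are not the lab's actual controller or aircraft model:

```python
import numpy as np

def reference(t):
    """Hypothetical figure-eight reference trajectory in 3D at time t."""
    return np.array([np.sin(t), 0.5 * np.sin(2 * t), 1.0])

def velocity_command(pos, t, gain=5.0):
    """Proportional tracking law: command velocity toward the reference point."""
    return gain * (reference(t) - pos)

# Simulate a point-mass "drone" with simple Euler integration.
dt = 0.01
pos = np.zeros(3)
for step in range(2000):
    pos = pos + dt * velocity_command(pos, step * dt)

# After the initial transient, the vehicle lags the moving reference only slightly.
err = float(np.linalg.norm(pos - reference(2000 * dt)))
```

A real controller also has to respect the feasibility constraint Whitcher describes: the reference must never demand more speed or agility than the aircraft's dynamics allow, which is why a good vehicle model matters as much as the tracking law.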
“This is a really strange and interesting aircraft,” says Whitcher. The drone in question, designed by student Kevin Webb in Professor Jon Rogers’ Aerial Robotics and Experimental Autonomy Lab, isn’t actually a single aircraft.
“It’s four cooperative transportation drones,” says Whitcher, “which are basically four quadrotors that are all attached to a rigid frame, and those four quadrotors can pivot on the frame. So, essentially it’s one giant quadrotor made out of four small quadrotors. Each of them is controlling its own angle on the frame, which is then giving you flight control of the whole aircraft.”
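As a toy illustration of the trade-off such pivoting rotors make, tilting a rotor redirects part of its thrust from lift into a horizontal component used for lateral control. This simplified model (hypothetical thrust values, not Webb's actual control allocation) shows the effect:

```python
import math

def net_vertical_thrust(thrusts, tilt_angles):
    """Vertical force from four pivoting rotors on a rigid frame.

    Each rotor's thrust splits by its tilt angle: the cosine component
    provides lift, the sine component a horizontal force for translation.
    """
    return sum(t * math.cos(a) for t, a in zip(thrusts, tilt_angles))

# Four rotors at a hypothetical 5 N each, all level: full 20 N of lift.
level = net_vertical_thrust([5.0] * 4, [0.0] * 4)

# Tilt all four by 30 degrees to translate sideways: lift drops to ~17.3 N.
tilted = net_vertical_thrust([5.0] * 4, [math.radians(30)] * 4)
```

Differential tilts across the four rotors would likewise produce torques about the frame, which is how angle control of each quadrotor translates into flight control of the whole aircraft.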
Another unconventional piece of aircraft design in the Georgia Tech flight labs is the work of Whitcher himself.
“It’s the flight control development for a drone called CEMAV (Coanda Effect Micro Aerial Vehicle) manufactured by a UK company named Aesir. This is my thesis research aircraft,” says Whitcher.
The drone, which resembles a flying saucer, usually flies outdoors with a gasoline engine. “But I converted it to electric and both free-fly it and run it mounted on a three degree of freedom gimbal rig so it can ‘fly on the spot’ in the Vicon room,” says Whitcher. “This enables the control algorithms to be dialed in without risking crashing, or reliance on GPS.”
“We have a student, Emily Glover, whose advisor is department chair Prof Mark Costello, who basically inflated a parafoil in the lab and then used markers all over it to measure it,” says Whitcher. “That enables her to do computational fluid dynamics on the parafoil, which in turn enables her to basically do wind tunnel testing without the wind tunnel.”
“You can do it in the computer, but for that you need accurate geometry,” he adds, “and how on Earth do you measure a parafoil without a system like this?”
Whitcher’s next project is based in a lab that addresses the challenge of mimicking extraterrestrial conditions from firmly within Earth’s gravity well. In the Autonomous Spacecraft Testing of Robotic Operations in Space (ASTROS) lab the School of Aerospace Engineering has a surface that Whitcher jokingly refers to as “a glorified air hockey table”. It’s actually a very sophisticated piece of engineering that enables equipment to move in two dimensions without friction, helping users to simulate the motion of spacecraft.
The ASTROS test platform is a robot that consists of two stages. The lower stage creates the compressed air cushion that the device moves on, while the upper section is attached by a three-axis air bearing that enables it to roll, pitch and yaw. Between them, they can simulate very low-gravity conditions, enabling users to test algorithms for autonomous proximity operations such as the refueling of satellites.
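On the table, the platform behaves roughly like a frictionless free body in the plane, which is what makes it a useful stand-in for a spacecraft. A minimal sketch of the kind of algorithm that might be tested there, a PD station-keeping law driving the platform to a target point (all gains and numbers hypothetical, not the ASTROS lab's actual software):

```python
def simulate(pos, vel, target, kp=1.0, kd=2.0, dt=0.05, steps=400):
    """PD station-keeping on a frictionless 2D table (Euler integration).

    With no friction, any leftover velocity persists forever, so the
    derivative (damping) term is essential to stop the drift.
    """
    for _ in range(steps):
        # Acceleration command: spring toward the target, damp the drift.
        ax = kp * (target[0] - pos[0]) - kd * vel[0]
        ay = kp * (target[1] - pos[1]) - kd * vel[1]
        vel = (vel[0] + ax * dt, vel[1] + ay * dt)
        pos = (pos[0] + vel[0] * dt, pos[1] + vel[1] * dt)
    return pos

# Start one meter off target with no initial velocity; the controller
# brings the platform to rest at the target.
final = simulate(pos=(1.0, -1.0), vel=(0.0, 0.0), target=(0.0, 0.0))
```

The same frictionless property that makes the table a good spacecraft analogue is what makes the damping term indispensable: unlike a ground robot, nothing slows the platform down on its own.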
“They use the lab’s Vicon system to know what their orbit is, effectively,” says Whitcher. “Right now, they’re doing a photogrammetry project. They have a fake asteroid in there that is mounted to another robot arm, and that asteroid rotates and moves around the space. The satellite tracks it and moves around it, taking lots of photographs which are then stitched together to make a 3D model of the asteroid.”
The final project Whitcher demonstrates is in the Computational Solid Mechanics Lab run by Professor Julian J. Rimoli. “He has a Vicon system that looks at morphing bodies,” Whitcher says.
He gestures to a simulation of a tensegrity planetary lander, funded by C-STAR and NASA. “It’s a series of carbon fiber rods that are interconnected, and the whole body morphs. It can absorb landing impact and it also enables you to roll with very low energy usage,” says Whitcher.
The lander could be used to drop objects such as sensors onto other worlds. Whitcher says that it could even house a rover that would shed the structure to continue its journey.
“They have Vicon markers on every joint,” says Whitcher, “and that enables them to actually see the deceleration, and therefore how much energy is getting dissipated throughout the impact.”
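The measurement Whitcher describes can be sketched simply: successive marker positions give velocities by finite differences, velocities give kinetic energy, and the drop in kinetic energy across the impact is the energy dissipated. The masses, sample rate, and positions below are invented for illustration, not data from the lander:

```python
def kinetic_energy(p0, p1, dt, mass):
    """Kinetic energy estimated from two successive marker positions.

    Velocity is approximated by a finite difference over one sample
    interval dt, so this is only as accurate as the capture rate.
    """
    v = [(b - a) / dt for a, b in zip(p0, p1)]
    speed_sq = sum(c * c for c in v)
    return 0.5 * mass * speed_sq

# Hypothetical numbers: a 2 kg node sampled at 100 Hz, before and after impact.
dt, mass = 0.01, 2.0
e_before = kinetic_energy((0, 0, 0.50), (0, 0, 0.47), dt, mass)   # falling at 3 m/s
e_after = kinetic_energy((0, 0, 0.10), (0, 0, 0.095), dt, mass)   # slowed to 0.5 m/s
dissipated = e_before - e_after
```

Summing this quantity over every tracked joint would give the total energy the morphing structure absorbed during landing.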
For more on how Vicon systems are facilitating extraterrestrial travel, see our feature on Frances Zhu’s project applying machine learning to autonomous rover operations. For more on Georgia Tech’s innovative uses of motion capture, see our piece in The Standard 2020.