In October 2018, Vicon loaned a 10-camera system to the Computer Systems Lab at Thomas Jefferson High School for Science and Technology in Alexandria, Virginia. TJ Seniors Alan & Arya used the system for their dissertation, with the goal of teaching drones to swarm so they can be more effective in disaster search and rescue. Thank you to TJCSL for letting us share this project, and to Dr. Gabor, Alan and Arya for participating in the Q&A. Visit the project site to find out more and see the research paper, and watch the lecture and demonstration videos below.
“At TJ, to graduate, students must write a dissertation in a restricted area of their choosing, and Arya and Alan chose computer science. They proffered an ambitious proposal: they would teach drones to fly in groups using deep learning. They did not only want to simulate this, however, but to prove that it worked on hardware in the form of small drones. To make hardware testing feasible, they needed to work with small, inexpensive drones that could not support the weight of additional hardware, such as cameras, for determining the relative locations of the other drones. Thus, a motion capture system was a natural choice.
At the start of the school year, Arya and Alan formalized their proposal and proceeded to call several motion capture companies in the hope of securing a proper system for the school. Arya had experience with Vicon motion capture systems from the summer he’d spent at the MIT Beaverworks summer program, so Vicon was naturally their first choice. Fortunately, Vicon saw the merits, too. Looking back, everything came together to make this project successful, especially the significant investment of time on the part of the two students and Vicon’s generous loan of equipment.”
– Dr. Peter Gabor, TJ Computer Systems Lab Director
Q&A with Alan and Arya
Arya: “I originally proposed this project because I was interested in emergent behavior: the idea that many agents, each moving in simple ways, can together produce complex behavior. Think of a murmuration of starlings, or an ant colony. I’ve maintained an interest in planning algorithms, machine learning in general, and computer science at large. Moving forward, I want to work on other projects related to learning movement behaviors. This was one of the first projects I’ve done involving hardware, and though it was challenging, it was rewarding to see objects physically move around in the world. I hope to continue pursuing robotics research, and perhaps to pick up this same project in a couple of years, with more skills to bring to the table!”
Alan: “When Arya told me about his project idea, I was extremely interested. I didn’t have much prior experience in robotics, but I did in artificial intelligence and machine learning, through self-study and the school’s Machine Learning Club. My previous experience with self-learning agents mostly involved simple tasks in OpenAI Gym, so designing one to fly a drone seemed like an interesting challenge. In college, I plan to study computer science and applied mathematics. I also want to conduct machine learning research on Generative Adversarial Networks and novel reinforcement learning algorithms.”
Give us an overview of the project topic and content:
“We noticed the growing popularity of drone swarms in the media, particularly the Department of Defense with its Perdix drones and Intel’s drone light show at the 2018 Winter Olympic Games in Pyeongchang. Although drone swarms are efficient and highly versatile, they still struggle with path planning and autonomy, needing perfect information to function reliably. We decided to address this issue with machine learning. Our project was first to train a reinforcement learning agent in simulation to plan path trajectories for multiple drones in a dynamic space, and then to transfer those trajectories to real Crazyflie 2.0 flight paths. To track the drones and the obstacles in real life, we used a motion capture system consisting of four T40 and six T20 Vicon cameras.”
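The two-stage approach described above (train a planner in simulation, then export its trajectories to real drones) can be sketched with a toy example. This is a hedged illustration only: the grid world, tabular Q-learning, rewards, and single agent below are stand-ins chosen for brevity, not the students' actual environment, algorithm, or hyperparameters.

```python
import random

# Toy stand-in for the simulated path-planning stage: a single agent
# learns, via tabular Q-learning, to cross a small grid while avoiding
# one obstacle. All values here are illustrative assumptions.
SIZE = 5
START, GOAL = (0, 0), (4, 4)
OBSTACLE = (2, 2)
ACTIONS = [(0, 1), (0, -1), (1, 0), (-1, 0)]  # right, left, down, up

def step(state, action):
    nx, ny = state[0] + action[0], state[1] + action[1]
    if not (0 <= nx < SIZE and 0 <= ny < SIZE) or (nx, ny) == OBSTACLE:
        return state, -1.0            # bumped a wall or the obstacle
    if (nx, ny) == GOAL:
        return (nx, ny), 10.0         # reached the goal
    return (nx, ny), -0.1             # step cost encourages short paths

q = {}  # (state, action) -> estimated value
random.seed(0)
for episode in range(2000):
    state = START
    for _ in range(100):
        if random.random() < 0.2:     # epsilon-greedy exploration
            action = random.choice(ACTIONS)
        else:
            action = max(ACTIONS, key=lambda a: q.get((state, a), 0.0))
        nxt, reward = step(state, action)
        best_next = max(q.get((nxt, a), 0.0) for a in ACTIONS)
        old = q.get((state, action), 0.0)
        q[(state, action)] = old + 0.1 * (reward + 0.9 * best_next - old)
        state = nxt
        if state == GOAL:
            break

# Greedy rollout: the learned policy becomes a waypoint list, which is
# the kind of trajectory a script could then send to a real drone whose
# position is tracked by the mocap system.
path, state = [START], START
while state != GOAL and len(path) < 50:
    action = max(ACTIONS, key=lambda a: q.get((state, a), 0.0))
    state, _ = step(state, action)
    path.append(state)
print(path)
```

The key design point this mirrors is that the learned output is just a sequence of positions; nothing in the planner needs to know whether those positions will be consumed by a simulator or by a flight controller fed with mocap data.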
How did you get started with the project?
“Arya came up with the initial idea, and we started the project near the end of our junior year. At that point, we had an image of what we wanted our final demonstration to do (fly a cool swarm around), but we didn’t know exactly how to execute it. We researched a whole range of swarm intelligence methods, such as particle swarm optimization, stochastic diffusion search, and even genetic algorithms. We weren’t coding much in the beginning; it was mostly research and emailing various companies for the hardware we needed. Our project wasn’t going to work without a motion capture system and a workstation or web service to train our deep learning models on. We’re really grateful that Vicon could lend us these cameras this year.”
Did you have any experience with motion capture previously?
“We’d actually never worked with motion capture before. Arya took a course on autonomous UAV control at the MIT Beaverworks program, and his instructor had mentioned that mocap was the ideal tool for localizing drones within a lab. So we hoped this project would be a great first foray into using a motion capture system.”
What was the most difficult aspect of setting up the project?
“Probably dealing with hardware issues. Working in code is usually relatively straightforward, in that results are repeatable. But when working with drones, we had all sorts of strange failures. When we defined the axes of our space incorrectly, drones would immediately careen off course and crash into walls, because instead of correcting for pitch errors, they would lean into them!
We also had a ton of trouble with keeping the reflective markers on the drones. Any time they hit the ground too hard, the superglued markers would pop right off. Eventually, we resorted to industrial two-part epoxy, and that seemed to do the job better.”
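The axis mix-up the students describe is a classic coordinate-frame convention bug. As a hedged illustration (the specific frames below are assumptions, not the lab's actual configuration): a mocap system might report positions in an ENU frame (x east, y north, z up), while many flight-control stacks expect NED (x north, y east, z down). If the conversion is wrong, attitude corrections flip sign and the drone amplifies its own errors instead of damping them.

```python
# Hypothetical frame-convention fix for the kind of axis bug described
# above. ENU (east-north-up) and NED (north-east-down) are two common
# right-handed conventions; converting between them swaps x and y and
# negates z. The frames chosen here are illustrative assumptions.

def enu_to_ned(p):
    """Convert an (x, y, z) position from ENU to NED coordinates."""
    x, y, z = p
    return (y, x, -z)

print(enu_to_ned((1.0, 2.0, 3.0)))  # -> (2.0, 1.0, -3.0)
```

Checking one known point through a conversion like this (a marker held one meter up should have a negative z in NED) is a cheap way to catch the bug before a drone leans into its own pitch error.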
What surprised or interested you most about using the Vicon system?
“After we’d gotten the whole system installed and fired it up for the first time, I think we were both impressed by how seamless the calibration process was. Once it finished, the Tracker software listed the error for each camera as a bare number, like 0.4 or 0.6. So we looked through the documentation for the units; I assumed they were centimeters. My mind was totally blown when I saw that the units were in fact millimeters: the system has sub-millimeter accuracy!
This actually proved critical for our final demo. With so many drones in such a small space, the distances between their trajectories were quite short, and that level of accuracy was needed to minimize the chance of collisions. It made our job a whole lot easier, since we didn’t have to worry about spreading out the drones’ paths.”
What are your future plans?
Arya will be attending the University of Michigan in the fall, and Alan will be going to UVA. Both plan to pursue a BS in Computer Science.