MIT Develops VR Testbed for Drones

MIT engineers have developed software to reduce the number of crashes that drones experience in training.

Leah Zitter

MIT engineers have developed a new virtual-reality training system for drones that enables a vehicle to see a rich, virtual environment while flying in an empty physical space. The system, which the team calls “Flight Goggles,” could significantly reduce the number of crashes that drones experience in actual training sessions. It can also serve as a virtual testbed for training fast-flying drones.

“We think this is a game-changer in the development of drone technology, for drones that go fast,” says Sertac Karaman, associate professor of aeronautics and astronautics at MIT. “If anything, the system can make autonomous vehicles more responsive, faster, and more efficient.”

Karaman and his colleagues will present details of their drone virtual training system at the IEEE International Conference on Robotics and Automation next week. “The drone will be flying in an empty room, but will be ‘hallucinating’ a completely different environment, and will learn in that environment,” Karaman explains.
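The sketch below illustrates the image-in-the-loop idea Karaman describes, under stated assumptions: the physical drone is tracked in an empty room, its real pose drives a renderer that synthesizes what its camera would see in the virtual scene, and the onboard vision stack consumes that synthetic image as if it were real. All function and class names here are hypothetical placeholders, not the actual Flight Goggles API.

```python
# Minimal sketch of the "hallucinated environment" loop (hypothetical names).
from dataclasses import dataclass


@dataclass
class Pose:
    x: float
    y: float
    z: float
    yaw: float


def get_tracked_pose(t: float) -> Pose:
    # Stand-in for the motion-capture system tracking the real drone
    # as it flies through the empty room.
    return Pose(x=0.5 * t, y=0.0, z=1.5, yaw=0.0)


def render_virtual_camera(pose: Pose) -> list:
    # Stand-in for a photorealistic renderer that draws the virtual scene
    # (walls, windows, obstacles) from the drone's real viewpoint.
    return [[0] * 64 for _ in range(48)]  # dummy 64x48 image


def plan_from_image(image: list, pose: Pose) -> tuple:
    # Stand-in for the onboard vision/planning stack, which never knows
    # the image is synthetic. Returns a velocity command (vx, vy, vz).
    return (1.0, 0.0, 0.0)


for step in range(5):                     # a few control cycles for illustration
    t = step / 90.0                       # ~90 Hz image rate reported by the team
    pose = get_tracked_pose(t)            # where the drone really is
    image = render_virtual_camera(pose)   # what it "hallucinates" seeing
    cmd = plan_from_image(image, pose)    # a virtual crash costs nothing
    print(f"t={t:.3f}s pos=({pose.x:.2f},{pose.y:.2f},{pose.z:.2f}) cmd={cmd}")
```

In this setup a collision with a virtual obstacle is just a rendering event; the physical room remains empty, so the vehicle itself is never at risk.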

The drone can process the virtual images at a rate of about 90 frames per second, roughly three times as fast as the human eye can see and process images. To achieve this, the team custom-built circuit boards with a powerful embedded supercomputer, along with an inertial measurement unit and a camera. They fit all of this hardware into a small, 3-D-printed nylon and carbon-fiber-reinforced drone frame.
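For a sense of what that frame rate implies, the snippet below works out the per-frame time budget. This is illustrative arithmetic only, not the team's published pipeline timing.

```python
# Back-of-the-envelope frame budget implied by the ~90 fps figure above.
FRAME_RATE_HZ = 90
frame_budget_ms = 1000.0 / FRAME_RATE_HZ
print(f"Per-frame budget: {frame_budget_ms:.1f} ms")  # ~11.1 ms

# Within that budget the embedded computer must render the virtual view,
# run perception, and issue a control command before the next frame is due.
```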

This research was supported, in part, by the U.S. Office of Naval Research, MIT Lincoln Laboratory, and NVIDIA Corporation.