Drone Flight Simulator Available on GitHub

Microsoft has introduced an open source virtual reality toolkit for the training of autonomous drones. Part of Microsoft’s Aerial Informatics and Robotics Platform, the beta software became available on GitHub last week.

The toolkit is designed to allow developers to “teach” drones how to navigate the real world by recreating conditions such as shadows, reflections and even objects that might confuse a device’s on-board sensors.

The software allows researchers to write code for aerial robots such as drones, as well as other gadgets, and to test the devices in a highly realistic simulator. Users can collect data during those tests before deploying the devices in real-world scenarios.
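For readers curious what that write-test-collect workflow looks like in practice, here is a minimal sketch using the Python client API the project later published on GitHub under the name AirSim. The specific calls, coordinates and speeds are illustrative of the approach rather than a definitive reference for the toolkit.

```python
# Minimal sketch: command a simulated multirotor and log a test flight.
# Assumes the simulator is running and the "airsim" Python client package
# (the name the toolkit later shipped under) is installed.
import airsim

client = airsim.MultirotorClient()   # connect to the running simulator
client.confirmConnection()
client.enableApiControl(True)        # hand control to this script
client.armDisarm(True)

client.takeoffAsync().join()
# Fly 10 m forward at 3 m/s; the z-axis points down, so -5 means 5 m altitude.
client.moveToPositionAsync(10, 0, -5, 3).join()

# Collect state data during the simulated run before any real-world deployment.
state = client.getMultirotorState()
print(state.kinematics_estimated.position)

client.landAsync().join()
client.armDisarm(False)
client.enableApiControl(False)
```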

“The aspirational goal is really to build systems that can operate in the real world,” said Ashish Kapoor, the Microsoft researcher who is leading the project.

The hope is that these training tools could spur development of artificial intelligence-based gadgets that eventually could be trusted to drive cars, deliver packages, and even handle rudimentary chores in the home, added Kapoor.

Advanced VR

Testing in a VR environment could mean lower costs as well.

Simulators have long been used for testing, but until recently software-based simulations lacked real-world fidelity and so failed to capture real-world complexities. Microsoft's system — which is based on emerging VR technologies that take advantage of advances in graphics hardware, computing power and algorithms — enables a much more realistic re-creation of a real-world environment.

Based on the latest photorealistic rendering technologies, it can reproduce shadows, reflections and other subtle visual details far more faithfully. Although humans take such details for granted, they can pose problems for computerized sensors.
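To illustrate how that rendered detail can be fed to a vision pipeline, the sketch below requests a few camera views from the simulator using the same Python client assumed above; the camera name "0" and the choice of image types are assumptions for illustration, not the toolkit's canonical usage.

```python
# Sketch: pull rendered camera frames, including the subtle lighting effects
# described above, so a vision algorithm can be tested against them.
import airsim

client = airsim.MultirotorClient()
client.confirmConnection()

responses = client.simGetImages([
    airsim.ImageRequest("0", airsim.ImageType.Scene),         # photorealistic RGB view
    airsim.ImageRequest("0", airsim.ImageType.Segmentation),  # per-object labels
    airsim.ImageRequest("0", airsim.ImageType.DepthVis),      # depth visualization
])

for response in responses:
    # Each response carries the encoded image bytes plus its dimensions.
    print(response.width, response.height, len(response.image_data_uint8))
```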

Microsoft’s simulator “will help researchers to develop, debug and test their drone navigation software by enabling them to recreate a variety of operational scenarios on their desktop computers in the lab,” said Michael Braasch, professor of electrical engineering and computer science at Ohio University’s Avionics Engineering Center.

“Simulations help to reduce development costs by reducing the amount of actual flight testing required, but the catch is that the simulation must be high fidelity — that is, sufficiently realistic,” he told LinuxInsider.

“Microsoft’s simulator appears to meet this requirement for camera-based or vision sensors, but it is not yet clear if the simulator accurately depicts very small-scale obstacles such as the thin twigs at the end of tree branches,” Braasch added. “Such obstacles are nearly invisible — even with HD cameras and even at close distances. It is also unclear if Microsoft’s simulator supports non-camera-based sensors such as LIDAR and radar.”

Learning to Fly

Although it targets the development of autonomous drones, Microsoft's technology could find applications with human operators as well. Consumer drones have been steady sellers in the past few years, but newcomers are likely to experience a crash or two. Practicing in a simulator could spare pilots some of those early mishaps.

“First, it isn’t easy to fly a drone,” said Michael Blades, senior industry analyst at Frost & Sullivan.