Training drones to fly fast, even around the simplest obstacles, is a crash-prone exercise that can have engineers repairing or replacing vehicles with frustrating regularity.
Now MIT engineers have developed a new virtual-reality training system for drones that enables a vehicle to "see" a rich, virtual environment while flying in an empty physical space.
The system, which the team has dubbed "Flight Goggles," could significantly reduce the number of crashes that drones experience in actual training sessions. It can also serve as a virtual test bed for any number of environments and conditions in which researchers might want to train fast-flying drones.
The work is motivated by a new, extreme robo-sport: competitive drone racing, in which remote-controlled drones, driven by human players, attempt to out-fly each other through an intricate maze of windows, doors, and other obstacles.
At present, training autonomous drones is a physical task: Researchers fly drones in large, enclosed testing grounds, where they often hang large nets to catch any careening vehicles. They also set up props, such as windows and doors, through which a drone can learn to fly. When vehicles crash, they must be repaired or replaced, delaying development and adding to a project's cost.
This kind of testing can work for vehicles that are not meant to fly fast, such as drones programmed to slowly map their surroundings. But fast-flying vehicles must process visual information quickly as they move through an environment, and such vehicles require a different kind of training system.
The image-rendering system draws up photorealistic scenes, such as a loft apartment or a living room, and beams these virtual images to the drone as it flies through the empty facility.
The drone processes the virtual images at a rate of about 90 frames per second, around three times as fast as the human eye can see and process images. To enable this, the team custom-built a circuit board that integrates a powerful embedded supercomputer, along with an inertial measurement unit and a camera. They fit all this hardware into a small, 3-D-printed nylon and carbon-fiber-reinforced drone frame.
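To give a sense of what a 90-frames-per-second rendering rate implies, the sketch below computes the per-frame time budget and runs a simple fixed-rate loop. The frame rate is from the article; the loop structure and the `render_frame` callback are hypothetical, not the team's actual software.

```python
import time

FPS = 90                   # rendering rate reported for the system
FRAME_BUDGET = 1.0 / FPS   # roughly 11.1 ms to render and beam each frame

def run_render_loop(render_frame, num_frames):
    """Call a (hypothetical) per-frame renderer at a fixed rate.

    After each frame, sleep off whatever remains of the ~11 ms budget,
    so the loop holds a steady cadence even if rendering is fast.
    """
    deadline = time.perf_counter()
    for i in range(num_frames):
        render_frame(i)                      # draw and transmit frame i
        deadline += FRAME_BUDGET             # next frame's deadline
        slack = deadline - time.perf_counter()
        if slack > 0:                        # ahead of schedule: wait
            time.sleep(slack)

print(f"Per-frame budget at {FPS} fps: {FRAME_BUDGET * 1000:.1f} ms")
```

Every step of rendering, transmission, and onboard processing has to fit inside that roughly 11-millisecond window for the drone to react in time at racing speeds.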
The researchers carried out a set of experiments, including one in which the drone learned to fly through a virtual window about twice its size. The window was set within a virtual living room. As the drone flew in the actual, empty testing facility, the researchers beamed images of the living-room scene, from the drone's perspective, to the vehicle. As the drone flew through this virtual room, the researchers tuned a navigation algorithm, enabling the drone to learn on the fly.
In a final test, the team set up an actual window in the test facility and turned on the drone's onboard camera, enabling it to see and process its real surroundings. Using the navigation algorithm the researchers had tuned in the virtual system, the drone flew through the real window 119 times over eight flights, crashing or requiring human intervention only six times.
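The reported counts translate directly into an overall success rate. The numbers below come from the article; the calculation itself is simple arithmetic.

```python
successes = 119              # successful passes through the real window
failures = 6                 # crashes or human interventions
attempts = successes + failures

success_rate = successes / attempts
print(f"{success_rate:.1%} of {attempts} attempts succeeded")
# 95.2% of 125 attempts succeeded
```

In other words, an algorithm tuned entirely against virtual imagery carried over to the real window with roughly a 95 percent success rate.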
The virtual training system is also highly malleable. For example, researchers can pipe in their own scenes or layouts in which to train drones, including detailed, drone-mapped replicas of actual buildings.
The system could also be used to train drones to fly safely around humans.