Air Force photo

Any pilot will tell you that flying is the easy part; it's landing that's hard. That adage is especially true for robotic planes like the Predator and Reaper drones. While the UAVs can follow a pre-programmed flight path, they still need a human to bring them safely down to the tarmac. And that means a lot of UAVs crash due to human error.

Well, according to Danger Room (friends of the show), that’s about to change. Image analysis company 2d3 is developing new software that will finally allow the UAVs to land themselves.

The program, called the Visually Assisted Landing System (VALS), lets the drones use their cameras to identify landmarks, adjust speed and direction accordingly, and navigate to a smooth landing. And since runways are clearly defined, flat, obvious pieces of topography, identifying them should be easier than, say, distinguishing between some Taliban in a Toyota pickup and a bus full of deaf nuns.
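In rough terms, a camera-guided approach like that boils down to a control loop: spot the runway in the frame, steer toward it, and hold a glide slope down to a flare just above the ground. Here's a minimal, hypothetical sketch of that loop; none of these function names or numbers come from 2d3's actual system.

```python
import math

# Hypothetical sketch of a vision-assisted landing loop (NOT 2d3's VALS code).
# Assumes some detector has already found the runway and reports its
# horizontal offset from the center of the camera frame, in pixels.

def steering_correction(px_offset, frame_width, max_turn_deg=5.0):
    """Proportional heading correction from the runway's offset in the image."""
    # Normalize offset to [-1, 1], then scale to a bounded turn command.
    normalized = px_offset / (frame_width / 2)
    return max(-max_turn_deg, min(max_turn_deg, normalized * max_turn_deg))

def descent_rate(altitude_m, glide_slope_deg=3.0, speed_ms=30.0):
    """Target sink rate (m/s) for a standard glide slope at the given
    ground speed, tapering off (a crude flare) below 10 m."""
    rate = speed_ms * math.tan(math.radians(glide_slope_deg))
    if altitude_m < 10.0:
        rate *= altitude_m / 10.0  # slow the descent as the ground nears
    return rate

# Runway threshold detected 80 px right of center in a 640 px wide frame:
print(steering_correction(80, 640))  # small right turn, in degrees
print(descent_rate(100.0))           # nominal sink rate on the glide slope
```

The point of the toy numbers: the correction is bounded so a noisy detection can't command a violent maneuver, and the sink rate tapers near the ground the way a human pilot flares.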

Navigating three-dimensional space is a specialty of 2d3, whose flagship product, boujou, creates computerized 3-D environments based on footage shot by movie cameras. A short animated video of how the system works can be seen on the 2d3 website.

The best part about VALS is the weight. Unlike some other automated landing systems, VALS is only a software upgrade, not a bulky physical addition to the aircraft. By utilizing the drones' existing cameras, the system can be used both for larger UAVs like the Predator and for smaller drones like the ScanEagle.

With takeoff and landing fully automated, that's one less thing for human pilots to do. But as long as no one designs a robot that plays beach volleyball, there will still be something only a flesh-and-blood flyboy can do.

[via Danger Room]