A software company CEO who is trying to train drones to think like pilots promises he is not producing a cadre of mutinous rebel aircraft. He just wants to prevent collisions between drones and human-piloted airplanes.
Last summer, we told you about an Air Force program that seeks algorithms that would allow drones to predict other airplanes’ intent. This is ostensibly to avert disaster — if a drone can anticipate a pilot’s next maneuver, it will know to get out of the way.
It seems logical — think of driving on the highway, and how you can sometimes just feel when the guy in the SUV next to you is about to merge into your lane.
Dick Stottler’s company, Stottler Henke Associates, is working on a software package that can help drones learn how to anticipate other aircraft’s actions. The firm recently won a $100,000 Air Force contract to develop its Intelligent Pilot Intent Analysis System, which models pilots’ behavior in real and predicted scenarios, according to Danger Room.
The program models how piloted airplanes take off, maneuver and land, incorporating information from air traffic control and runway-specific data, Danger Room reports. That information will help the drones build profiles of human pilots, telling the drones how to react when two planes get too close. The algorithm can even handle a damaged or otherwise troubled aircraft, whose behavior might deviate from the normal models.
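The kind of behavioral matching described above can be loosely sketched in code. This is a toy illustration only, not Stottler Henke's actual system: all the profile names, rates, and the matching logic below are hypothetical, standing in for the far richer models the article describes.

```python
# Toy sketch of intent prediction by profile matching (hypothetical;
# not the Intelligent Pilot Intent Analysis System itself).

# Candidate maneuver "profiles": expected heading change in degrees per second.
MANEUVER_PROFILES = {
    "straight_and_level": 0.0,
    "standard_left_turn": -3.0,   # a standard-rate turn is 3 deg/sec
    "standard_right_turn": 3.0,
}

def predict_intent(heading_samples):
    """Guess the pilot's current maneuver from recent heading samples
    (degrees, one per second) by matching the observed turn rate to
    the closest profile."""
    if len(heading_samples) < 2:
        return "straight_and_level"
    # Average observed heading change per second.
    deltas = [b - a for a, b in zip(heading_samples, heading_samples[1:])]
    rate = sum(deltas) / len(deltas)
    # Pick the profile whose expected rate best matches what we observed.
    return min(MANEUVER_PROFILES, key=lambda m: abs(MANEUVER_PROFILES[m] - rate))

print(predict_intent([90, 93, 96, 99]))  # heading drifting right ~3 deg/sec
```

A drone running something in this spirit, but fed air traffic control and runway data as well, could conclude the neighboring plane is banking right and steer clear.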
But Stottler acknowledges the algorithm can’t predict errant pilot behavior. What about errant drone behavior? Is Stottler teaching drones to disobey their human masters?
“No, I am not,” Stottler promises.
It is not clear whether the aircraft will become self-aware and figure this out for themselves. If so, at least we can rely upon the rogue drone corps.