Landing airplanes on moving ships is no mean feat, and it will be even harder when the airplanes are unmanned. Along with making their own decisions, autonomous airplanes will have to heed their human counterparts during aircraft carrier takeoff and landing. But can a robot read and understand arm-waving signals?
The problem is complicated in at least two ways. First, the airplane must determine whether the human's hands are up or down and the elbows in or out. Second, it has to read which gesture the human is making, and what it means. MIT PhD student Yale Song is trying to solve both problems.
Song and fellow scientists recorded various people performing a set of 24 gestures that aircraft carrier deck personnel use, including arm waving and folding, and hand movements. They built software that determined each person's elbow, wrist and hand positions, including whether palms were open and whether thumbs were up or down. They completed that portion of the research last year. Then the team had to classify all these gestures according to their meanings. But this is complicated, because deck signals are a complex ballet of movement: a seaman doesn't make one motion and then stop for a beat before starting another. So the algorithm has to determine a gesture's meaning without a clear beginning, middle and end.
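To picture the pose-estimation step, here is a minimal sketch in Python of the kind of per-frame body-pose record such software might produce. The class and field names are hypothetical; the article does not describe the team's actual data structures.

```python
from dataclasses import dataclass
from typing import Tuple

# Hypothetical per-frame pose record, loosely following the article's description:
# elbow, wrist and hand positions, whether the palm is open, and thumb direction.
@dataclass
class ArmPose:
    elbow: Tuple[float, float, float]   # estimated 3-D position from the video
    wrist: Tuple[float, float, float]
    hand: Tuple[float, float, float]
    palm_open: bool                     # open palm vs. closed fist
    thumb_up: bool                      # thumb pointing up vs. down

@dataclass
class FramePose:
    left: ArmPose
    right: ArmPose
    timestamp: float                    # seconds into the video
```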

The team accomplished this by breaking the videos down into short, overlapping clusters of frames. As MIT News puts it: “The second sequence might start at, say, frame 10 of the first sequence, the third sequence at frame 10 of the second, and so on.” This way, the algorithm can calculate the probability that a given sequence belongs to one of the 24 gestures in the catalog. In the researchers' tests, the algorithm correctly identified the hand-signal sequences 76 percent of the time.
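As a rough illustration of that overlapping-window scheme (not the team's actual code; the window length, stride and classifier here are assumptions), the sketch below slices a stream of pose frames into windows that each begin partway through the previous one and asks a classifier for a probability over the 24 catalogued gestures:

```python
from typing import Callable, List, Sequence

NUM_GESTURES = 24  # size of the deck-handling gesture catalog

def classify_stream(
    frames: Sequence["FramePose"],
    classify_window: Callable[[Sequence["FramePose"]], List[float]],
    window_len: int = 20,   # assumed window length, in frames
    stride: int = 10,       # each window starts at frame 10 of the previous one
) -> List[int]:
    """Slide overlapping windows over the frame stream and label each window.

    classify_window is assumed to return one probability per catalogued
    gesture; we keep the most likely gesture for each window.
    """
    labels: List[int] = []
    for start in range(0, max(1, len(frames) - window_len + 1), stride):
        window = frames[start:start + window_len]
        probs = classify_window(window)  # length-24 probability vector
        labels.append(max(range(NUM_GESTURES), key=lambda g: probs[g]))
    return labels
```

Because every stretch of video falls inside several overlapping windows, the classifier never has to be told where one gesture ends and the next begins.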
Song believes this can be improved by tweaking the computations, according to MIT. The work appears in the journal ACM Transactions on Interactive Intelligent Systems.
[MIT News]


Why is this necessary? Why can't the people coordinating things just drive it with a joystick?
This is designed to simplify the process. The hand signals have already been developed and each has a specific purpose and meaning, so why would you want to use a joystick and issue the commands in a different format?
Furthermore, I believe they are taking a step forward and looking to the future. If reliable software can be created for this application, what is to stop it from being used in others? Whether or not this will catch on or succeed with such efficiency remains to be seen.
The hand signals are there because the pilot in the aircraft isn't going to be watching everything happening on the flight deck, and trust me, it can change at a second's notice.
The pilot is watching the PC (Plane Captain) for commands on where to move the aircraft, how fast to move, whether the wings need to be folded or not, if he should drop the hook or do a flight-controls test, and so many other things. Even something like turning on the engine, since the pilot can't see under the aircraft.
If the UAV/robots can take those same commands, it means that the PCs don't need to relearn all their command signs, and the UAV can integrate into the flight deck's movements smoothly.
Proud Sailor of the USN
Alas, the lack of an edit button bites me again. I would like to see how the system would handle a handoff from one PC/Director to another.
Proud Sailor of the USN
Wouldn't Kinect be easier to work with?
Sounds very complicated. Imagine if a strong gust of wind blows and the aircraft conductor needs to lurch to brace himself; sometimes these guys drop to a knee. Would that cause the plane to lurch along with his movement? And vice versa with the plane's movement. Pretty complex!
Innovation with Nintendo Wii technology. Just be sure not to sneeze.
A computer is not a human being. Tasks that are very simple for us are very difficult for a computer, especially if the possibility exists for ambiguity. Eventually a computer could be created to handle it, but I think it would be easier for humans to adapt to the robots and not the other way around. There will be unforeseen situations where the computer will misinterpret the signals or situation. Computers don't have common sense, and every possibility has to be programmed in. That is difficult to predict.
a buck says this dude's movin to china after he's finished with his degree...
so any hacker with a Kinect (read: anyone that makes the Kinect worth it) will be able to control these UAVs? great. as if it wasn't scary enough that we already had drones in our skies.