MIT, Yale create gesture-controlled drones

When it comes to automating aircraft controls, significant progress has been made in both scientific and military circles. An unmanned aircraft can already take off, fly, and touch down on its own. But when it comes to actually moving around on the ground, humans still need direct control. That may not be the case for much longer.

When an aircraft is taxiing on the ground, military personnel use a very strict set of hand gestures to tell the pilot where to move and to issue other commands, such as opening weapon bays or cutting the engines. Scientists at the Massachusetts Institute of Technology and Yale studied these universally accepted hand gestures to see whether a computer could recognize them without error.

The scientists created a program that records body, arm, wrist, and hand/finger positions. In the earliest trials, it recognized the correct command 76% of the time. That obviously wouldn't cut it in any real-world application, but it's a pretty darn good starting point. Ultimately, military officials and researchers believe that signaling and interpreting hand gestures, currently a human process, will one day be handled routinely by computers.
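For readers curious what that kind of pipeline might look like, here's a minimal sketch in Python: pose measurements flattened into feature vectors and fed to an off-the-shelf classifier. The feature layout, gesture labels, and synthetic data are placeholders for illustration only, not the researchers' actual method.

```python
# Minimal sketch (not the MIT/Yale system): map recorded pose features
# to taxiing commands with a standard classifier. Labels and feature
# dimensions are hypothetical placeholders.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Pretend each sample is a flattened vector of body, arm, wrist, and
# hand/finger joint positions captured for one gesture.
N_SAMPLES, N_FEATURES = 300, 48
GESTURES = ["move_forward", "turn_left", "turn_right", "cut_engines"]  # hypothetical labels

X = rng.normal(size=(N_SAMPLES, N_FEATURES))     # synthetic pose features
y = rng.integers(len(GESTURES), size=N_SAMPLES)  # synthetic gesture labels

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Standardize the features, then classify with a support-vector machine.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X_train, y_train)

accuracy = clf.score(X_test, y_test)
print(f"recognized {accuracy:.0%} of test gestures correctly")
```

With real labeled pose recordings in place of the random arrays, the same structure (feature extraction, scaling, classification) is the standard way to score how often a system picks the right command.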

[via New Scientist]