Non-invasive brain control over robotic limbs, computers, and other technology is one step closer, with a new project that allows full navigation of a Parrot AR.Drone simply by thinking about it. The research, the handiwork of a biomedical engineering team at the University of Minnesota in Minneapolis, and published this week in the Journal of Neural Engineering, pairs Parrot’s WiFi-connected quadricopter with an EEG headset that measures brain activity through the scalp. By imagining different gestures and movements, the pilot can control the drone without moving a muscle.
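To make the control scheme concrete, here is a minimal sketch of how a classifier's motor-imagery output might be translated into drone movement commands. The class labels, the specific mapping, and the command ranges are illustrative assumptions for this sketch, not the Minnesota team's actual code.

```python
# Hypothetical mapping from classified motor-imagery labels to drone
# movement commands. Each imagined movement maps to a
# (roll, pitch, vertical_speed) tuple in a normalized range [-1.0, 1.0].
# Labels and values here are assumptions for illustration only.
COMMAND_MAP = {
    "left_hand": (-0.2, 0.0, 0.0),   # imagine clenching left fist -> bank left
    "right_hand": (0.2, 0.0, 0.0),   # imagine clenching right fist -> bank right
    "both_hands": (0.0, 0.0, 0.3),   # imagine clenching both fists -> climb
    "rest": (0.0, 0.0, 0.0),         # no imagined movement -> hover in place
}

def command_for(imagined_movement: str) -> tuple:
    """Translate a motor-imagery class label into a drone command tuple.

    Unrecognized labels fall back to hovering, a conservative default
    for a brain-computer interface where misclassification is common.
    """
    return COMMAND_MAP.get(imagined_movement, COMMAND_MAP["rest"])
```

The fallback-to-hover design choice reflects a general principle in BCI control: when the signal is ambiguous, the safest output is no movement at all.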
Brain-tracking technology isn’t new, but the best results have generally been reserved for methods that use implanted electrodes, which require complex and expensive surgery. Bin He and his team at the University of Minnesota, however, have managed to bring an EEG-based alternative up to a similar level of accuracy.
Previous research had seen a remote-controlled helicopter steered left or right by brain signals. In He’s research, altitude control is also managed by brain commands, and the overall precision of the system is far greater: the AR.Drone can now be navigated through hoops in mid-air, for instance.
Despite the improvements, however, not everybody is up to taking the virtual reins. Initial screening – which required moving an on-screen cursor by imagining, for instance, a clenched fist – found that some users simply couldn’t produce a signal distinct enough for the computer to track. Those who could were eventually put in charge of the quadricopter itself, navigating via a live stream from its nose-mounted camera.
The experience is undoubtedly slower than traditional hands-on controls, He concedes, but could make a big experiential difference to those with limited or no mobility. A combination of an intelligent drone like Parrot’s – which uses onboard gyroscopes to take on the core flying skills, such as keeping stable and level – and a WiFi-connected terminal could allow those in wheelchairs to explore areas they would otherwise be prevented from visiting.
However, the finer-grained detail pulled from the EEG headset also has potential applications in controlling robotic limbs. That’s a similar approach to the one taken by the MindWalker team, which has been using non-invasive brain scanning to control an exoskeleton for assisted walking.
Before the technology can become widespread, however, more information will need to be extracted from the headset. Robotic arms, for instance, have many degrees of freedom, and thus demand even finer control than, say, the preprogrammed steps of a walking system.