Siri obeys real-life mind control

This week over at Honda R&D Americas, senior iOS architect Duane Cash is showing off a new brainwave control device and a custom Siri API working with the iPhone in the real world. The initial test shows Cash using a wave of his hand, along with the EEG signals his brain generates, to start up the app and activate several functions on the device. This isn't the sort of thing you'll be able to run out and buy right this minute, but given the relative simplicity of the build – so to speak – we might see 3rd party developers (Honda R&D Americas among them) bringing solutions like this to market before long.

What you're seeing here is the iPhone displaying a map, opening a menu, and closing a menu. It's done by reading the EEG signals coming from Cash's brain and interpreting them with custom developer code tied into Siri on the iPhone. With a setup like this, we could potentially trigger anything Siri can handle simply by thinking of it.
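For the curious, here's a rough idea of how a rig like this could be wired up in code. The sketch below is purely illustrative: the `EEGHeadset` delegate, the `EEGCommand` cases, and the action phrases are hypothetical stand-ins we've made up for the sake of the example – this is not Cash's actual code, and Siri has no public API for custom commands, so the last step just logs the phrase that would be issued.

```swift
import Foundation

// Hypothetical brain events a headset SDK might report after classifying EEG signals.
enum EEGCommand {
    case focusPush   // a deliberate "push" thought
    case blinkTwice  // an eye-blink artifact used as a trigger
}

// Hypothetical callback interface for such a headset.
protocol EEGHeadsetDelegate: AnyObject {
    func headset(didDetect command: EEGCommand)
}

// Bridge that maps detected brain events to the same phrases a user might speak.
final class ThoughtCommandBridge: EEGHeadsetDelegate {
    func headset(didDetect command: EEGCommand) {
        switch command {
        case .focusPush:
            issue(phrase: "Show me a map")    // bring up the map view
        case .blinkTwice:
            issue(phrase: "Close the menu")   // dismiss the current menu
        }
    }

    private func issue(phrase: String) {
        // In the demo, a step like this hands off to custom Siri-facing code;
        // here it simply logs the command that would be sent.
        print("Issuing command: \(phrase)")
    }
}
```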

Of course it's not all that simple, and just getting this far in the project was no cakewalk. Mind control used in this manner is still quite a few years away from doing anything that wouldn't be easier with a couple of taps or a voice command. In the future, though, we might be thinking novels into existence and creating media masterpieces without lifting a finger. We shall see!

ALSO NOTE: Though Cash works for Honda R&D now, he may be doing additional work on this project as his own independent master's project. Take a look at his LinkedIn profile and see what you make of it.

Then have a peek at a set of recent posts surrounding Siri and user attempts to make this ultra-popular assistant part of a greater control-everything universe. With its implementation of Siri across the iPhone and iPad line, Apple has over the past couple of years changed the way we see smart device control – we're now at a point where even speaking commands doesn't feel easy enough. We just want to think them.

[via Andrew Lim]