Forget Embarrassment, I'd Wear Google's AR Glasses

I'm a geek, an early adopter and a lover of science fiction; I also have relatively little shame: of course I'm the ideal target audience for Google Glasses. If the rumors are to be believed, Google's wily engineers have used their "20-percent time" to cook up some Android-powered digital goggles, overlaying augmented-reality data onto the real-world view. The first generation is likely to be oversized and expensive, but I'll probably still buy a pair and wear them with pride. Here's why, and what I think Google needs to do if the glasses are to succeed.

Picture the scene: I'm at a glamorous MWC industry garden party. The team from The Verge are sitting in a clandestine huddle, muttering about a brave new world of blogging. The Gizmodo team are loitering near the bar tent, keeping an eye out for any fallen phones they can return to Lost Property. The rest of the tech community has accidentally got into a discussion about which presents more of a threat to online journalism, Google Panda or Justin Bieber, and we're about three margaritas away from a full-on fist fight.

Problem is, a lot of these people I've never met in person. Sure, we've traded links on Twitter, and they might be in my Google+ circles or even on my friends list on Facebook, but it can be difficult matching a real-life face to a profile picture. I could be standing next to someone I "talk" with online several times a day without even realizing it. Sure, I could just introduce myself, but the geeky way forward would surely be to have my wearable computer identify each person so that I don't look entirely socially inept.

The well-rumored Google Glasses project sounds like it might tick some of those boxes. According to the industry chatter, Google is basically fitting an Android smartphone into a pair of chunky eyewear, with discreet integrated displays in the lenses, wireless connectivity, gesture-based control and various augmented reality apps. Something a lot more discreet – and cool – than Kopin's Golden-i, which I made a fool of myself wearing back at MWC 2010.

[aquote]AR glasses aren't exactly a new concept[/aquote]

The strange thing is, I started writing about my geek-glasses dreams before news of Google Glasses broke. In fact, I've wanted something similar for years; I mean, AR glasses aren't exactly a new concept – just go to the William Gibson section of your bookshelves (I'm assuming you have one, right?). It sounds like Google is on the right track, especially if the search company can stick to the rumored sub-$600 price tag, but it's not quite there yet. Here's what I'm thinking...

The core of the system has to be the displays. Google is tipped to be creating a 960 x 540 floating virtual screen using a pair of micro-LCD projectors, one in each side of the frame. They'll project onto a pair of angled panels in the lenses themselves. We've seen something similar from Lumus at CES last month, though it was delivering 720p (1280 x 720) instead. The higher the resolution the better, obviously – especially when you're looking at the virtual equivalent of an 87-inch panel – though there are inevitable compromises to be made over how chunky your eyewear then becomes. Still, I wear reasonably chunky retro black plastic frames now, so I may as well go the whole hog.

Lumus DK-32 Wearable Display:
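
To put that "virtual equivalent of an 87-inch panel" claim in perspective, here's a back-of-the-envelope calculation. The 3-meter virtual viewing distance is my own assumption rather than anything Google or Lumus has confirmed, so treat the numbers as rough:

```python
import math

# Rough angular-resolution math for a virtual 16:9 "87-inch" screen.
# The 3 m viewing distance is an assumption, not a confirmed spec.
DIAGONAL_IN = 87
DISTANCE_M = 3.0
ASPECT = 16 / 9

diagonal_m = DIAGONAL_IN * 0.0254
height_m = diagonal_m / math.sqrt(1 + ASPECT ** 2)
width_m = height_m * ASPECT

# Horizontal angle the virtual screen would subtend at the eye
fov_deg = 2 * math.degrees(math.atan((width_m / 2) / DISTANCE_M))

for label, horizontal_px in [("960 x 540 (rumored Google spec)", 960),
                             ("1280 x 720 (Lumus demo)", 1280)]:
    print(f"{label}: ~{horizontal_px / fov_deg:.0f} pixels per degree "
          f"over a ~{fov_deg:.0f}-degree-wide virtual screen")
```

On the usual rule of thumb of roughly 60 pixels per degree for 20/20 vision, even the 720p version comes out well short, which is exactly why I'd take every extra pixel the optics can manage.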

Control shouldn't make me feel too embarrassed, and I'm not sure Google's hand-tracking gestures are the best way of handling it. Waving my arms around in the street is likely to get me the wrong sort of attention. Better, perhaps, to use eye-tracking and voice recognition, plus maybe an ad-hoc link with my smartphone to turn it into a more precise trackpad and to allow for lengthier text entry. Similarly, if I'm in front of a computer or TV, the option to temporarily pair up and use whatever full-sized peripherals are to hand would make sense. Bluetooth 4.0 promises low power consumption and just that sort of impromptu connectivity.
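
To be clear about what I mean by "whatever peripherals are to hand", here's a toy sketch of the fallback logic I'd want. The device names are hypothetical placeholders of my own, not any real pairing API:

```python
# Toy sketch: borrow the most precise peripheral currently in range,
# otherwise fall back to the glasses' own on-board controls.
PERIPHERAL_PRIORITY = [
    "desktop_keyboard_and_mouse",  # sat at a computer: borrow the full-size kit
    "tv_remote",                   # on the sofa: reuse what's already to hand
    "smartphone_trackpad",         # phone in pocket becomes a trackpad over BT 4.0
]

def choose_input(in_range: set) -> str:
    """Pick the best control scheme given which peripherals are in range."""
    for peripheral in PERIPHERAL_PRIORITY:
        if peripheral in in_range:
            return peripheral
    # Nothing paired nearby: drop back to built-in, hands-free control
    return "eye_tracking_plus_voice"

print(choose_input({"smartphone_trackpad"}))  # smartphone_trackpad
print(choose_input(set()))                    # eye_tracking_plus_voice
```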

Software is also important, especially if I want my geek glasses to recognize people for me. That shouldn't be too difficult, although facial recognition can always be touch-and-go. Still, Canon managed to equip a sub-$300 point-and-shoot camera with face recognition running entirely locally, so factor in the power of Google's cloud servers (and its ability to pull in comparison photos from Google+, Facebook, Twitter, your Flickr and Picasa accounts, etc.) and you have a huge amount of data and the processing grunt to crunch through it. Scalado's "intelligent picture" vision could be an interesting addition too:

Scalado Vision concept:
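
As a rough sketch of the split I have in mind – cheap face detection on the device, heavyweight matching in the cloud – something like this would do. The OpenCV detector is real; the cloud-matching function and the test image are placeholders of my own, since no such Google endpoint has been announced:

```python
import cv2  # OpenCV: pip install opencv-python

def detect_faces(frame):
    """Cheap local step: find face regions in a camera frame, compact-camera style."""
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    return cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

def match_face_in_cloud(face_crop):
    """Placeholder for the heavy lifting: upload the crop and let server-side
    recognition compare it against Google+/Facebook/Flickr/Picasa photos.
    No such public API exists; this just stands in for the idea."""
    return {"name": "Unknown", "confidence": 0.0}

def label_people(frame):
    """Glue the two halves together: local detection, cloud identification."""
    results = []
    for (x, y, w, h) in detect_faces(frame):
        crop = frame[y:y + h, x:x + w]
        results.append((match_face_in_cloud(crop), (x, y, w, h)))
    return results

if __name__ == "__main__":
    frame = cv2.imread("garden_party.jpg")  # hypothetical test shot
    if frame is not None:
        for person, box in label_people(frame):
            print(person["name"], box)
```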

Reliance on the cloud means a persistent connection is necessary, preferably high-speed if I'm not going to hang around with my hand outstretched saying "nice to see you, Mr ...." 4G is the obvious answer, though that comes with battery implications; in fact, I'm thinking that could be Google's biggest challenge in wearables. The company apparently doesn't expect most people to wear and use the glasses all the time, instead dipping in and out of AR periodically, much as we use our current smartphones, but I think that once people get a glimpse of what's possible they're not going to want to take them off.

So, that means easy recharging is needed, and inductive wireless power is probably the easiest method. My regular glasses sit on my nightstand doing nothing while I sleep, the lazy things; how much better if they recharged and even pre-cached data selected contextually, based on what I'm likely to be doing the following day. Google's mashup possibilities with our Gmail, Calendar, Google+ and other data are the stuff of privacy watchdogs' nightmares, but I'd be happy for the search company to check the overlaps if it cut down on how much bandwidth I might need for my wearable computer the next day.
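
Here's a crude illustration of the sort of overnight pre-caching I'm picturing. The calendar entries and prefetch targets are invented for the example; a real build would obviously pull from the Calendar, Gmail and Google+ APIs instead:

```python
from datetime import date, timedelta

# Toy sketch of overnight pre-caching while the glasses sit on an inductive
# charger. Calendar data and prefetch targets are invented for illustration.
TOMORROWS_EVENTS = [
    {"title": "MWC press breakfast", "location": "Fira de Barcelona",
     "attendees": ["editor@example.com", "pr.contact@example.com"]},
    {"title": "Flight home", "location": "BCN Airport", "attendees": []},
]

def plan_precache(events):
    """Work out what to fetch on cheap Wi-Fi tonight instead of 4G tomorrow."""
    jobs = []
    for event in events:
        jobs.append(("offline_map", event["location"]))
        for person in event["attendees"]:
            jobs.append(("profile_photos", person))   # for face matching
            jobs.append(("recent_updates", person))   # conversation fodder
    return jobs

if __name__ == "__main__":
    print(f"Pre-caching for {date.today() + timedelta(days=1)}:")
    for kind, target in plan_precache(TOMORROWS_EVENTS):
        print(f"  {kind}: {target}")
```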

[aquote]In the end, context is going to be most important for any type of wearable electronics[/aquote]

In the end, it's context that's going to be most important for any type of wearable electronics. The safety implications of having a floating display in front of you all the time are obvious – we're bad enough when periodically glancing at a phone – and there are occasions when you simply don't want to be bombarded with the latest tweets, Facebook pokes, emails and RSS bleating. The more personal our computing gets, the better the filter needs to be. I want the most important messages, information and alerts, not everything possibly relevant crowding into my field of view.
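
A crude way to picture that filter: every incoming item gets a score, and my current situation sets how high the bar is before anything floats into view. The weights, contexts and thresholds below are numbers I've made up purely to illustrate the idea:

```python
# Toy context filter: invented weights and thresholds, just to show the shape.
SOURCE_WEIGHT = {"call": 0.9, "sms": 0.7, "email": 0.5, "tweet": 0.2, "rss": 0.1}
CONTEXT_THRESHOLD = {"driving": 0.85, "in_meeting": 0.7, "walking": 0.5, "idle": 0.2}

def should_display(item: dict, context: str) -> bool:
    """Show an alert only if it clears the bar for the wearer's current context."""
    score = SOURCE_WEIGHT.get(item["source"], 0.1)
    if item.get("from_starred_contact"):
        score += 0.3  # people I actually care about jump the queue
    return score >= CONTEXT_THRESHOLD.get(context, 0.5)

incoming = [
    {"source": "call", "from_starred_contact": True},
    {"source": "tweet", "from_starred_contact": False},
]
for item in incoming:
    print(item["source"], "->", "show" if should_display(item, "driving") else "hold")
```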

That's what I'll be most curious to see if Google's engineers have achieved: not the hardware, but whether the interface and functionality take into account the change in how people will use, interact with and rely upon it. An Android phone you carry on your face isn't special. The next generation of personal digital assistant definitely could be.