Intel's Research Labs often turn out an interesting project or two, and given how CES 2010 seemed in part obsessed with network-connected gadgets you'd carry with you pretty much all of the time, the company's Everyday Sensing and Perception group seems particularly topical. The team are looking at context-aware computing: a system that combines location (both geographic and symbolic, e.g. a clothes store), activity (both physical and object-based, e.g. watching TV) and social interaction (who the user is talking with, and in what role), and aims to "infer a user's context with 90-percent accuracy over 90-percent of their day."
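Fusing those three signal types into a single context label can be pictured as a rule-based classifier; a toy sketch follows, where all the labels and rules are our own illustrative assumptions rather than Intel's actual model:

```python
# Toy sketch of context inference: combine symbolic location, activity,
# and social-interaction signals into one coarse context label.
# Every label and rule here is an illustrative assumption.

def infer_context(location, activity, companions):
    """Return a coarse context label from three sensed signals."""
    if location == "clothes store" and companions:
        return "shopping with friends"
    if location == "home" and activity == "watching TV":
        return "relaxing at home"
    if companions and activity == "talking":
        return "in a meeting" if location == "office" else "socializing"
    return "unknown"

print(infer_context("home", "watching TV", []))          # relaxing at home
print(infer_context("office", "talking", ["coworker"]))  # in a meeting
```

A real system would of course replace these hand-written rules with learned models over noisy sensor streams, which is where the 90/90 accuracy target gets hard.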
To do that, Intel reckon you'll need a whole load of sensors, and happily they've got teams working on those too. That includes RFID tags, accelerometers and radios, together with video cameras and microphones, the data from which gets intelligently processed; for instance, their Federated Perception project can combine the input from two different cameras to better recognize objects and build a system-level understanding of how complex environments are interlinked.
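The multi-camera idea can be sketched as a simple detection-fusion step: merge each camera's object scores and boost anything both views agree on. The data shapes and the boost heuristic are our assumptions, not details of Intel's Federated Perception system:

```python
# Toy sketch of multi-camera detection fusion: each camera reports a
# hypothetical {object_label: confidence} dict; objects corroborated by
# both views get a confidence boost. Purely illustrative, not Intel's method.

def fuse_detections(cam_a, cam_b, boost=0.15):
    """Merge two cameras' detections, rewarding agreement between views."""
    fused = {}
    for label in set(cam_a) | set(cam_b):
        scores = [cam[label] for cam in (cam_a, cam_b) if label in cam]
        score = max(scores)
        if len(scores) == 2:  # object seen by both cameras
            score = min(1.0, score + boost)
        fused[label] = round(score, 2)
    return fused

result = fuse_detections({"mug": 0.6, "phone": 0.5}, {"mug": 0.7})
print(result["mug"])    # 0.85 -- corroborated, so boosted
print(result["phone"])  # 0.5  -- seen by one camera only
```

The design choice worth noting is that agreement across viewpoints, not raw per-camera confidence, is what drives the final score up, which mirrors the article's point about combining cameras to recognize objects better.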
As for consuming the data such sensing/perception systems would output, Intel are looking into using projectors - possibly like the wearable "sixth sense" project - together with haptic feedback and other non-traditional ways mobile devices could communicate with their users. Nothing headed for shelves any time soon, but interesting reading all the same; we guess this sort of stuff is what Intel mean when they talk about industry-leading product innovation.