Yahoo has inked a deal with Carnegie Mellon University to test machine-learning research, new mobile interfaces, and natural-language recognition on search and other real-time data. Dubbed Project InMind, the five-year partnership is worth $10 million and will see Yahoo Labs set up a new fellowship program at the university, while CMU students can dig into Yahoo’s data to see how online systems can better predict and cater to user needs and intentions.
That could include search results that are automatically personalized to the user’s interests, without the user needing to spell out exactly what those interests are. “We hope to speed up the pace of mobile and personalization research,” Dr. Ron Brachman, head of Yahoo Labs, said of the partnership, “and create a better user experience.”
The initial user group will comprise Carnegie Mellon students and faculty, participation will be opt-in, and participants’ data will be fed in real time to the research partners.
At the heart of Project InMind is what’s being dubbed a “living lab” mobile toolkit, which will use and refine machine-learning algorithms to predict what people are actually looking for when they search or pick up their mobile devices. That has been an area of keen interest for many companies over the past few years, as smartphones become more capable but also risk overwhelming users with too much data.
So far, Google has arguably led the game on that front, tapping into contextual information with Google Now to filter out what users are likely to want to see at any one time – like imminent travel details – and offer it to them proactively, rather than waiting for them to go hunting for it.
However, Yahoo has irons in that fire as well. The company has made several acquisitions in the context space over the past few months, including intelligent Android homescreen replacement Aviate last month. In December, it bought NLP startup SkyPhrase, and there are rumors that the company is working on its own Siri-style virtual assistant.