Facebook working on “deep learning” artificial intelligence to improve news feed

Sep 20, 2013

Facebook is aiming to get inside its users' minds, figuratively speaking, to find out what they mean when posting statuses. The effort comes by way of an artificial intelligence system the social network is building, which MIT Technology Review describes as using simulated networks of brain cells, all of them crunching away at data to perform "deep learning" and helping the system ascertain the emotions and subtleties behind a status.
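The article does not describe Facebook's actual architecture, but the core idea behind "simulated networks of brain cells" can be illustrated with a toy example: layers of artificial neurons whose connection weights are adjusted as data flows through them. The sketch below is a minimal pure-Python network trained on the classic XOR problem; the network size, data, and learning rate are all illustrative assumptions, not anything from Facebook's system.

```python
import math
import random

random.seed(0)

def sigmoid(x):
    """Squashing function: each simulated neuron's activation."""
    return 1.0 / (1.0 + math.exp(-x))

# Toy network: 2 inputs -> 3 hidden neurons -> 1 output.
# The weights are the adjustable "connections" that learning tunes.
W1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(3)]
b1 = [0.0] * 3
W2 = [random.uniform(-1, 1) for _ in range(3)]
b2 = 0.0

# XOR: a pattern a single-layer network cannot learn, but a deeper one can.
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]

def forward(x):
    h = [sigmoid(sum(w * xi for w, xi in zip(W1[j], x)) + b1[j])
         for j in range(3)]
    y = sigmoid(sum(w * hj for w, hj in zip(W2, h)) + b2)
    return h, y

def loss():
    return sum((forward(x)[1] - t) ** 2 for x, t in data)

initial = loss()
lr = 0.5
for _ in range(5000):
    for x, t in data:
        h, y = forward(x)
        # Backpropagation: push the output error back through the layers,
        # nudging each weight in the direction that reduces the error.
        dy = (y - t) * y * (1 - y)
        for j in range(3):
            dh = dy * W2[j] * h[j] * (1 - h[j])  # use W2 before updating it
            W2[j] -= lr * dy * h[j]
            for i in range(2):
                W1[j][i] -= lr * dh * x[i]
            b1[j] -= lr * dh
        b2 -= lr * dy
final = loss()
```

Real deep-learning systems follow the same loop (forward pass, error, weight update) but with millions of neurons and vastly more data, which is what lets them pick up subtle patterns like the emotion behind a sentence.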

The goal of what is being referred to as the deep learning system is to analyze a status and figure out the events or likely emotions behind the words, even when neither is stated outright. The artificial intelligence could also learn to pinpoint what objects appear in photographs and to predict what users may do in the future.

That information comes from Facebook's Chief Technology Officer Mike Schroepfer, who says the project is being worked on by eight individuals collectively known as the AI Team. The deep learning project is new, and, not surprisingly, the social network is remaining tight-lipped about most of its details. But some of the end goals have been revealed, among them a boost to users' news feeds.

In particular, Facebook aims to improve which status updates users see in their news feed, selecting from the daily barrage the ones each user is most likely to want to see. The company already does this using more conventional means, attempting to surface status updates from the friends each user would likely be most interested in, a necessity given the vast number posted every day.
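The article does not disclose how Facebook scores posts, but the conventional approach it alludes to can be sketched as a simple ranking function: each post gets a score combining how close the reader is to the author, what kind of post it is, and how old it is, and the feed is sorted by that score. All weights, categories, and the decay rate below are hypothetical illustrations, not Facebook's real signals.

```python
import math
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    kind: str        # e.g. "photo" or "status" (illustrative categories)
    age_hours: float

# Hypothetical weights -- the real signals and values are not public.
AFFINITY = {"close_friend": 1.0, "acquaintance": 0.3}
KIND_WEIGHT = {"photo": 1.2, "status": 1.0}

def score(post, relationship):
    """Higher for closer friends, richer content, and fresher posts."""
    decay = math.exp(-0.1 * post.age_hours)  # older posts matter less
    return AFFINITY[relationship] * KIND_WEIGHT[post.kind] * decay

def rank_feed(posts_with_rel):
    """Sort (post, relationship) pairs, best candidates first."""
    return sorted(posts_with_rel, key=lambda pr: score(*pr), reverse=True)
```

A deep-learning replacement would learn the scoring function from user behavior instead of hand-tuning weights like these, which is presumably the appeal of the project the article describes.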

Said Schroepfer, "The data set is increasing in size, people are getting more friends, and with the advent of mobile, people are online more frequently. It's not that I look at my news feed once at the end of the day; I constantly pull out my phone while I'm waiting for my friend, or I'm at the coffee shop. We have five minutes to really delight you."

SOURCE: MIT Technology Review
