Facebook Teen Mood Tracking Sparks Privacy Outrage

Facebook has been accused of tracking emotionally vulnerable teenagers and opening up that data to advertisers hoping to cash in on insecure kids. The social network has been pushing its advertising platform in recent years, chasing ad revenue that might once have gone straight into Google's pockets. However, while it has access to a whole host of personal information, Facebook has found itself in hot water for how it apparently uses that data.


According to a document produced by Facebook's team in Australia, the social site has been tracking the ebb and flow of user mood, The Australian reports. By assessing the types of content people post at different times of the week, Facebook can measure trends in their emotional state. The report suggests that users as young as 14 could be analyzed in this way.

It's some of the more negative feelings among young users that have people concerned. Facebook can apparently figure out when people are feeling a host of things – including "defeated", "overwhelmed", "stressed", "anxious", "nervous", "stupid", "silly", "useless", and a "failure" – and when those feelings are likely to hit. "Monday-Thursday is about building confidence; the weekend is for broadcasting achievements," the two Facebook executives responsible for the report explained.


According to The Australian, the sentiment within the report is that the emotional data could help advertisers figure out who to target, when, and with what sort of campaign. In a statement, however, Facebook argues the allegations are mistaken. Far from trying to focus ads on people with self-confidence, body-image, or other issues, the analysis was in fact intended to give a broader perspective on user trends, the company says.

"The premise of the article is misleading. Facebook does not offer tools to target people based on their emotional state.

The analysis done by an Australian researcher was intended to help marketers understand how people express themselves on Facebook. It was never used to target ads and was based on data that was anonymous and aggregated.

Facebook has an established process to review the research we perform. This research did not follow that process, and we are reviewing the details to correct the oversight."

Nonetheless, even if that was the intent, it's clear that something went wrong along the way. Adding to the suspicions, this isn't Facebook's first run-in with research controversy. While some of its data digging is fairly innocent – like figuring out back in 2015 that "haha" is more popular than "lol" – other projects have provoked more complaints.


In 2012, for instance, the company ran tests on several hundred thousand users, measuring their reactions when different emotionally charged content was prioritized in their feeds. The theory was that happier news would keep users more engaged, whereas more negative content might drive them away from Facebook. Results from the nearly 700,000 participants – who didn't know they were being experimented on – were later published in a Proceedings of the National Academy of Sciences (PNAS) paper.

One of the researchers involved with that project – who later left the company – accused critics of misrepresenting the research, though Facebook itself did take some blame. The test was "poorly communicated," COO Sheryl Sandberg conceded.
