Facebook mood test outrage is FUD, claims ex-researcher

Chris Davies - Jul 5, 2014, 7:00pm CDT

One of the former Facebook data scientists at the heart of the recent controversy over mood manipulation and tests run on unwitting users has spoken out, claiming his quotes were taken out of context, and defending the social network’s experimentation. Data scientist Andrew Ledvina, who left Facebook in April, was one of the originally quoted sources when the psychological research carried out by the site surfaced late last month, cited to illustrate how Facebook lacked a formal review process for tests performed on its users. Now, Ledvina says that the reporter he spoke to misrepresented the facts.

The original claims surrounded a piece of research Facebook undertook to see how users would react to different mixtures of stories in their curated News Feed. One of the theories, it was suggested, was that showing users a greater proportion of positive stories might increase engagement, while conversely more negative topics could reduce how often that person visited Facebook.

Ledvina’s comments were used to show that Facebook lacked the sort of rigorous, formal review policy that, say, a university psychology department might have, one which would examine proposed research to ensure it was ethically sound and did not present a risk to users, whether they were aware of the experiment or not. Employees were “desensitized” to the fact that they were dealing with hundreds of thousands of real people, he was quoted as conceding.

According to Ledvina, however, the way the testing has been portrayed – and his comments on it – aren’t actually illustrative of how he feels or how things are handled within Facebook. In a post on his own blog, he explains that while he was a data scientist on the Analytics team, only a small subset of that – known as “Data Science”, and numbering around ten while he was there – actually ever did social science research.

“There are only a small subset of people on the data science team who have social science research as part of their primary job duties. Everyone else would not run an experiment like the one people are talking about because it would not help the product they are working on. Those who do run such experiments, have very high ethical standards and experience running experiments both inside and outside academia” – Andrew Ledvina

Ledvina says he spoke to WSJ reporter Reed Albergotti on the topic for approximately thirty minutes, having previously spoken to him about an earlier, different story concerning another Facebook issue. The quotes in this most recent piece, however, were “twisted around” or cobbled together from multiple sentences so as, he alleges, to flip their tone and nuance. Albergotti, Ledvina argues, wanted to portray Facebook negatively, and edited the quotes to suit that goal.

As the data scientist points out, the research in question is now more than two years old, and users effectively agree to experimentation when they sign up to Facebook in the first place. He also draws parallels between such experiments and what advertising tries to do, namely alter mood and behavior.

If someone within Facebook wished to trial different colors or sizes of buttons, or an alternative ad targeting system, or even how selections of “top stories” for the News Feed were made, they didn’t need approval for that first, Ledvina says of his time at the company. That clearly changed if they later wanted to publish a paper presenting the results as a scientific experiment, of course, he adds, which did require explicit permission.

Ironically, Ledvina concludes, the recent furore is unlikely to change what tests go on within Facebook, only whether the results get published publicly, thus depriving social psychology research – among other fields – of a huge sample size that could significantly contribute to human understanding.

Since news of the study broke, Facebook’s COO Sheryl Sandberg has spoken publicly about it, suggesting that it was “poorly communicated” rather than ethically unsound. Nonetheless, Facebook now finds itself being investigated by the UK’s Information Commissioner’s Office, among other agencies, over potential privacy impacts.

SOURCE Andrew Ledvina
