Facebook seems to have something like a reverse Midas touch where it infects anything it buys with a certain curse. Social networks that were thriving before they became Facebook properties have been transformed into questionable businesses that rely on violating privacy to make a profit. That even includes WhatsApp, whose claim to fame has always been its end-to-end encryption and privacy guarantee. Now it turns out that there might be a small loophole that gives Facebook a bit of wiggle room to actually break that trust, usually under the guise of observing the law.
End-to-end encryption means that messages remain scrambled until they reach the intended recipient, and that’s how WhatsApp works in general. There is, however, one very specific instance where that doesn’t apply. When a user flags a message as potentially improper, all bets are off, and WhatsApp moderators get a clear, unscrambled view of that message and other material related to it.
WhatsApp doesn’t actually call these more than 1,000 contract workers “moderators,” preferring the term “content reviewers.” Unlike moderators who can delete threads or specific messages, these reviewers can only do three things: ignore the report, place the reported account under watch, or ban the account immediately. Just like moderators, however, they get a clear view of the allegedly offending message, the four messages preceding it, and metadata about the users in that conversation.
Unfortunately, whistleblowers revealed to ProPublica that WhatsApp’s guidelines for judging reported messages are confusing, arcane, and even disturbing. It doesn’t help that messages can come in all kinds of languages, and Facebook’s automatic translation tool sometimes misidentifies which language it’s dealing with. Mistranslations and misunderstandings have caused companies selling shaving razors to be flagged for selling weapons, while those selling bras have been labeled as sexually oriented businesses.
While regular, unflagged messages remain strongly encrypted, metadata about WhatsApp users isn’t, and Facebook can readily hand it over to law enforcement when requested. There’s also the worrying push from Big Tech to develop AI that can glean some information even from encrypted messages. In the final analysis, even a platform or service that technically offers strong security guarantees still runs on a system of trust, and Facebook’s ownership of WhatsApp may not exactly engender that.