In 2017, fourteen-year-old Molly Russell took her own life. Her family laid part of the blame on Instagram after discovering the teenager had viewed distressing images depicting self-harm and even suicide. Adam Mosseri, who took over Instagram in the wake of its co-founders’ departure, is promising to take action, primarily by placing a “sensitivity screen” over such content to hide it from view. That is, until you consent to view it at your own risk and with full knowledge of what it contains.
This system might be familiar to Tumblr users, who have recently been greeted by similar blocks over mature content. All it takes is a single click or tap to view the hidden post beneath. But unlike Tumblr’s inconsistent and unreliable mechanism, Instagram can’t afford to let even the smallest things slip through the cracks.
The social network has already taken steps to ensure it doesn’t become a tool for harm. It has even rolled out features, like suicide prevention tools, to actively offer help to those who might need it. It has also removed content depicting cutting and mutilation from search results and hashtags, but such images still show up. Mosseri admits Instagram isn’t there yet and says it will be investing in technology to better identify such images.
But why not remove such self-harm images completely? That would indeed be the easy way out but, unlike with porn, Instagram also wants to be a safe haven for those seeking healing by sharing their lives with sympathetic peers. Or at least that’s what some experts reportedly advised the company.
Mosseri will be meeting with UK health secretary Matt Hancock to discuss the matter, which is still rocking the country’s government. It isn’t known when these new measures will be put in place. More importantly, it remains to be seen whether this screening, while benign in intent, could also become a vector for potential invasions of privacy.