Indiana Is The First State To Sue TikTok Over Child Safety Worries

In a sign of mounting legal threats for the platform, the state of Indiana has filed two lawsuits against TikTok and its parent company ByteDance: one focused on data security, the other on the platform's alleged failure to protect children from adult content. Per The New York Times, the suits are part of a larger government strategy to curb China's influence as a competitor to the U.S. tech sector.

TikTok has repeatedly come under legal scrutiny in the United States. Originally, concerns about data security tended to predominate: the Trump administration in particular took exception to the requirement that TikTok, like every company incorporated in China, disclose to the Chinese government what would be considered private data in the United States. The Biden administration shows no sign of slackening that policy. Indeed, in November of this year, the FCC barred several Chinese phone manufacturers from the American marketplace for exactly that reason.

As CNN reports, Indiana's lawsuits address the issue of TikTok's security measures but also go beyond it: the state seeks to prove that TikTok's moderation strategy is negligent, granting underage users inappropriate access to adult content. Both issues are serious concerns for TikTok's American business going forward.

Old questions, new answers

To tech-savvy or historically informed readers, the widespread concern about TikTok in the U.S. might smack of earlier moral panics. As mental health nonprofit Take This reports, it's a matter of record that social media, video and tabletop games, clothing choices, music genres, and virtually anything else enjoyed by the young have been excoriated by American elders on one moral ground or another.

At the same time, serious questions have been raised about the safety of TikTok as a platform. We've reported in the past about the successes and failures of TikTok's content moderation, from its largely hands-off, algorithmic approach to managing content to the borderline unethical treatment experienced by the human moderators the platform does employ. Content capable of inflicting severe psychological trauma on adult professional moderators certainly shouldn't be turning up in children's feeds.

Moderation and data security are also inescapably entwined. Hands-off moderation doesn't just raise the possibility of traumatic content in users' feeds; it also allows the sharing of media that at least some users are likely to see as unethical, if not illegal. Add that to the documented pressures Chinese law puts on social media platforms, and the Indiana lawsuits, right or wrong, start to seem at least somewhat grounded.

Still, TikTok has answered its critics and survived plenty of tough talk from the previous presidential administration. Whether it can continue to do so will depend both on the commitment of its user base and on its ability to adapt to the requirements of American law.