Meta Is The Latest Platform To Scale Back Its COVID-19 Misinformation Policy

A month after the World Health Organization declared that COVID-19 was no longer a global emergency, Meta has announced that it will be rolling back its misinformation policy concerning the virus. The new rules will apply to both Instagram and Facebook in countries that no longer officially deem the COVID-19 pandemic, which has been ongoing since early 2020, a national emergency. This includes the United States, where President Joe Biden signed a law in April that ended the emergency and related government practices, such as providing free COVID-19 tests and offering forbearance on mortgage and student loan payments.

Meta's content policy, which included removing "COVID-19 content that contributes to the risk of real-world harm, including through our policies prohibiting coordination of harm, peer-to-peer sale of test kits and related goods, hate speech, bullying and harassment," was intended to combat the alarming amount of misinformation and disinformation that had spread since the start of the pandemic. Between March 2020 and July 2022, Meta took down 27 million posts containing COVID-19 misinformation.

The change in rules comes after Meta's Oversight Board suggested the company prepare for changes after the WHO's decision to end the emergency. In a blog post, Meta says that, moving forward, it will continue to consult with health experts in a "more tailored approach to our COVID-19 misinformation rules."

Other platforms have already changed their policies

In order to limit the spread of COVID-19 and mitigate serious health effects and deaths, Meta joined with other major platforms at the height of the pandemic to slow the dissemination of dangerous misinformation and disinformation. This coincided with the national spotlight on the tech industry's role in communication and public opinion concerning other societal issues, including election coverage and child welfare, pressuring companies to take a more active role in protecting users.

As the COVID-19 pandemic worsened and misinformation proliferated, including inaccurate claims about vaccines and the origins of the virus, Meta joined platforms like Twitter and YouTube in creating official rules on what could and could not be posted. However, as COVID-19 cases and deaths declined across the globe, Twitter stopped enforcing its policy late last year. With national governments and the WHO now taking a more relaxed approach to the pandemic, Meta and other platforms also seem ready to move on.

While COVID-19 cases are lower than they were two years ago, thousands of Americans who test positive for the disease are admitted to the hospital each day, with the elderly still especially vulnerable despite being vaccinated. More concentrated outbreaks are also possible in the future, so Meta and other platforms' policies may be amended again down the line. In the meantime, Meta's misinformation policy will remain in place in areas where COVID-19 is still an official public health emergency.