Meta, the parent company of Facebook and Instagram, is rolling back its COVID-19 misinformation rules in countries that have ended their national emergency declarations for the pandemic, such as the U.S.
According to reports from The Verge and the Washington Post, the decision comes after the World Health Organization (WHO) ended its global emergency declaration on May 5. Meta’s misinformation rules will no longer be applied globally; instead, they will be tailored by region, depending on the status of the pandemic in each area.
The company had previously sought the opinion of its independent Oversight Board, which advised Meta to reassess what misinformation it removes and to be more transparent about government requests to take down COVID-19 content, as reported by Engadget. In response, Meta stated that it will be “consulting with internal and external experts,” promising to share details about local enforcement in “future quarterly updates.”
However, the rules will stand in countries that still have a COVID-19 public health emergency declaration, and Meta “will continue to remove content that violates our COVID misinformation policies.” The company wrote in a blog post that it will consult with experts in order to “understand which claims and categories of misinformation could continue to pose a risk.”
Sites like Twitter and YouTube have faced immense pressure to deal with COVID-19 misinformation, including false claims about vaccines. However, Twitter stopped enforcing its COVID-19 misinformation rules in November 2022, shortly after Elon Musk bought the company. YouTube has also recently changed its rules around election misinformation.