Meta Platforms said Friday that a policy put in place to curb the spread of COVID-19-related misinformation on Facebook and Instagram would no longer be in effect globally.
Social media platforms like Facebook and Twitter came under enormous pressure to address pandemic-related misinformation, including false claims about vaccines, prompting them to crack down.
In early 2021, Facebook said it had removed 1.3 billion fake accounts between October and December, along with more than 12 million pieces of content about COVID-19 and vaccines that global health experts had flagged as misinformation.
In July of last year, Facebook's parent sought input from its independent oversight board on changes to its approach, citing the greater availability of authoritative information and increased public awareness of COVID-19.
However, Meta said on Friday that the rules would remain in effect in countries that still have a public health emergency declaration for COVID-19, and that the company would continue to remove content that violates its coronavirus misinformation policies.
“We are consulting with health experts to understand which claims and categories of misinformation might continue to present this risk,” Meta said in a blog post.
In early November, Twitter also reversed its COVID-19 misinformation policy.
In another recent development, Meta, which owns Facebook, Instagram and WhatsApp, has launched its Meta Verified subscription service in India at a monthly price of Rs. 699 on mobile apps, the company said Wednesday. Meta plans to roll out the service on the web in the coming months at a subscription price of Rs. 599 per month.