Meta Platforms has globally discontinued its policy aimed at curbing the spread of false information about COVID-19 on Facebook and Instagram, the company announced in a statement on Friday.
During the pandemic, social media platforms such as Facebook and Twitter came under heavy pressure to address pandemic-related misinformation, including false claims about vaccines, and responded by implementing strict controls.
Meta is also fighting on a separate front in Kenya, where content moderators responsible for vetting violent and offensive posts have sued the social media giant, alleging that it violated data protection regulations, according to Kenyan authorities.
In early 2021, Facebook reported that between October and December it had removed 1.3 billion fake accounts and more than 12 million pieces of content about COVID-19 and vaccines that international health experts had deemed inaccurate.
Citing the greater availability of reliable information and increased public awareness of COVID-19, the Facebook parent company asked its independent oversight board in July of last year for advice on potential adjustments to its existing strategy.
However, Meta said on Friday that the rules would remain in force in countries where a COVID-19 public health emergency declaration is still in effect, and that the company would continue to remove content that violates its coronavirus misinformation policy there.