Facebook said this week it had removed over three dozen pages that were spreading online misinformation about COVID-19 vaccines.
Facebook has been accused of shifting the blame about tackling COVID-19 vaccine misinformation.
This week, the social network said it had removed dozens of pages linked to "superspreaders" of anti-vaccine content.
A March report by the Center for Countering Digital Hate (CCDH) and Anti-Vax Watch found that just twelve anti-vax accounts were responsible for spreading around 65% of online vaccine misinformation.
Although Facebook took action against the accounts, it has disputed the methodology behind the report and its findings.
But the CCDH has accused Facebook of "grossly misrepresenting" the research and refusing to take responsibility for dealing with misinformation on its platform.
Other social networks, such as Twitter and YouTube, have also come under fire for the spread of vaccine misinformation worldwide.
US President Joe Biden has even said that platforms like Facebook are "killing" people by allowing false information about vaccinations to circulate online.
Facebook has said that it would continue to promote reliable information about vaccines to users.
"We're continuing to work with external experts and governments to make sure that we are approaching these issues in the right way and making adjustments if necessary," the company said in a blog post.
The "disinformation dozen"
According to the CCDH report, twelve individuals had shared up to 73% of the anti-vax content that existed on Facebook in February and March.
The so-called "disinformation dozen" included US lawyer Robert F. Kennedy Jr. and US physician Joseph Mercola.
In the wake of the report, Facebook representatives testified in Congress about the scale of vaccine misinformation on its platform.
And on Wednesday, the social network confirmed in a blog post that it had taken action on a number of accounts linked to the twelve prominent anti-vax voices.
The company said that over three dozen pages, groups, and accounts on both Facebook and Instagram had been removed.
Further "penalties" were also imposed on nearly two dozen additional accounts within the same network: their content will no longer be recommended to users and will appear less prominently in Facebook news feeds.
Facebook said that the remaining content on the platform linked to the network does not break its rules.
"Facebook has refused to take real accountability"
Facebook stated that reports on vaccine misinformation had created a "faulty narrative" that twelve people were responsible for most of the vaccine misinformation online.
"There isn’t any evidence to support this claim," Facebook said in its blog post.
"Moreover, focusing on such a small group of people distracts from the complex challenges we all face in addressing misinformation about COVID-19 vaccines."
The social network has said that the twelve accounts named were responsible for "just 0.05%" of all views of vaccine-related content on the platform.
"That said, any amount of COVID-19 vaccine misinformation that violates our policies is too much by our standards," the company added.
"Since the beginning of the pandemic across our entire platform, we have removed over 3,000 accounts, pages, and groups for repeatedly violating our rules against spreading COVID-19 and vaccine misinformation and removed more than 20 million pieces of content for breaking these rules."
But the chief executive of the CCDH, Imran Ahmed, has said Facebook's response to the study shows a lack of accountability.
"This blog post is merely the continuation of their strategy to deny, deflect and delay," Ahmed said in a statement on Twitter.
"Facebook has refused to take real accountability and continually resists calls for transparency [on how they moderate content]," he added.
"Facebook has an obligation to turn over their data immediately in the interest of public health."