Facebook has announced it will step up efforts to tackle more false information about coronavirus vaccines.
The tech giant pledged on Monday to do more to "remove the false claims on Facebook and Instagram about Covid-19 and vaccines in general during the pandemic."
Facebook stated they will provide more information on national campaigns and stronger measures against false rumours spread by anti-vaccine groups.
The company had already banned vaccine misinformation from adverts under its advertising policies.
False claims that COVID-19 is man-made, or that vaccines are toxic or cause autism, will be removed, Facebook said, and repeat offenders could be banned.
Instagram accounts that seek to discourage their followers from getting vaccinated will also be made harder to find, the statement added.
But critics say that Facebook has "repeatedly promised to crack down on misinformation" over the last 12 months and fallen short.
"Each time, they fail to achieve their goals," tweeted the Centre for Countering Digital Hate, noting that Facebook previously removed vaccine misinformation rarely and only when it was considered to risk "imminent harm".
In December, the social network announced it would remove false claims about COVID-19 vaccines that had been debunked by public health experts, but many accounts, pages, and groups continued to spread them.
Pressure grew on social media platforms to take action after the UK became the first country in the world to authorise the Pfizer/BioNTech vaccine.
Health authorities across Europe have expressed concern that online misinformation is fuelling scepticism and hesitancy over COVID-19 vaccines.
Hans Kluge, director of WHO Europe, said: "I urge you to seek reliable information from trustworthy sources. Don’t be part of a misinformation infodemic. Vaccination saves lives; fear endangers them."
Facebook said it would also invest $100 million (€82 million) in the news industry and in supporting fact-checkers.
"We have published... papers on super-spreaders of misinformation on Facebook or Twitter and a lot of hoaxes were allowed to circulate on these platforms," said Chine Labbe, Europe managing editor of NewsGuard, a browser extension that rates news websites according to their trustworthiness.
"It's good that Facebook is talking about it right now but until now they have proven they were not ready to tackle this problem."
The tech giant is also expected to publish the results of a major study on the pandemic, which collected 50 million survey responses from its users.