Facebook Oversight Board announces first six cases for review

All six cases involve decisions originally made by Facebook to remove user content. Copyright: AP Photo/Ahn Young-joon
By The Cube


Facebook's Oversight Board has announced the first six cases where it will review the platform's content moderation.

All six cases involve content that Facebook originally decided to remove.

These include images and posts accompanied by text, which Facebook says violated its rules on hate speech, nudity, and violent content.

The board said that Facebook users had submitted more than 20,000 cases for review since October.

The body is now inviting the public to comment anonymously on the six cases over the next seven days. Each case will be reviewed by a five-member panel.

The board has not given a date for its conclusions, but Facebook has previously said it expects cases to be resolved within 90 days, including any action it is told to take.

"Once the Board has reached a decision on these cases, Facebook will be required to implement our decisions, as well as publicly respond to any additional policy recommendations that the Board makes," the authority stated.

The arbitration body was created by Facebook in response to criticism of its handling of problematic content but has itself faced backlash for its limited remit.

A group of Facebook critics, calling themselves "The Real Facebook Oversight Board", said it would hear three cases not yet eligible for official review, including a dispute over the Facebook account of Steve Bannon, a former adviser to US President Donald Trump.

"The Oversight Board is a separate body that people can appeal to if they disagree with decisions we made about their content on Facebook or Instagram," said Facebook in a press release.

"This model of independent oversight represents a new chapter in online governance, and we’re committed to implementing the board’s decisions."

"We look forward to the board’s first decisions, which should be issued in the months to come."

What are the first six cases about?

1. A screenshot of two tweets by the former Malaysian Prime Minister Mahathir Mohamad about violence against French people. Facebook removed the post for violating its policy on hate speech. While the user did not add a caption to the screenshots, they complained to the Oversight Board that they wanted to raise awareness of the former Prime Minister's "horrible words".

2. Two well-known photos of a dead child, lying fully-clothed on a beach, with text in Burmese asking why there was no retaliation against China for its treatment of Uighur Muslims, compared to the recent killings in France linked to cartoons of the Prophet Muhammad. The post, which also refers to the Syrian refugee crisis, was removed by Facebook for violating its hate speech policies. The user claimed that their post was meant to emphasise that human lives matter more than religious ideologies.

3. Alleged historical photos of churches in Baku, with accompanying text referring to "Azerbaijani aggression" and "vandalism", stating that Baku had been built by Armenians and asking where the churches had gone. The user indicated support for Armenia in the Nagorno-Karabakh dispute, but said in their appeal that they wanted to demonstrate the destruction of cultural and religious monuments and that Facebook should not have removed the post under its hate speech policy.

4. A Brazilian user published eight photographs on Instagram, seemingly to raise awareness of the signs of breast cancer. Five of the photos included visible and uncovered female nipples, with corresponding captions in Portuguese about symptoms. While Facebook took the post down for violating its policy on adult nudity and sexual activity, the user said they had shared it as part of the national "Pink October" campaign for the prevention of breast cancer.

5. A user in the United States reshared their previous post of an alleged quote by Nazi Germany's propaganda chief Joseph Goebbels, prompted by Facebook's "On This Day" function. Facebook removed the content for violating its policy on dangerous individuals and organisations, but the complainant says they consider the current US presidency to be following a fascist model and that the quote is important.

6. A video was posted in a Facebook group about France's refusal to authorise hydroxychloroquine and azithromycin as treatments for COVID-19, criticising the country's health strategy. The video was viewed approximately 50,000 times and shared around 1,000 times. Facebook removed the video under its rules against violence and incitement, but has referred the case to the Oversight Board itself. The company says the content highlights the challenges it faces in dealing with the risks of offline harm that can be caused by misinformation about the COVID-19 pandemic.

