Monitoring fake news was never a priority, says ex-Facebook worker

Image: Sarah Katz, who worked under contract as a Facebook spam analyst and content monitor from March 2016 to that October, in an exclusive on-camera interview with NBC News. Copyright NBC News
By Jo Ling Kent, Chiara Sottile and Alyssa Newcomb with NBC News

Facebook's insistence that it need not behave like a media company is what led to the spread of fake news, says one ex-Facebook contractor.

SAN FRANCISCO — Facebook's team of content reviewers focused mainly on violence and pornography, making it "incredibly easy" for Russian trolls to fly under the radar with their fake news, according to a former Facebook content monitor who worked at the social network for eight months in the lead-up to the 2016 U.S. presidential election.

"As long as these stories didn't contain anything pornographic or incredibly violent, they [would] be left alone, because Facebook has billions of users and there's no way that the content moderators are going to peruse every single news article for valid facts," Sarah Katz, who worked under contract as a Facebook spam analyst and content monitor from March 2016 to that October, said in an exclusive on-camera interview with NBC News.

Facebook contractors are charged with sifting through the social network's queue of reported content and are trusted to decide in seconds whether something violates the rules. Katz said she reviewed an average of 8,000 posts a day, spending less than 10 seconds per ticket.

There is no set quota for pieces of content a monitor must review in a day, according to a person familiar with the matter, who added that determinations on some types of content can be made more quickly than others.

"We were given about a five-page packet on our first day," Katz said, that "basically outlined examples of what counts as spam and what doesn't."

"To sum it up, what counts as spam is anything that involves full nudity."

The bulk of what Katz saw on a daily basis, she said, had been flagged by users and sent to a queue for Facebook's human moderators to review. In the months before the 2016 election, before the term "fake news" became ubiquitous, if a user wasn't flagging a piece of content, it would likely continue to exist on Facebook.

Once a person creates a Facebook account, they are free to create and spread fake news, Katz said. "It's not like you even have to be a computer whiz," she said.

With more than two billion users, Facebook simply has too much content to keep track of, Katz said. "Especially when they have guidelines that specifically pertain to illicit graphic material," she told NBC News. "So the [more benign] content, such as news stories, [is] definitely not analyzed perhaps as closely as [it needs] to be."


In a separate interview with NBC News, Monika Bickert, Facebook's Head of Global Policy Management, responded to Katz's claims in part by saying, "The important thing to know is that if you are a reviewer for Facebook, you're not responsible for just enforcing one policy, you are enforcing our community standards across the board from bullying and harassment to pornography to spam to threats of violence, all of it."

Katz said she regrets having been part of a team that "didn't have the foresight to expect this type of ploy to occur."

"I'd say that our social platforms, in particular, have to tread carefully," she said. "And I feel like we didn't tread carefully enough."


Bickert told NBC News that Facebook has "made significant strides since the 2016 election" and has been investing in technology to get better at detecting fraudulent accounts.

"We're committed to getting this right. We know we didn't get it right in the 2016 election, and we are going to get it right next time," she said.

Last May, Facebook founder and CEO Mark Zuckerberg said he would increase the number of human reviewers. Bickert confirmed to NBC News that the overall operations team now includes a total of 10,000 people specifically tasked with tackling the abuse problem. "We have 10,000 people right now that are working on safety and security issues across the company. And that's going to be up to 20,000 by the end of 2018," she said.

"We know we didn't get it right in the 2016 election, and we are going to get it right next time," said a spokesperson for Facebook.

"These reviewers will also help us get better at removing things we don't allow on Facebook like hate speech and child exploitation," Zuckerberg wrote. "And we'll keep working with local community groups and law enforcement who are in the best position to help someone if they need it — either because they're about to harm themselves, or because they're in danger from someone else."

Human reviewers can be an asset when it comes to helping someone at risk of harming themselves on Facebook Live, but when it comes to combating fake news, Facebook has been careful to draw the line.

The social network has argued that since it is not a media company, it should not have to validate informational content. Instead, it has turned to other measures to help thwart fake news, including asking users to flag stories and sending suspicious content to third-party fact checkers.

Last August, Facebook announced it would ban pages that post hoax stories from advertising on the social network. And last week, following an announcement from Zuckerberg that his 2018 personal challenge is to "fix" Facebook, the company outlined a plan to refocus its news feed: users will now see more meaningful posts from friends and family, and far fewer from brands and publishers.

But critics question whether these measures will be enough to stop Russian meddling in the midterm elections in November.

Ultimately, it is Facebook's insistence that it need not behave like a media company that led to the easy proliferation of fake news that shook up the 2016 elections, Katz said.

"That's exactly what the Russians were counting on," she said.
