On Tuesday morning, the political and tech worlds were startled to learn that Facebook had identified and stopped a new covert campaign to spread divisive political messages on its platform, the first such announcement since 2017.
But Facebook released only some of the pages and content publicly. What it did not reveal was the lengths to which some of the pages went to stoke racial tension and incite division among Americans; some of the most divisive content was deliberately withheld from the announcement and deleted.
NBC News was able to retrieve some of the pages Facebook deleted via a web archive search, which allows people to see internet pages that have been deleted. A review of some of the deleted pages from groups identified by Facebook as part of the "inauthentic coordinated behavior" found efforts to target people based on liberal politics as well as Hispanic and African heritage.
One deleted post called for protesters to occupy the headquarters of the U.S. Department of Homeland Security's Immigration and Customs Enforcement agency.
An event that was initially titled "Stop Ripping Families Apart! DC," posted by a group called "Resisters," was later retitled "Stop Ripping Families Apart! Take over ICE HQ" once more users said they would attend. On Facebook, 131 people marked themselves as having attended the June 27 rally outside of ICE's Washington offices.
Another group, "Aztlan Warriors," posted an image stoking racial tension around Mexican child labor during the Great Depression, calling it a "commonplace sight."
"This was exacerbated by racial segregation: Mexican and Mexican American children… were often not allowed in white schools," read text around the image.
Though the precise identity of the groups that wrote these posts was not made public, there is evidence that they were not posted by Hispanic groups or by liberal activists who have been critical of ICE. Instead, they bear similarity to the Russian trolls who tried to divide voters in the 2016 election.
The racial rhetoric of the deleted pages echoes the paid advertisements that Facebook revealed in 2017 that the company said were purchased by the Kremlin-linked Internet Research Agency "troll farm."
The goal seems to be the same: to sow discord and pit American voters against each other, particularly along racial or ethnic lines.
Facebook declined to identify the source of the campaign but said it bore similarities to the efforts connected to Russia's Internet Research Agency around the 2016 election.
Graham Brookie, director of the Digital Forensic Research Lab at the Atlantic Council, a think tank focused on international relations, said that whoever put together the Facebook pages learned lessons from the takedown of suspected Russian pages last year.
"The tactics are adapting, which makes detection increasingly difficult," he said. "These accounts were harder to detect. There were less telltale signs that this was a disinformation operation."
The Atlantic Council partnered with Facebook in May to help identify threats on the social network and was granted access to the deleted posts that have not been released publicly, allowing the group to publish an initial analysis including samples.
The lab is planning to issue a longer report at a later date. It learned of the pages from Facebook on Monday, and Brookie said the campaign showed that misinformation efforts on the platform can still be effective.
People who run disinformation campaigns benefit whether they are discovered or not, Brookie said. Either they go undetected and can sow discord, or they are revealed and people begin questioning the legitimacy of all people in a debate, he said.
That appeared to have already happened on Monday, with one organizer of a counterprotest to next month's white supremacist rally clarifying on Twitter that the entire event was not a Russian front.
"If your overarching goal is to drive Americans further away from each other or sow discord generally, then this is achieving that," Brookie said. "Even if a small disinformation operation is extremely successful, it has the potential to poison the well. You can have outsized impact even if you're a small operation because it undercuts trust in our political discourse."
Clint Watts, a former FBI special agent and MSNBC contributor, said the page administrators typically draw in followers with generic messages around group identity. Once a group's identity has formed, he said, the page owners start hitting followers with the most divisive posts, helping to push them to the extremes.
"They were hoping to instigate a conflict along racial issues," Watts said. "The Kremlin seeks to infiltrate audiences along any and all divisive social issues, then once the audience is won, push them politically."
"The goal is to create fear in the audience that things are unstable and that democracy and its institutions are failing."
David Ingram contributed reporting.