The possibility of inflating fear of disease to keep people at home on Election Day shows how disinformation campaigns can potentially sidestep tech companies' safeguards.
Internet trolls backed by foreign governments interested in meddling in the U.S. election could try to exaggerate the threat of the coronavirus to keep Americans from going to the polls, according to government officials and experts.
The coronavirus has infected more than 83,000 people around the world and caused nearly 3,000 deaths, mostly in China. On Tuesday, the Centers for Disease Control and Prevention warned that it expected the virus would spread further in the United States.
But while public concerns about the coronavirus are legitimate, they're also an extremely ripe target for disinformation campaigns. Russia-linked accounts have been pushing coronavirus conspiracy theories online, the Department of State said in a rare public attribution earlier this week, with false claims that the U.S. created the virus.
"I think it's possible that Russia would use any pandemic event or any fears on Election Day to try to dampen turnout," said Nina Jankowicz, a disinformation fellow at the Wilson Center, a nonprofit policy research organization. "If coronavirus is still active in November, Russia doesn't even necessarily need to spread fears about coronavirus on Election Day. It could just spread this information ahead of the elections about coronavirus in locations like polling places or other public places."
The possibility that fears about a pandemic could be weaponized to keep voters at home is "one of a number of scenarios" that federal election security officials are considering, Chris Krebs, the director of the Department of Homeland Security's cybersecurity division, told NBC News.
In a warning sent to election officials in the fall, the FBI specifically cautioned that Russia "might seek to covertly discourage or suppress U.S. voters from participating in next year's election."
Since the leadup to the 2018 midterm elections, Facebook and Twitter have had policies to remove explicit misinformation about how to vote — such as claims, even joking ones, that one political party is supposed to vote the day after Election Day, or that it's possible to vote by text message.
But the possibility of inflating fear of disease to keep people at home on Election Day — one that doesn't necessarily need to mention the polls — shows how disinformation campaigns can potentially sidestep tech companies' safeguards.
Misinformation about health scares is trickier. Last year, Facebook implemented a policy to let fact-checkers examine health claims and limit their ability to spread if they're found to be misleading, though some cases slip through the cracks. In January, the company said it would go a step further with coronavirus conspiracy theories by removing them, though only if they'd been debunked by health authorities.
Facebook has gamed out hypothetical scenarios, including an emergency situation on Election Day, according to Nathaniel Gleicher, Facebook's head of cybersecurity policy.
"There's always a hypothetical risk that there's something else out there," Gleicher said, adding that while the company is confident in its current takedown procedures, it also employs an emergency task force to make emergency decisions on Election Day.
Russian propagandists have a long history of spreading and amplifying conspiracy theories in the U.S. about existing diseases. In the early 1980s, the KGB infamously undertook a campaign to spread the false claim that the U.S. government had created AIDS — already an organic, domestic conspiracy theory — with tactics like planting an anonymous newspaper op-ed purporting to be from a whistleblowing American scientist, who claimed the virus was a Pentagon biological weapon gone awry.
More recently, the Internet Research Agency — the Russian "troll factory" indicted in 2018 by then-special counsel Robert Mueller for alleged crimes related to influence operations against Americans ahead of the 2016 election — has spent some of its efforts exaggerating the threat of contagious disease. As early as 2014, the agency posted a YouTube video, since removed by Google, that falsely purported to show an Ebola victim being wheeled through the parking lot of Hartsfield-Jackson Atlanta International Airport.
"It's hard to predict what fissures in our society Russia or other bad actors will use to manipulate us next," Jankowicz said. "I don't think there's a sure thing in this business."