It's one of the defining phrases to emerge from the 2016 election. But before "fake news" became President Donald Trump's favorite media-bashing epithet, the term referred to fake political news spread by dubious partisan sites for profit.
In the months leading up to the presidential election, news sites and Facebook pages attracted attention for producing wholly fictitious articles like "Pope Francis Shocks World, Endorses Donald Trump For President" that preyed on readers' biases. In Macedonia, enterprising teenagers built a cottage industry of fake news sites that concocted phony scoops about President Obama and Hillary Clinton and then collected the ad revenue when they went viral among Trump supporters.
This month, three political scientists released a major study on the spread of fake news in 2016. Using a combination of surveys and software that tracked which sites participants visited on their computers, they mapped the spread of viral sites between October 7 and November 14, 2016. Based on their findings, over 27 percent of adults visited an article on a pro-Trump or pro-Clinton site that had been identified as a fake news hub. These visits made up 2.6 percent of all articles they consumed.
To unpack the findings, NBC News talked to Brendan Nyhan, a Dartmouth professor who co-authored the study with Princeton professor Andrew Guess and University of Exeter professor Jason Reifler. Below is a transcript of our conversation, edited for length and clarity.
NBC News: Tell me a little about why you thought this phenomenon was so important to research.
NYHAN: Like everyone else, we were surprised after the 2016 election by the apparent reach of fake news websites. Craig Silverman's reporting suggested they had been shared millions of times on Facebook. People were speculating that the prominence of fake news meant that misinformation had become more prominent in our politics and that it might have changed the outcome of the election.
We had a study in the field before the election focusing on fact checking, but that study measured participants' candidate preference and allowed us to observe anonymized browsing behavior. We realized we could leverage that data to provide the first direct measurement of fake news exposure. The speculation has run far ahead of the available data, and we wanted to hopefully inform that conversation.
Our data suggests a different story about the problem fake news exposure reveals. Fake news is not a problem of potential swing voters being misled. It reflects the potential for people on the extremes to be trapped in echo chambers that aren't just reinforcing their opinions, but providing them with false and misleading factual claims that seem to reinforce those opinions.
You found that fake news visits were highly concentrated among a certain set of older conservatives. Tell me a little about this group.
The group that visited fake news websites the most frequently was the 10 percent of Americans with the most conservative information diets. These are people who are visiting a lot of websites that are disproportionately consumed and shared by conservatives relative to liberals. It's not that fake news is a substitute for political news; it's being added on to a diet of like-minded political news of a more conventional variety.
For instance, if you look at how often people in that group visit Fox News, it's much more on average than they visit fake news websites. And I'm not equating Fox News to fake news websites of the sort we observed, I'm just illustrating that these are people consuming a lot of political news in general. We find no evidence that fake news crowds out political news.
So they're more 'news junkies' rather than low-information voters trying to make up their mind the week before an election?
Right. They're a subset of Americans who follow politics extremely closely. For all the saturation coverage that politics receives, it's still a pretty small part of the news diet for most people.
It's important to be clear that people probably were having chance encounters with fake news here and there, but we all have chance encounters with all sorts of information on Facebook. We forget or don't even process a huge fraction of it. I know of no evidence to suggest those encounters are changing people's votes.
After conducting this research, I'm much more concerned about how fake news entrepreneurs could increase affective polarization among people on the extremes. That means increasing negative sentiments toward the other political party, which has been one of the most unfortunate trends in recent years.
The New York Times described your study as finding fake news had "little impact." I'll admit I had the opposite reaction. For one thing, the group you're describing -- hyper-political older conservatives -- sounds like the most important force in the Republican primaries.
This is a tricky question. Fake news really took off during the general election as more Americans started to tune in.
Fake news was a relatively small part of the information people consumed, which might suggest it didn't change the outcome of the election. And to be clear, I know of no evidence to say it changed the outcome of the election.
This is not to say the audience it reaches isn't politically influential -- it is. They consume a lot of political news, so they help drive the political media. They vote in primaries, they write letters to their members of Congress, they show up in meetings, and they turn out to vote. It's just important to recognize they're a relatively small part of the public and their vote choice is unlikely to be changed by information reinforcing what they already believe.
It feels like there's a connection between having an active portion of a party that's prone to seeking false stories and conspiracies and a president who has famously spread conspiracies and false claims. In many ways, demographically and ideologically, the president fits the profile of the fake news users that you're describing.
It's worrisome if fake news websites further weaken the norm against false and misleading information in our politics, which unfortunately has eroded. But it's also important to put the content provided by fake news websites in perspective. People got vastly more misinformation from Donald Trump than they did from fake news websites -- full stop. That's not to say Hillary Clinton was innocent or her statements were always accurate, that's not true either. But there's been a disproportionate focus on fake news in thinking about where people got information about politics and the campaign.
One other reason I think fake news is worrisome is that it was a new threat to our public debate that signaled a weakness in the marketplace of ideas. Our social media platforms were enabling the spread of false and misleading information in a way that we hadn't seen before. They created a market that was sufficiently attractive that people all around the world formed businesses to profit from our polarized politics. That's very worrisome, and it's a trend we have to address or more entrepreneurs will enter that market in 2018 and 2020.
Your research seemed to find that fake news sites were largely a Facebook-driven phenomenon.
Yes. Facebook stands out in our data as the site people visited most disproportionately prior to visiting a fake news website. We don't observe the same pattern with Google, Twitter, or web mail platforms. Journalists love to talk about Twitter, but it just doesn't compare in reach to Facebook.
Facebook has said they are trying to prevent fake news with new measures, like adding related fact checks to disputed stories. In general, do you think they're taking an effective approach?
I think Facebook is moving in a positive direction. Their partnership with fact checkers was an important step to provide fact checking information at the time people need it and it avoids Facebook being in the business of determining what information is shown to people directly.
Our data shows people almost never see fact checks of the dubious claims they're encountering online. It may be the case that lots of people read fact checks and lots of people read fake news, but almost no one reads fact checks of the fake news they've been exposed to. Facebook can address that problem.
They're also making changes on the publisher side, which could hopefully disincentivize the fake news-based businesses that sprang up to exploit the profit opportunities that were previously there. Changing the business model is a more feasible goal than overcoming polarization or changing human psychology.