Changes to social media feeds shape content but not political views, study finds

This photo shows the mobile phone app logos for, from left, Facebook and Instagram
Copyright Richard Drew/AP Photo, File
By Lauren Chadwick

Researchers tweaked the social media algorithms of tens of thousands of consenting users to see whether the changes affected their political views during the 2020 US election.

Changes to social media algorithms on Facebook and Instagram did not significantly alter users' political views, but they did shape what users saw on their feeds, according to new in-depth research on the platforms' impact on political polarisation.

The first results from a multi-university team working in tandem with Meta researchers provided a look at the impact of social media algorithms on what users see.

Co-led by Talia Stroud of The University of Texas at Austin and Joshua Tucker of New York University, the study found that tweaking the algorithms, which help to rank news and search items on social media, influenced what people saw on their feeds.

But it did not necessarily change their political beliefs.

"We now know just how influential the algorithm is in shaping people’s on-platform experiences, but we also know that changing the algorithm for even a few months isn’t likely to change people’s political attitudes," Stroud and Tucker said in a joint statement.

"What we don’t know is why. It could be because the length of time for which the algorithms were changed wasn’t long enough, or these platforms have been around for decades already, or that while Facebook and Instagram are influential sources of information, they are not people’s only sources," they said.

The new research, part of the 2020 Facebook and Instagram Election Study (FIES), was published in a series of four papers in Science and Nature on Thursday.

Despite Meta's involvement, the academic researchers said they had the final say over the writing and research decisions.

Experimenting with different social media feeds

The researchers experimented with three changes to how Facebook and Instagram users viewed content during the 2020 US presidential election.

The experiments, involving tens of thousands of consenting US-based users, included stopping re-shares, switching from an algorithmic to a chronological feed, and reducing exposure to like-minded content.

Researchers found that removing re-shared content, for instance, decreased the amount of political news as well as overall clicks and reactions. It also decreased partisan news clicks, they said.

Meanwhile, switching to reverse chronological feeds instead of algorithmically ranked content "significantly decreased" the time users spent on the platform.

In the third experiment, researchers reduced by a third the amount of like-minded content shown to thousands of consenting users, which they said increased exposure to other sources but did not change users' ideologies.

"These precisely estimated results suggest that although exposure to content from like-minded sources on social media is common, reducing its prevalence during the 2020 US presidential election did not correspondingly reduce polarisation in beliefs or attitudes," the researchers said in the paper published in Nature.

In one of the papers published in Science, the researchers analysed data from 208 million US Facebook users about their news consumption on the social media platform during the election.

They found substantial ideological segregation between right-wing and left-wing audiences in the US, "with a substantial corner of the news ecosystem consumed exclusively by conservatives".

"Most misinformation, as identified by Meta’s Third-Party Fact-Checking Program, exists within this homogeneously conservative corner," the study authors said.

'Avoiding accountability'

Social media has long been criticised for stoking ideological polarisation, but Facebook has disputed its role in this.

In a statement on Thursday, Meta's global affairs president, Nick Clegg, wrote that these new studies added "to a growing body of research showing there is little evidence that key features of Meta’s platforms alone cause harmful ‘affective’ polarisation, or have meaningful effects on key political attitudes, beliefs or behaviours."

Free Press, a US non-profit organisation advocating for media reform, said Meta was misrepresenting the studies, adding that they were limited and occurred over a "narrow time period".

"Meta execs are seizing on limited research as evidence that they shouldn’t share blame for increasing political polarisation and violence," said Nora Benavidez, Free Press’ senior counsel and director of digital justice and civil rights.

"This calculated spin of these surveys is simply part of an ongoing retreat from liability for the scourge of political disinformation that has spread online and undermined free, fair and safe elections worldwide."

Additional sources • AP
