WASHINGTON – Do social media echo chambers deepen political polarization, or simply mirror existing social divisions?
A landmark research project investigating Facebook around the 2020 US presidential election released its first results Thursday, finding that, contrary to assumption, the platform's much-criticized content-ranking algorithm does not shape users' beliefs.
The work is the product of a collaboration between Meta, the parent company of Facebook and Instagram, and a group of academics from US universities who were given broad access to internal company data and signed up tens of thousands of users for experiments.
The academic team wrote four papers examining the role of the social media giant in American democracy, which were published in the scientific journals Science and Nature.
Overall, the algorithm was found to be "extremely influential in people's on-platform experiences," said project leaders Talia Stroud of the University of Texas at Austin and Joshua Tucker of New York University.
In other words, it heavily affected what users saw and how much they used the platforms.
"But we also know that changing the algorithm for even a few months isn't likely to change people's political attitudes," they said, as measured by users' answers on surveys after they took part in three-month-long experiments that altered how they received content.
The authors acknowledged this conclusion could be because the changes were not in place long enough to make an impact, given that the United States has been growing more polarized for decades.
Nevertheless, "these findings challenge popular narratives blaming social media echo chambers for the problems of contemporary American democracy," wrote the authors of one of the papers, published in Nature.
‘No silver bullet’
Facebook's algorithm, which uses machine learning to decide which posts rise to the top of users' feeds based on their interests, has been accused of giving rise to "filter bubbles" and enabling the spread of misinformation.
Researchers recruited around 40,000 volunteers via invitations placed on their Facebook and Instagram feeds, and designed an experiment in which one group was exposed to the normal algorithm, while the other saw posts listed from newest to oldest.
Facebook originally used a reverse-chronological system, and some observers have suggested that switching back to it would reduce social media's harmful effects.
The team found that users in the chronological-feed group spent around half as much time on Facebook and Instagram as those in the algorithm group.
On Facebook, those in the chronological group saw more content from moderate friends, as well as more sources with ideologically mixed audiences.
But the chronological feed also increased the amount of political and untrustworthy content users saw.
Despite these differences, the change did not cause detectable shifts in measured political attitudes.
"The findings suggest that chronological feed is no silver bullet for issues such as political polarization," said coauthor Jennifer Pan of Stanford.
Meta welcomes findings
In a second paper in Science, the same team studied the impact of reshared content, which makes up more than a quarter of what Facebook users see.
Suppressing reshares has been suggested as a way to rein in harmful viral content.
The team ran a controlled experiment in which one group of Facebook users saw no changes to their feeds, while another group had reshared content removed.
Removing reshares reduced the proportion of political content users saw, resulting in lower political knowledge, but once again did not affect downstream political attitudes or behaviors.
A third paper, in Nature, probed the impact of content from "like-minded" users, pages, and groups in people's feeds, which the researchers found made up a majority of what the entire population of active adult Facebook users in the US sees.
But in an experiment involving over 23,000 Facebook users, suppressing like-minded content once again had no impact on ideological extremity or belief in false claims.
A fourth paper, in Science, did however confirm a high degree of "ideological segregation" on Facebook, with politically conservative users more siloed in their news sources than liberals.
What's more, 97 percent of political news URLs on Facebook rated as false by Meta's third-party fact-checking program, which AFP is part of, were seen by more conservatives than liberals.
Meta welcomed the overall findings.
They "add to a growing body of research showing there is little evidence that social media causes harmful… polarization or has any meaningful impact on key political attitudes, beliefs or behaviors," said Nick Clegg, the company's president of global affairs. — Agence France-Presse
Source: www.gmanetwork.com