Research Shows Facebook’s Algorithm Has No Impact on People’s Beliefs

Meta, whose corporate offices in Menlo Park, California, are seen here, welcomed the research (AFP)

Do social media echo chambers deepen political polarization, or simply reflect existing social divisions?

A landmark research project that investigated Facebook around the 2020 US presidential election published its first results Thursday, finding that, contrary to widespread assumption, the platform’s often-criticized content-ranking algorithm does not shape users’ beliefs.

The work is the product of a collaboration between Meta — the parent company of Facebook and Instagram — and a group of academics from US universities who were given broad access to internal company data and signed up tens of thousands of users for experiments.

The academic team wrote four papers examining the role of the social media giant in American democracy, which were published in the scientific journals Science and Nature.

Overall, the algorithm was found to be “extremely influential in people’s on-platform experiences,” said project leaders Talia Stroud of the University of Texas at Austin and Joshua Tucker of New York University.

In other words, it heavily influenced what users saw and how much they used the platforms.

“But we also know that changing the algorithm for even a few months isn’t likely to change people’s political attitudes,” they said, as measured by users’ answers on surveys after they took part in three-month-long experiments that altered how they received content.

The authors acknowledged this conclusion might be because the changes weren’t in place for long enough to make an impact, given that the United States has been growing more polarized for decades.

Nevertheless, “these findings challenge popular narratives blaming social media echo chambers for the problems of contemporary American democracy,” wrote the authors of one of the papers, published in Nature.

Facebook’s algorithm, which uses machine learning to decide which posts rise to the top of users’ feeds based on their interests, has been accused of giving rise to “filter bubbles” and enabling the spread of misinformation.

Researchers recruited around 40,000 volunteers via invitations placed on their Facebook and Instagram feeds, and designed an experiment in which one group was exposed to the normal algorithm while the other saw posts listed from newest to oldest.

Facebook originally used a reverse-chronological system, and some observers have suggested that switching back to it would reduce social media’s harmful effects.

The team found that users in the chronological-feed group spent around half as much time on Facebook and Instagram as those in the algorithm group.

On Facebook, those in the chronological group saw more content from moderate friends, as well as more sources with ideologically mixed audiences.

But the chronological feed also increased the amount of political and untrustworthy content seen by users.

Despite these differences, the switch did not produce detectable changes in measured political attitudes.

“The findings suggest that chronological feed is no silver bullet for issues such as political polarization,” said coauthor Jennifer Pan of Stanford.

In a second paper in Science, the same team researched the impact of reshared content, which makes up more than a quarter of the content Facebook users see.

Suppressing reshares has been suggested as a means to control harmful viral content.

The team ran a controlled experiment in which one group of Facebook users saw no changes to their feeds, while another group had reshared content removed.

Removing reshares lowered the proportion of political content users saw, which in turn reduced their political knowledge — but again did not affect downstream political attitudes or behaviors.

A third paper, in Nature, probed the impact of content from “like-minded” users, pages, and groups, which the researchers found constitutes a majority of what active adult Facebook users in the US see in their feeds.

But in an experiment involving over 23,000 Facebook users, suppressing like-minded content once more had no impact on ideological extremity or belief in false claims.

A fourth paper, in Science, did, however, confirm extreme “ideological segregation” on Facebook, with politically conservative users more siloed in their news sources than liberals.

What’s more, 97 percent of political news URLs on Facebook rated as false by Meta’s third-party fact-checking program — which AFP is part of — were seen by more conservatives than liberals.

Meta welcomed the overall findings.

They “add to a growing body of research showing there is little evidence that social media causes harmful… polarization or has any meaningful impact on key political attitudes, beliefs or behaviors,” said Nick Clegg, the company’s president of global affairs.
