Changes Made by Facebook to Fact-checking Controls for US Users

Facebook is giving users more control over their feed. (AFP)

Meta-owned Facebook has handed US users control over fact-checked content, in a potentially significant move that the platform says will give them more power over its algorithm but that some analysts warn could benefit purveyors of misinformation.

For years, Facebook’s algorithm automatically moved posts lower in the feed if they were flagged by one of the platform’s third-party fact-checking partners, including AFP, reducing the visibility of false or misleading content.

Under a new “content reduced by fact-checking” option that now appears in Facebook’s settings, users have the flexibility to make debunked posts appear higher or lower in the feed, or to maintain the status quo.

Fact-checked posts can be made less visible with an option called “reduce more.” That, according to the platform’s settings, means the posts “may be moved even lower in feed so you may not see them at all.”

Another option, labeled “don’t reduce,” triggers the opposite effect, moving this content higher in the feed and making it more likely to be seen.
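The three-way setting is easiest to picture as a multiplier applied to a flagged post’s ranking score. Facebook has not published how its algorithm implements this, so the sketch below is purely illustrative: the setting names, multiplier values, and scoring function are assumptions, not Meta’s actual code.

```python
# Illustrative sketch only: Facebook's real ranking system is not public.
# Models how a per-user "fact-checked content" preference might scale a
# flagged post's ranking score. All names and values are hypothetical.

from dataclasses import dataclass

# Hypothetical multipliers for each user setting.
DEMOTION_FACTORS = {
    "reduce_more": 0.1,   # push flagged posts far down, possibly out of view
    "reduce": 0.5,        # the default: flagged posts move lower in the feed
    "dont_reduce": 1.0,   # no demotion applied to flagged posts
}

@dataclass
class Post:
    post_id: str
    base_score: float          # relevance score from the ranking model
    fact_check_flagged: bool   # rated false/misleading by a fact-checker

def rank_feed(posts, user_setting="reduce"):
    """Order posts by score, demoting fact-checked posts per user preference."""
    factor = DEMOTION_FACTORS[user_setting]
    def effective_score(post):
        return post.base_score * (factor if post.fact_check_flagged else 1.0)
    return sorted(posts, key=effective_score, reverse=True)

# With the default setting the flagged post drops below a weaker post;
# with "dont_reduce" it keeps its original, higher position.
feed = [
    Post("flagged", base_score=0.9, fact_check_flagged=True),
    Post("normal", base_score=0.7, fact_check_flagged=False),
]
print([p.post_id for p in rank_feed(feed, "reduce")])       # ['normal', 'flagged']
print([p.post_id for p in rank_feed(feed, "dont_reduce")])  # ['flagged', 'normal']
```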

“We’re giving people on Facebook even more power to control the algorithm that ranks posts in their feed,” a Meta spokesman told AFP.

“We’re doing this in response to users telling us that they want a greater ability to decide what they see on our apps.”

Meta rolled out the fact-checking option in May, leaving many users to discover it for themselves in the settings.

It comes amid a hyperpolarized political climate in the United States that has made content moderation on social media platforms a hot-button issue.

Conservative US advocates allege that the government has pressured or colluded with platforms such as Facebook and Twitter to censor or suppress right-leaning content under the guise of fact-checking.

On Tuesday, a federal court in Louisiana restricted some top officials and agencies of President Joe Biden’s administration from meeting or communicating with social media companies about moderating their content.

Separately, misinformation researchers from prominent institutions such as the Stanford Internet Observatory face a Republican-led congressional inquiry as well as lawsuits from conservative activists who accuse them of promoting censorship, a charge they deny.

The changes on Facebook come ahead of the 2024 presidential vote, when many researchers fear political falsehoods could explode across social media platforms. The move has also prompted concern from some analysts that it could be a boon for misinformation peddlers.

“Downranking content that fact-checkers rate as problematic is a central part of Facebook’s anti-misinformation program,” David Rand, a professor at the Massachusetts Institute of Technology, told AFP.

“Allowing people to simply opt out seems to really knee-cap the program.”

Meta downplayed the concerns, saying it will still attach labels to content found to be misleading or false, making clear that it has been rated by one of its third-party fact-checkers. The company said it was exploring whether to expand this control to other countries.

“This builds on work that we’ve been doing for a long time in this area and will help to make user controls on Facebook more consistent with the ones that already exist on Instagram,” Meta’s spokesman said.

Aside from this control, Facebook is also allowing users to decide how much “low quality content,” such as clickbait and spam, and “sensitive content,” including violent or graphic posts, they want to see on the platform.

The impact of the changes, analysts say, is likely to become clear only over time, as more users, especially those who distrust professional fact-checkers, start tweaking their settings.

Fact-checkers, who are not able to review every post on the mammoth platform, routinely face an avalanche of online abuse from people who dispute their ratings, sometimes even from those peddling blatantly false or misleading information.

“Someone who dislikes or distrusts the role of fact-checkers could use it to try to avoid seeing fact-checks,” Emma Llanso, from the Center for Democracy & Technology, told AFP.

Facebook, she said, should research and test whether the control will increase or decrease users’ “exposure to misinformation” before rolling it out more widely around the world.

“Ideally they should share the results of that kind of research in an announcement about the new feature,” Llanso added.


2023-07-17 05:00:03
Original from www.ibtimes.com
