Facebook hits pause on algorithmic recommendations for political and social issue groups
With just days to go before the U.S. election, Facebook quietly suspended one of its most worrisome features.
During Wednesday’s Senate hearing, Senator Ed Markey asked Facebook CEO Mark Zuckerberg about reports that his company has long known its group recommendations push people toward more extreme content. Zuckerberg responded that the company had actually disabled that feature for certain groups — a fact Facebook had not previously announced.
“Senator, we have taken the step of stopping recommendations in groups for all political content or social issue groups as a precaution for this,” Zuckerberg told Markey.
TechCrunch reached out to Facebook at the time with questions about which kinds of groups would be affected and how long the recommendations would be suspended, but did not receive an immediate response. Facebook first confirmed the change to BuzzFeed News on Friday.
“This is a measure we put in place in the lead up to Election Day,” Facebook spokesperson Liz Bourgeois told TechCrunch in an email. “We will assess when to lift them afterwards, but they are temporary.”
The precautionary step disables recommendations for political and social issue groups, as well as for any new groups created during that window. Facebook declined to provide additional details about which kinds of groups will and won't be affected by the change, or what went into the decision.
Researchers who focus on extremism have long been concerned that algorithmic recommendations on social networks push people toward more extreme content. Facebook has been aware of this phenomenon since at least 2016, when an internal presentation on extremism in Germany observed that “64% of all extremist group joins are due to our recommendation tools.” In light of the feature’s track record, some anti-hate groups celebrated Facebook’s decision to hit the pause button Friday.
“It’s good news that Facebook is disabling group recommendations for all political content or social issue groups as a precaution during this election season. I believe it could result in a safer experience for users in this critical time,” Anti-Defamation League CEO Jonathan A. Greenblatt told TechCrunch. “And yet, beyond the next week, much more needs to be done in the long term to ensure that users are not being exposed to extremist ideologies on Facebook’s platforms.”
On Facebook, algorithmic recommendations can usher users flirting with extreme views and violent ideas into social groups where their dangerous ideologies can be amplified and organized. Before being banned by the social network, the violent far-right group the Proud Boys relied on Facebook groups for its relatively sophisticated national recruitment operation. Members of the group that plotted to kidnap Michigan Governor Gretchen Whitmer also used Facebook groups to organize, according to an FBI affidavit.
While Facebook's decision to toggle off some group recommendations sounds temporary, the company has made an unprecedented flurry of decisions to limit dangerous content in recent months, possibly out of fear that the 2020 election will again plunge it into political controversy. Over the last three months alone, Facebook has cracked down on QAnon, militias and language used by the Trump campaign that could result in voter intimidation, all surprising postures given its longstanding inaction and deep fear of decisions that could be perceived as partisan.
After years of relative inaction, the company now appears to be taking seriously some of the extremism it has long incubated, though the coming days are likely to put its new set of protective policies to the test.