Inside Facebook’s struggle to contain insurrectionists’ posts

Source: Politico | October 25, 2021 | Alexandra S. Levine

Facebook’s rules left giant holes for U.S. election falsehoods to metastasize. On the day of the Capitol riot, employees began pulling levers to try to stave off the peril.

In the days and hours leading up to the Jan. 6 Capitol insurrection, engineers and other experts in Facebook’s Elections Operations Center were throwing tool after tool at dangerous claims spreading across the platform — trying to detect false narratives of election fraud and squelch other content fueling the rioters.

But much of what was ricocheting across the social network that day fell into a bucket of problematic material that Facebook itself has said it doesn’t yet know how to tackle.

Internal company documents show Facebook had no clear playbook for handling some of the most dangerous material on its platform: content delegitimizing the U.S. elections. Such claims fell into a category of “harmful non-violating narratives” that stopped just short of breaking any rules. Without set policies for how to deal with those posts during the 2020 cycle, Facebook’s engineers and other colleagues were left scrambling to respond to the fast-escalating riot at the Capitol — a breakdown that triggered outrage across the company’s ranks, the documents show.

“How are we expected to ignore when leadership overrides research based policy decisions to better serve people like the groups inciting violence today,” one employee asked on a Jan. 6 message board, responding to memos from CEO Mark Zuckerberg and CTO Mike Schroepfer. “Rank and file workers have done their part to identify changes to improve our platform but have been actively held back.”

The documents include disclosures made to the Securities and Exchange Commission and provided to Congress in redacted form by the legal counsel for Frances Haugen, a Facebook whistleblower who left the company four months after the Jan. 6 violence. The redacted versions were reviewed by a consortium of news organizations, including POLITICO.

Facebook for years has been collecting data and refining its strategy to protect the platform and its billions of users, particularly during post-election periods when violence is not uncommon. The company has taken added precautions in parts of the world such as Myanmar and India, which have seen deadly unrest during political transitions, including using “break the glass” measures — steps reserved for critical crises — to try to thwart real-world harm.

Yet even with those insights, Facebook did not have a clear plan for addressing much of the activity that led to violence following the 2020 U.S. presidential election.

[…]
