Facebook Has a Rule to Stop Calls to Arms. Moderators Didn't Enforce It Ahead of the Kenosha Shootings.
Written by Ryan Mac and Craig Silverman, BuzzFeed
Tuesday, 17 November 2020 14:03
Excerpt: "In August, following a Facebook event at which two protesters were shot and killed in Kenosha, Wisconsin, Mark Zuckerberg called the company's failure to take down the event page asking militant attendees to bring weapons 'an operational mistake.'"
In August, following a Facebook event at which two protesters were shot and killed in Kenosha, Wisconsin, Mark Zuckerberg called the company’s failure to take down the event page asking militant attendees to bring weapons “an operational mistake.” There had been a new policy established earlier that month “to restrict” the ability of right-wing militants to post or organize in groups, Facebook’s CEO said, and under that rule, the event page should have been removed.

BuzzFeed News has learned, however, that Facebook also failed to enforce a separate, year-old call to arms policy that specifically prohibited event pages from encouraging people to bring weapons to intimidate and harass vulnerable individuals. Three sources familiar with the company’s content moderation practices told BuzzFeed News that Facebook had not instructed third-party content moderators to enforce a part of its call to arms policy, which was first established in June 2019.

“What we learned after Kenosha is that Facebook’s call to arms policy didn’t just fail,” said Farhana Khera, the executive director of Muslim Advocates, a civil rights organization that pressured Facebook to create the rule. “It was never designed to function properly in the first place.”

Facebook’s lack of enforcement of its call to arms policy exacerbated its failure to prevent violence in Kenosha, where, on the night of Aug. 25, armed right-wing militants heeded a call on Facebook to counter protests against the police shooting of Jacob Blake, a 29-year-old Black man. It also shows that while Facebook touts its stated policies, its enforcement of those rules can be haphazard and hamstrung by internal restrictions that leave its army of contract moderators unable to act in the face of dangerous content and organizations.

On Monday, 15 Democratic senators sent a letter to Zuckerberg condemning Facebook for abetting violence and hate speech against Muslims, as well as for its failure to enforce the call to arms policy.
The year-old rule was created in large part due to pressure from Muslim advocacy groups, which since 2015 had flagged multiple instances in which organizers of Facebook events had urged followers to bring weapons to mosques and other places of worship.

“We understand that the contractors who review user-reported content are not instructed to enforce a core component of the call to arms policy,” Sens. Chris Coons, Elizabeth Warren, Bernie Sanders, and others wrote to Zuckerberg. “It is not apparent that Facebook ensures meaningful enforcement of this policy, and that is not acceptable.”

Two sources who spoke on the condition of anonymity told BuzzFeed News that after discussions with Facebook, they learned that the company only selectively enforced its call to arms policy. If, for example, an event page specifically asked people to bring guns to a place of worship or other high-risk locations, a third-party content moderator had the ability to disable that event. If, however, an event page asked people to bring guns but didn’t explicitly intimidate a protected group or target a high-risk area, third-party moderators could not take action or rule against the page, those people said.

“Our global team of content reviewers are trained to enforce our policies including against hate speech, violence and incitement and dangerous organizations, but certain elements of our policies require additional context and expertise to enforce and in those cases, it is our specialized teams that review this content and take action accordingly,” said Facebook spokesperson Liz Bourgeois. She did not specifically say why the company’s “specialized teams” did not take action on the posts calling for people to bring weapons to Kenosha.

The lack of enforcement of a portion of the call to arms policy, which was introduced in June 2019 as the company was undergoing a civil rights audit, is part of the reason why the Kenosha Facebook event stayed online long after it led to the shooting deaths of two protesters. As BuzzFeed News previously reported, Facebook users had flagged the event page, “Armed Citizens to Protect our Lives and Property,” 455 times ahead of its start date on the night of Aug. 25, but four moderators had deemed it “non-violating” of any Facebook rules.

During a companywide meeting in late August following the night of violence, Zuckerberg acknowledged the company had made a mistake in not taking the event down sooner, particularly because it violated a rule labeling right-wing militant groups as “Dangerous Individuals and Organizations” for their celebrations of violence. The company did not catch the page despite user reports, Zuckerberg said, because the complaints had been sent to content moderation contractors who were not versed in “how certain militias” operate. “On second review, doing it more sensitively, the team that was responsible for dangerous organizations recognized that this violated the policies and we took it down.”

Zuckerberg made no mention of the call to arms policy, part of which Facebook had not instructed moderators to enforce, according to Khera. She told BuzzFeed News that shortly after the shooting deaths in Kenosha, Muslim Advocates held a call with members of Facebook’s policy, content, and dangerous organizations teams to understand what had happened.
“Based on Facebook’s response, it became clear not only that the call to arms policy should have applied to the Kenosha Guard event page, but also that the people whose job was to receive complaints about the page were not trained to enforce the policy and did not know they were supposed to escalate complaints about a call to arms,” Khera said. “As a result, when the complaints started coming in about Kenosha, the content reviewers effectively denied them and did not escalate anything.”

Facebook is now facing a lawsuit from Black Lives Matter protesters and the partner of a man who was killed in Kenosha. That suit, which was filed in September in federal court in Wisconsin, argues that Facebook was negligent in allowing the militant group Kenosha Guard to persist on its platform and ultimately create an event page. In previous statements, Facebook said the alleged Kenosha shooter did not RSVP to the Kenosha Guard’s event and did not follow its page. It remains unclear if he viewed the page or otherwise knew about the event.

“It seems like only a matter of time before this happens again,” said Joan Donovan, the research director of the Shorenstein Center on Media, Politics and Public Policy. “Facebook must confront its business model that favors scale over public health and well-being.”

On Tuesday, Zuckerberg and Twitter CEO Jack Dorsey are expected to testify in a Senate Judiciary hearing focused on censorship and platform moderation on the world’s largest social networks. With their letter, the 15 Democratic senators are signaling that Facebook’s leader may be in for plenty of questioning — and criticism — during the event. “As members of Congress who are deeply disturbed by the proliferation of this hate speech on your platform, we urge you to do more,” they wrote.
Harvard lecturer and content moderation researcher evelyn douek commended the senators’ letter for highlighting the gap between Facebook’s policy announcements and its application of those policies. “We've gotten to a place where for many of the major platforms, their policies on paper are mostly fine and good,” she said. “But the question is always, always ‘Can and will they enforce them effectively?’”