Will The Kyle Rittenhouse Verdict Change How Facebook Treats Gun Violence, Once Again?
Kyle Rittenhouse looks back during his trial at the Kenosha County Courthouse on November 17, 2021.
Sean Krajacic / Pool / Getty Images
When Kyle Rittenhouse fatally shot two men and wounded a third in August 2020, Facebook intervened relatively quickly. A day after the incident in Kenosha, Wisconsin, it removed his Facebook and Instagram accounts, began banning laudatory posts, and blocked his name from the apps’ search function.
The moves came under yet another new Facebook policy on violence and mass shootings, one that debuted the same week, though it’s unclear whether it took effect before or after Rittenhouse shot the three men. And as part of its decision to reduce Rittenhouse’s profile on the platform, the company officially designated him a “mass shooter.”
But the steps immediately sparked criticism from within Facebook. Days later, an employee wrote in a post on Facebook’s internal Workplace forum: “If Kyle Rittenhouse had killed 1 person instead of 2, would it still be a mass shooting? Can we really consistently and objectively differentiate between support (not allowed) and discussion about whether he is being treated fairly (allowed)?”
The Workplace post continued, “Can we really do this objectively and on a large scale without making critical mistakes in both under-enforcement and over-enforcement?”
The comment hits the nail on the head. Facebook has spent years deliberating over what types of content to regulate and how to do it, harshly criticized by liberals for not doing enough and by conservatives for doing too much. Pulled in both directions, it mostly satisfies neither side.
Lately it has been pushed to take a stronger stance on violent content and posts that could incite violence, a position one might expect to find universal support. It has not. And on Friday things got even more complicated for Facebook: A jury found Rittenhouse not guilty, rekindling outcry from right-wing pundits that Facebook had wrongly punished him. (His attorneys had successfully convinced the jury that he acted in self-defense that August evening in Kenosha, Wisconsin, a town engulfed by protests over the police shooting of Jacob Blake, a 29-year-old Black man.)
Facebook has long hesitated to make judgments about what belongs on its site, and when it has banned material such as violent content, it has not always succeeded in keeping that material off its platform. One of the most dramatic examples: the March 2019 shooting in Christchurch, New Zealand, in which the shooter broadcast his attack via livestream on Facebook and YouTube. According to an internal Facebook report on the Christchurch shooter, no one reported the video to Facebook until 29 minutes after it began, and no part of the video triggered Facebook’s automated moderation software. Facebook eventually stopped the feed, but it would spend the next day removing 1.5 million copies of the video. In response, Facebook changed a number of its policies on live video, including speeding up its software review of new live broadcasts. (Before Christchurch, a broadcast would typically run five minutes before the software reviewed it; the subsequent changes cut that to around 20 seconds.)
As with many changes to Facebook’s policies, these were reactive, and the company has often struggled to keep pace with events unfolding on its platform. In August 2020, shortly after the Rittenhouse shootings, Facebook chief Mark Zuckerberg admitted the company had erred by failing to take down a Facebook events page that encouraged a militia to assemble in the same Wisconsin city where Rittenhouse shot the three men. Facebook users reported the militia group 455 times before Facebook removed it. And in January, Facebook took action against posts related to the riot at the U.S. Capitol only after the uprising, even though, in the months after Joe Biden’s victory, sentiment delegitimizing the election had flourished on Facebook, another internal report shows.
The Rittenhouse verdict raises entirely new questions. When should a “mass shooter” label be affixed to someone: before or after a trial? When, if ever, should Facebook block posts? Should it scrap its mass shooter policy altogether?
Over the weekend, Facebook, which did not respond to requests for comment, reversed course again. It lifted its block on searches for “Kyle Rittenhouse” on Facebook and Instagram, and posts about Rittenhouse from right-wing media figures like Ben Shapiro and Dan Bongino have attracted tens of thousands of comments, reshares and reaction emojis, the signals that push posts higher in users’ Facebook feeds. A Facebook group, American Patriots Appeal, is promoting a Rittenhouse T-shirt. It costs $27.99 and shows him crouched, G.I. Joe-style, with a semi-automatic rifle. Next to him, a prominent slogan: “Kyle Rittenhouse did nothing wrong.”
The internal Facebook documents cited in this story come from material that Facebook whistleblower Frances Haugen submitted to the SEC; redacted versions went to Congress and a consortium of news organizations, including Forbes. They are popularly known as The Facebook Papers.