Will The Kyle Rittenhouse Verdict Change How Facebook Treats Gun Violence—Again?
By Adam
When Kyle Rittenhouse shot and killed two men and wounded a third in August 2020, Facebook took relatively swift action. A day after the incident in Kenosha, Wisconsin, it removed his Facebook and Instagram accounts, began prohibiting posts praising him, and blocked his name from the apps’ search function.
The moves came as part of a new Facebook policy on violence and mass shootings that debuted that same week, though it’s unclear whether the policy dropped before or after Rittenhouse shot the men. And as part of its decision to reduce Rittenhouse’s profile on the platform, the company officially designated him a “mass shooter.”
The steps were immediately criticized within Facebook. In a post to the company’s internal Workplace message board days later, one employee wrote: “If Kyle Rittenhouse had killed 1 person instead of 2, would it still qualify as a mass shooting? Can we really consistently and objectively differentiate between support (not allowed) and discussion of whether he is being treated justly (allowed)?”
The Workplace post went on: “Can we really handle this at scale in an objective way without making critical errors in both under and over enforcement?”
The comment hits the mark. Facebook has spent years deliberating over what kinds of content it should regulate and how. The company has been criticized by liberals and conservatives alike, pulled in two directions and rarely pleasing either side.
Recently, pressure has mounted on the company to take a stronger stance against content that incites violence. That might seem like something that could attract universal support. It hasn’t. The situation grew more fraught on Friday, when a jury found Rittenhouse not guilty, sparking outcries among right-wing pundits who claimed Facebook had unfairly penalized him. (His lawyers successfully argued that he had acted in self-defense that August evening in Kenosha, a city then in the midst of protests over the police shooting of Jacob Blake.)
Facebook has long been reluctant to make judgment calls about what belongs on its site, and when it has prohibited material such as violent content, it has not always succeeded in keeping it off its platform. The most shocking example was the March 2019 mass shooting in Christchurch, New Zealand, in which the gunman livestreamed the entire incident …read more
Source: Social Media Explorer