Facebook extends hate speech ban to include white nationalism
Facebook announced Wednesday it is broadening its definition of banned hate speech and taking action against white nationalism and white separatism. The social media giant said those concepts are linked to organized hate groups and "have no place on our services." Facebook had previously banned posts endorsing white supremacy.
The move comes less than two weeks after a white supremacist live-streamed his deadly attack at a New Zealand mosque. Facebook hopes its new policy will help prevent the promotion of hateful content, reports CBS News' Jeff Pegues.
Starting next week, Facebook and Instagram will ban "praise, support and representation of white nationalism and separatism," saying "it's clear that these concepts are deeply linked to organized hate groups."
"There was a huge gaping hole that allowed violent white supremacists, and neo-Nazis and racists to exploit the Facebook platform," said civil rights attorney Kristen Clarke.
Self-avowed white nationalists used Facebook to organize the Unite the Right rally in Charlottesville, where James Fields ran down and killed a counter-protester in 2017. He pleaded guilty to committing a hate crime Wednesday. Facebook said it has been cracking down on hate speech; last year, the company said it took action on nearly eight million pieces of content that violated its rules.
"The volume need not stop them from doing their job and doing their part to make sure that they're standing up to hate," Clarke said. "It is so critical that we get Facebook and other companies across the tech sector to do their part. We must hold them accountable."
On Wednesday, Facebook acknowledged it needs "to get better and faster at finding and removing hate from our platforms." It said the artificial intelligence it uses to find material from terrorist groups will now be applied to a broader range of hate groups.
YouTube and Twitter both say their user agreements also prohibit violent and hateful content.