YouTube purges 17,000 hate speech channels after policy change



YouTube this week issued an update on its efforts to remove hate speech from its platform — and it appears it has made some headway. It claims to have removed over 100,000 videos and 17,000 channels specifically for hate speech, which is both good and kind of terrifying when you think about it.

The hate speech policies changed earlier this year, with YouTube “specifically prohibiting videos alleging that a group is superior in order to justify discrimination, segregation or exclusion based on qualities like age, gender, race, caste, religion, sexual orientation or veteran status.” It also added it’d be removing videos glorifying Nazi ideology, and those that denied “well-documented violent events” like Sandy Hook or the Holocaust.

That was in June 2019, so better late than never, right?

If you want to look at it another way, it’s kind of horrifying to contemplate that YouTube had 17,000 channels with blatant enough hate speech that the site was able to identify and remove them. Many have complained about YouTube’s slipshod approach to policing its platform. Just before the new policy was announced, the New York Times published a report about how YouTube’s algorithms sexualized children by recommending videos of them partially clothed to users who’d previously watched similar videos, which didn’t exactly engender confidence in the platform’s algorithms.

Google and YouTube executives have in the past attempted to defend the site’s laxness with regard to objectionable content by pointing out how large the site is — Google CEO Sundar Pichai gave an interview to Axios at the time the new hate speech policies rolled out and said, “YouTube has the scale of the entire Internet.” YouTube apparently has a team of 10,000 people reviewing videos for objectionable content, and I have no doubt the team could be even bigger and everyone would still have a full docket.

It’s also worth remembering the circumstances under which this hate speech policy was put into effect. In addition to the aforementioned report about child safety, YouTube was also caught in a kerfuffle over alleged hate speech targeted towards a journalist. At first the company insisted the speech didn’t violate its policies, but later reversed course. It was grilled especially hard over this considering the alleged hate speech in question was homophobic in nature and this all happened in the middle of Pride Month. So the hate speech policies were welcomed, but didn’t exactly seem to address the immediate problem.

Regardless, YouTube also touts that it has improved its ability to remove hateful videos before they reach a broad audience. As an example of how effective its team has been, its announcement reads, “…the nearly 30,000 videos we removed for hate speech over the last month generated just 3% of the views that knitting videos did over the same time period.”

That feels like a dig at knitting videos. As a consumer of knitting videos, I’m not sure if I should be annoyed or not.

I’m sure YouTube’s got a fair bit to go if it hopes to stamp out hate speech on the platform entirely. But still, 17,000 channels isn’t anything to sneeze at. There’s always the fear the company could sweep up non-hateful channels in the purge — not to mention that trying to quash hate speech is frequently akin to a game of Whac-a-mole — but all things considered, that’s a risk I’m comfortable with.


Source: The Four Rs of Responsibility, Part 1: Removing harmful content (YouTube Blog)
