YouTube will remove thousands of videos advocating extremist ideologies such as neo-Nazism and white supremacism. This effort aims to eliminate discriminatory or hateful content.
“We specifically prohibit videos that affirm the superiority of a group in order to justify discrimination, segregation or exclusion based on criteria such as age, gender, race, caste, religion or orientation,” the company outlined in a blog post.
The new rules came into effect on Wednesday, but “it will take time for our systems to upgrade, and we will gradually expand coverage in the coming months,” the post states.
“That would include, for example, videos that promote or glorify Nazi ideology, which is inherently discriminatory,” it says. The post also announces that YouTube will remove content “denying the existence of well-documented violent events, such as the Holocaust or the shooting at Sandy Hook Elementary School.”
YouTube has not named any specific channels or videos affected by the new policy.
Social networks under pressure
Like other social networks, YouTube has been criticized for not doing enough to quickly remove content that promotes hatred or violence or that peddles conspiracy theories.
At the same time, some public figures, including US President Donald Trump, accuse social media platforms of censoring right-wing views. These censorship concerns have been fueled by the exclusion of several right-wing and far-right political commentators from some social networks in recent months.
“The context is important”
YouTube says it is aware that deleted videos may hold value for researchers and non-governmental organizations seeking to understand hatred in order to better combat it. The company is exploring options for making this content available to them.
The company also points out that “context is important,” so some videos could remain online because they discuss legislation, condemn or denounce hatred, or provide news analysis.
YouTube also announced that restrictions introduced in January in the United States — applying to content that does not quite violate the rules but is nonetheless problematic — will be extended to other countries in 2019.
These restrictions have reduced the number of views of this type of content by 50%, according to the blog post.
Meanwhile, for people who have watched content considered problematic, YouTube will promote verified, authoritative content in its recommendations, shown in the column on the right of the screen.