Things You Should Know About Hate Speech in Any YouTube Video


YouTube is changing day by day as it adapts to the social and cultural climate of 2019. And while it's no surprise that the video-sharing platform is becoming more and more prone to censorship, the question is whether all these changes are really needed. Before we address that point, let's note that, compared to recent changes in how ads are displayed, these new regulations should come with a positive outcome.

How Is Hate Speech Defined in a YouTube Video?

Well, it's quite interesting how YouTube is going to determine what is hate speech and what is not. Considering the sheer size of the platform and the amount of new content posted each day, one could even say it's impossible. Whether we're talking about the most viewed YouTube video or one that has reached 10 people, the platform shouldn't be used for spreading negative ideas. Recently, YouTube stated it would target hate speech and any YouTube video containing supremacist views, racial attacks of any kind, or even the denial of well-documented violent events, like the Holocaust. YouTube policy makes it easier for other users to flag such content as hate speech. However, as the company itself admitted, these videos should be stopped before they reach an audience. So, the next time you're brainstorming YouTube video ideas, make sure they won't go against YouTube's policies, which may well have been updated again by then.

Definitely Not the Last Change We’ll See

This approach to hate speech isn't an isolated event; it comes after a period of repeated revisions to the Terms and Conditions in these ever-changing times. Back in April, YouTube was forced to change its harassment policy after complaints about serious harassment cases between high-profile content creators.

Can It Be That YouTube Is Too Big for Its Own Good?

Sticking strictly to YouTube, the platform has had all kinds of slip-ups in recent times, which has put a lot of pressure on it to enforce and reform its policies, including the ones mentioned above. One YouTube video in particular stirred controversy about how much control YouTube really has over the content posted on it. Back in March, the Alphabet-owned company struggled to keep copies of videos depicting the mass shootings at mosques in New Zealand off its platform. Even though it became one of the most disliked videos on YouTube, it stayed online for plenty of time and was even shared on other channels.

Most people agree that stricter control over the content posted on YouTube and other social media platforms is needed. With that in mind, YouTube also announced it could make significant changes to content aimed at children following a federal investigation. This could mean we'd have an easier time whenever we want to remove a video from YouTube. Facebook, too, announced it would remove all self-harm images from its platform to limit exposure to sensitive subjects like suicide and abuse of any kind.

What do you think? Should YouTube continue to improve its policies on sensitive subjects like harassment, hate speech, or self-harm? Or should the Internet be left without censorship of any kind, since it was designed as a free environment? We'd be glad to hear your opinion on this controversial topic!
