YouTube will manually vet its most popular channels for offensive content

YouTube soon plans to manually review content uploaded by its most popular channels, using human moderators alongside AI, in order to flag videos that are inappropriate for advertisements, Bloomberg reports, citing people familiar with the matter.

The process will apply to channels that are part of Google Preferred, a group of YouTube's top-tier channels whose ad spots are sold to advertisers at a premium price due to their reach and popularity. The company announced last year that it aims to assign over 10,000 employees to the task of manually auditing content for policy violations this year.

It is perhaps the chain of events of the past few months that has led YouTube to make this move. PewDiePie's scandalous conduct in September last year led Google to cancel his YouTube Red series, despite the YouTuber issuing a formal apology; more recently, Logan Paul found himself in trouble over an insensitive video that he uploaded, and YouTube canceled his Red shows as well.

In a statement to Bloomberg, a Google spokesperson made the company’s objective apparent:

"We built Google Preferred to help our customers easily reach YouTube’s most passionate audiences and we’ve seen strong traction in the last year with a record number of brands. As we said recently, we are discussing and seeking feedback from our brand partners on ways to offer them even more assurances for what they buy in the Upfronts."

In the advertising industry, upfront events are held by networks and attended by major advertisers, who purchase and reserve advertising spots "up front," several months before they begin to air. As Google put it in a 2015 blog post, "Upfronts exist to stir up enthusiasm for the medium, help marketers plan campaigns, and tantalize buyers in anticipation of the season's trading battles to come."

YouTube's motivations seem quite clear: it intends to sanitize its most popular content and offer advertisers a reliably uncontroversial space for their ads. Manual review of its most popular content, then, seems like an obvious solution, and certainly a better one than what its machine-learning algorithms are currently managing.
