YouTube removed over eight million videos in the last quarter of 2017

YouTube has released a transparency report on how it enforces its community guidelines, which prohibit content related to "pornography, incitement to violence, harassment, or hate speech", among other things. With the help of machine learning algorithms, the company announced it removed almost 8.3 million videos from its platform between October and December 2017.

According to a blog post announcing the report, almost 6.7 million of those videos were first flagged by machines, and 76 percent of them were removed before anyone watched them. The use of machine learning has drastically improved response times for removing abusive content since the techniques were introduced in June 2017, as can be seen in the chart below for videos flagged for containing violent extremism.
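To put those figures in perspective, here is a rough back-of-the-envelope calculation using the rounded numbers quoted in the report (the exact counts are not published, so these are approximations only):

    # Rough arithmetic based on the rounded figures quoted in the report.
    total_removed = 8_300_000        # ~8.3 million videos removed, Oct-Dec 2017
    machine_flagged = 6_700_000      # ~6.7 million first flagged by automated systems
    zero_view_removals = 0.76 * machine_flagged  # 76% taken down before any views

    print(f"Share of removals first flagged by machines: {machine_flagged / total_removed:.0%}")  # ~81%
    print(f"Removed before receiving a single view: ~{zero_view_removals / 1e6:.1f} million")     # ~5.1 million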

Humans also flagged more than 9.3 million videos over the same period, mostly for sexual content, spam, or hateful and violent content. Almost 95 percent of those human flags came from ordinary users, with much of the remainder coming from a group YouTube calls Trusted Flaggers.

YouTube is also introducing a new tool, the Reporting History dashboard, which lets users follow the status of videos they have flagged as the company reviews them against the Community Guidelines. The dashboard is already available and also shows videos that were age-restricted rather than taken down entirely after review.

Sources: YouTube (1) (2) via The Guardian
