The UK’s communications regulator, Ofcom, has published new findings suggesting that a third of people who accessed video-sharing websites such as YouTube came across hateful content in the last three months, an indication that content policing may not be working. The findings were released to coincide with new rules from Ofcom that video-sharing platforms (VSPs) must comply with.
Ofcom’s study found that this hateful content was typically directed at particular racial or religious groups, transgender people, or people on the basis of their sexual orientation.
Beyond that, a quarter of those asked said they had been exposed to bullying, abusive behaviour, and threats. A fifth of respondents said they had witnessed or experienced racist content online, and those from a minority ethnic background were more likely to have encountered it.
As younger people tend to be more adept with technology, it’s unsurprising that Ofcom found 13- to 17-year-olds were more likely to have been exposed to harmful content online in the last three months. Seven in ten VSP users who responded said they had come across harmful content of some kind, but this rose to eight in ten among 13- to 17-year-olds.
The regulator also found that 60% of VSP users who responded were unaware of the safety and protection features on the sites they use, and only 25% had ever flagged or reported content they considered harmful. To help address this, Ofcom has told VSPs that they need to introduce clear upload rules and make it easy to flag or report content, and it said adult sites should introduce age-verification systems.
If sites fail to comply with the new rules, Ofcom will investigate and take action. The measures it could enforce include fines, requiring a provider to take specific steps, and, in serious cases, restricting access to the service.