YouTube will send out warnings to users posting violent and abusive comments

In a bid to improve the comments experience for everyone, YouTube has made an important change to its Comment Spam & Abuse policy. Going forward, the video-streaming service will send a warning to users when it detects that a comment posted on its platform violates community guidelines and has therefore been removed.

Google believes that its comment removal warning will discourage users from posting negative comments and reduce repeat violations. However, if a person continues the same behavior, they may be blocked from posting comments for up to 24 hours, until the timeout lapses.

According to Google, the results of comment removal warnings and timeouts during the trial run were encouraging, helping protect creators from users trying to negatively impact them through comments.

Currently, the new notification system is only available for English comments; Google plans to bring it to more languages in the coming months. The company also asks users to provide feedback if they believe their comments were wrongly flagged by the system.

In addition to this change, Google has also improved spam detection on comments, removing over 1.1 billion spammy comments in the first six months of 2022 alone. It has also improved spambot detection to keep bots out of live chats.
