Online bullying has been a perennial problem across the web, and Instagram took steps late last year to help address this issue by launching a tool to identify bullying in photos posted on its platform. Today, the photo-sharing service announced the rollout of a new feature meant to expand the scope of its anti-bullying effort to comments made by users.
The new tool is powered by artificial intelligence: it detects potentially offensive comments and warns users before they post, in the hope of nudging them to rephrase their remarks into something less hurtful. The feature also keeps notifications about abusive comments from reaching the recipient.
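Instagram hasn't shared implementation details, but the flow it describes amounts to running a text classifier before a comment is submitted and prompting the author if it scores as offensive. Here is a minimal, hypothetical Python sketch of that flow; the scoring function, threshold, and every name below are assumptions for illustration, not Instagram's actual system:

```python
# Hypothetical pre-post comment check; none of these names or thresholds
# reflect Instagram's real internals.

def offense_score(comment: str) -> float:
    """Stand-in for a trained classifier returning a 0-1 offensiveness score."""
    # A production system would call a machine-learning model; this toy
    # version just counts words from a small blocklist.
    flagged = {"stupid", "ugly", "loser"}
    words = comment.lower().split()
    hits = sum(1 for w in words if w.strip(".,!?") in flagged)
    return min(1.0, 3 * hits / max(len(words), 1))

def submit_comment(comment: str, threshold: float = 0.5) -> str:
    """Warn the author before posting when the comment looks offensive."""
    if offense_score(comment) >= threshold:
        choice = input("This comment may be hurtful. Edit it before posting? [y/N] ")
        if choice.lower().startswith("y"):
            comment = input("Revised comment: ")
    return comment  # the (possibly revised) comment then gets posted

if __name__ == "__main__":
    print("Posted:", submit_comment("You are so stupid and ugly"))
```

The point the article emphasizes is that the check happens before the comment is published, giving the author a chance to self-correct rather than being silently moderated.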
Instagram says the feature showed promising results in early testing, with most people retracting or softening their derogatory comments after being given a chance to reconsider them.
If that feature isn't enough, Instagram also plans to introduce another tool designed to empower victims of online bullying. In the near future, the Facebook-owned service will begin testing "Restrict," a feature that lets users quietly shut a bully out of their conversations without the bully being aware of it. A comment from a restricted account will be visible only to the person who wrote it, though users have the option to make it visible to everyone else as well.
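Instagram hasn't described how Restrict works internally, but the visibility rule outlined above (a restricted account's comment is shown only to its author unless the owner approves it) maps naturally onto a simple filter applied when rendering a comment thread. The Python sketch below is purely illustrative; every class and field name is an assumption:

```python
from dataclasses import dataclass, field

@dataclass
class Comment:
    author: str
    text: str
    approved: bool = False  # the thread owner may approve it for everyone

@dataclass
class Thread:
    owner: str
    restricted: set = field(default_factory=set)  # restricted account names
    comments: list = field(default_factory=list)

    def visible_to(self, viewer: str) -> list:
        """Hide a restricted author's unapproved comments from everyone
        except that author, so the bully never notices the restriction."""
        return [
            c for c in self.comments
            if c.author not in self.restricted or c.approved or viewer == c.author
        ]

thread = Thread(owner="alice", restricted={"bully42"})
thread.comments += [Comment("bob", "Nice photo!"),
                    Comment("bully42", "This is awful")]

print([c.text for c in thread.visible_to("bully42")])  # the bully sees both
print([c.text for c in thread.visible_to("bob")])      # others see only "Nice photo!"
```

Because the restricted author still sees their own comment in place, nothing signals to them that anyone else can't.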
In addition, restricted users won't see your online status on Instagram, and they will have no way of knowing when you've read their direct messages. It's not clear when the company plans to roll out Restrict, but it should be a welcome development for users, especially teens, at a time when a recent Pew Research Center study found online bullying to be rampant.