Instagram has made a recent change you should be aware of. Any time you post a comment, you may see a popup warning that it is "potentially offensive."
That is, you'll see the warning if the service's AI-powered tools conclude it's warranted, based on an analysis of other comments that have generated complaints.
Instagram isn't taking a hard-nosed approach here. If you get such a notification, you'll have the option to either edit your comment before posting or post it as is.
In July 2019, the company introduced a similar tool for policing comments left on user posts, and this latest change builds on that toolset.
There's a method to the company's madness and a broader purpose at work here. In October, Instagram added a "Restrict" feature that allows users to shadow-ban bullies. Prior to that, it began rolling out AI bots that scan content to proactively detect bullying in photos and captions, along with other routines that filter offensive comments.
One thing that differentiates this latest feature from the others is that it's only available in "select countries" for now, while the rest have been rolled out globally. Eventually that's destined to change, but Instagram hasn't said when that might happen.
Preventing cyberbullying doesn't get as much attention as it should. Cyberbullying is a real problem that leads to a tragic handful of deaths every year and causes no end of pain and suffering to its targets. So far, most companies haven't paid much more than lip service to doing something about it, so kudos to Instagram for taking this series of steps. While it remains to be seen how effective these efforts will be, the fact that the company is doing something is praiseworthy.