In an effort to curb cyberbullying, Instagram is rolling out a new AI feature that automatically detects whether a comment is offensive and notifies the user before it is posted. The Hill reports: In an example included in the company's release, Instagram shows a user trying to comment "You are so ugly and stupid." Instagram follows up with a message asking the user "Are you sure you want to post this?" along with an "undo" button. "From early tests of this feature, we have found that it encourages some people to undo their comment and share something less hurtful once they have had a chance to reflect," Instagram said.
To further protect users from unwanted interactions, Instagram said it will start testing a new "restrict" feature. Restricting someone makes that person's comments visible only to themselves; the account holder can then choose whether to make a restricted person's comments visible to others by approving them. Restricted users also will not be able to see when an account is active or when a person has read their direct messages.