Twitter tests telling users their tweet replies may be offensive

When users hit “send” on their reply, they will be told if the words in their tweet are similar to those in posts that have been reported, and given the option to revise the reply before it is published.

Twitter has long been under pressure to clean up hateful and abusive content on its platform, which is policed both by users flagging rule-breaking tweets and by automated technology.

“We’re trying to encourage people to rethink their behavior and rethink their language before posting because they often are in the heat of the moment and they might say something they regret,” Sunita Saligram, Twitter’s global head of site policy for trust and safety, said in an interview with Reuters.