Twitter will prompt users to reconsider sending offensive messages, as it tries to clean up conversations on the platform.
The company announced on Tuesday that it would be “running a limited experiment on iOS” in which users are prompted to revise a reply before it is published if Twitter detects that it contains language that could be harmful.
In an interview with Reuters, Sunita Saligram, Twitter’s global head of site policy for trust and safety, said: “We’re trying to encourage people to rethink their behavior and rethink their language before posting because they often are in the heat of the moment and they might say something they regret.”
Twitter’s policies do not allow users to target individuals with slurs, racist or sexist tropes, or degrading content, but the company has been criticised for allowing this content to exist.
The company took action against almost 396,000 accounts under its abuse policies and more than 584,000 accounts under its hateful conduct policies between January and June of last year, according to its transparency report.
“When things get heated, you may say things you don’t mean,” Twitter said in a statement.
“To let you rethink a reply, we’re running a limited experiment on iOS with a prompt that gives you the option to revise your reply before it’s published if it uses language that could be harmful.”