Amnesty’s #ToxicTwitter report calls for rules on abuse to be enforced

Twitter was founded 12 years ago today, and Amnesty have launched their #ToxicTwitter report to remind the social media giant that it needs to put more substance behind its recent statement that it would “stand with women around the world to make their voices heard and their presence known”.

Amnesty have collected interviews and testimony from women who experience violent threats, sexism and racism on a daily basis. Often it’s the already marginalised who experience the most intense abuse on Twitter (as well as other social media platforms).

One of Amnesty’s contentions is that “Twitter isn’t putting their own rules on abuse and hateful conduct into practice, or showing us how they deal with abuse when it’s reported”.

Women in Northern Ireland were among the 86 women who provided testimony from the UK and US.

Irish News journalist Allison Morris explained:

“Some of the things that have been put on Twitter about me have had people say they know where I live, I’ve had people say that they’ll be outside my work, I’ve had people not just threaten me but also say things that, you know, are clearly veiled threats against my family.”

Working with social media as well as just consuming it, I have long been aware that women are as likely to receive comments about their appearance as about the content of what they say and do online, and that armchair social media warriors often suggest vile things that they wouldn’t dare say in a face-to-face setting.

However, sitting behind the camera filming the local interviews for Amnesty NI, I was genuinely shocked by the level, tone and volume of abuse that the participants received. It was appalling, and at times moving, to witness the emotional impact of the abuse received online. While some turned off notifications to limit their exposure to the stream of hateful content, many still needed to use Twitter for work purposes and had to wade through the abusive replies to find genuine messages.

In most cases, the interviewees felt that the social media platforms’ rules and guidelines were sufficient in theory. It was the implementation and enforcement of those rules that fell badly short.

Alliance party leader Naomi Long summed it up:

“In respect of reporting abuse on Twitter, I think it’s a wholly ineffective process, I have to be honest. I have reported abuse and unless it is very specific … they don’t take it seriously.”

Reporting content that at face value breaches the Twitter Rules had minimal effect. Some had no success in getting content removed. Some had minimal success. Some had long since given up even trying to report, given how fruitless the process was. Twitter should find that lack of engagement disturbing, as it points to an ineffective and broken process.

This wasn’t just the case with Twitter: Facebook’s enforcement of its community standards was also described as poor. The same vile image was posted on two different Facebook pages. Reporting both images saw one removed for being in breach of community standards, while the other report was rejected on the grounds that the content was not in breach.

Trans-activist and former election candidate Ellen Murray explained:

“Over and above it’s marginalised groups and minority groups who experience abuse on Twitter and other social media platforms and for people to feel welcome and people to be legitimately able to use those services, they need to be able to get … recourse to justice.”

Alan Meban. Normally to be found blogging over at Alan in Belfast where you’ll find an irregular set of postings, weaving an intricate pattern around a diverse set of subjects. Comment on cinema, books, technology and the occasional rant about life. On Slugger, the posts will mainly be about political events and processes. Tweets as @alaninbelfast.