Recently, movements like #MeToo and #TimesUp have shown the power of social media to engage people in a global conversation about women’s rights. But as we see the rise of women standing together, we also see an increase in online violence and abuse against women. Online violence against women is an expression of entrenched gender inequality across society. It has found a particularly fertile home on social media, where people feel emboldened to target women in ways that are inconceivable in face-to-face interaction. The sheer volume of abuse has reached alarming levels and it’s clear not enough is being done to tackle it.
Over the past 16 months, Amnesty International has been researching online violence and abuse against women. Our research shows that this widespread abuse deeply affects women’s rights to freedom of expression and equal participation in society. The conclusion is that social media, where trolls can too easily get away with spouting abusive, violent and sexualised threats, has become a toxic place for women.
How abuse leads women to self-censor or quit
Twitter is one of the largest social media companies in the world, and it was repeatedly highlighted by the women we spoke with in the US and the UK as a particularly toxic platform, due to the speed at which content - including violent and abusive content - can spread.
Through in-depth interviews with journalists, politicians, activists, artists and public figures, as well as women without large Twitter followings, we were able to unpick the disturbing extent of online violence and abuse against women on the platform. Threats of rape and death, along with homophobic, racist and transphobic slurs, were repeatedly highlighted by the women we spoke with. It’s clear that receiving such abuse can have a deep psychological impact, with many women feeling they had to self-censor or quit the platform altogether for fear of further abuse.
Many women Amnesty spoke with described how they had reported multiple tweets to Twitter, with very few receiving a response. One UK journalist told Amnesty that she reported 100 abusive tweets, of which Twitter removed just two. On numerous occasions, women told Amnesty that the abusive tweets they reported were judged "not to be in breach of Twitter’s community standards".
It’s clear that Twitter is currently failing to do enough to tackle the issue and ensure all users can realise their right to freedom of expression whilst using the platform. The company is failing to let users know how it interprets and enforces its policies, or how it trains content moderators to respond to reports of violence and abuse, and its response to abuse is inconsistently enforced. Often, reports of abuse are not responded to at all.
Not everyone experiences violence and abuse in the same way – there isn’t a unified ‘woman’s’ experience of Twitter trolling. Women are targeted in different ways according to their gender, race, class, sexuality and other parts of their identity which intersect to create unique experiences of online violence. In practice, this means that some women experience abuse on multiple grounds because of their intersecting identities.
The intersectional nature of online violence against women emerged starkly in Amnesty’s monitoring of online violence and abuse against women MP candidates in the UK General Election last June. Using a mix of computer analysis and in-depth interviews with women MPs, we uncovered the extent of violence and abuse they face on Twitter. For example, Ruth Davidson told us about the homophobic abuse she receives and how she lost faith in the reporting process after failing to see action from Twitter. She also shared how the deluge of abuse makes it challenging to engage in genuine conversation on Twitter.
We also found that while many women from across all the political parties experienced online abuse, Diane Abbott MP received almost half (45.14%) of all abusive tweets in the six weeks leading up to the 8th June election. The abusive tweets she received targeted both her gender and race through misogynistic and racist language. Asian women MPs received 30% more abusive tweets per MP compared to white women MPs, even though they represent just 8.8% of all MPs. Reading through the tweets they received clearly illustrated how a woman’s intersecting identities are targeted by online trolls.
What Twitter needs to do
Amnesty’s report concludes that Twitter should develop and implement a human rights due diligence process to identify, prevent and remedy human rights violations. It needs to assess - on an ongoing and proactive basis - whether its policies are fit for purpose. Given the nature of the violence and abuse taking place on Twitter, it is necessary for due diligence to be informed by gender analysis, as well as analysis of other identity based human rights violations.
Twitter also needs to be more transparent and share comprehensive and meaningful data about the nature and levels of violence and abuse against women - as well as other groups - on the platform, and how the company responds to it.
And Twitter should undertake far more proactive measures in educating users and raising awareness about security and privacy features on the platform that will help women create a safer and less toxic Twitter experience.
While our research concentrates on Twitter, states too have obligations to prevent online violence against women. We welcome the Law Commission’s review into trolling laws commissioned by the Government, and we recommend that it interrogate whether the law protects the different groups of women who are targeted by online violence and abuse because of their gender and other parts of their identity.
In addition to legislative reform, education is critical to tackling violence against women: we welcome the Government’s commitment to introduce sex and relationship education in schools and recommend that it cover the online environment.
The future of tackling violence against women must consider how it plays out online, and companies like Twitter have a key role to play in protecting human rights. Social media companies urgently need to improve their analysis of online violence, and they must put the necessary resources in place so that they can properly enforce their own rules to prevent the abuse. If they don’t, they risk further silencing women online.
Chiara Capraro is Women’s Human Rights Programme Manager at Amnesty International UK