

PostPosted: Thu May 19, 2016 12:57 am
by admin
The dark side of Guardian comments: As part of a series on the rising global phenomenon of online harassment, the Guardian commissioned research into the 70m comments left on its site since 2006 and discovered that of the 10 most abused writers eight are women, and the two men are black. Hear from three of those writers, explore the data and help us host better conversations online
by Becky Gardiner, Mahana Mansfield, Ian Anderson, Josh Holder, Daan Louter and Monica Ulmanu
April 12, 2016

NOTICE: THIS WORK MAY BE PROTECTED BY COPYRIGHT

YOU ARE REQUIRED TO READ THE COPYRIGHT NOTICE AT THIS LINK BEFORE YOU READ THE FOLLOWING WORK, THAT IS AVAILABLE SOLELY FOR PRIVATE STUDY, SCHOLARSHIP OR RESEARCH PURSUANT TO 17 U.S.C. SECTION 107 AND 108. IN THE EVENT THAT THE LIBRARY DETERMINES THAT UNLAWFUL COPYING OF THIS WORK HAS OCCURRED, THE LIBRARY HAS THE RIGHT TO BLOCK THE I.P. ADDRESS AT WHICH THE UNLAWFUL COPYING APPEARED TO HAVE OCCURRED. THANK YOU FOR RESPECTING THE RIGHTS OF COPYRIGHT OWNERS.


Comments allow readers to respond to an article instantly, asking questions, pointing out errors, giving new leads. At their best, comment threads are thoughtful, enlightening, funny: online communities where readers interact with journalists and others in ways that enrich the Guardian’s journalism.

But at their worst, they are something else entirely.

The Guardian was not the only news site to turn comments on, nor has it been the only one to find that some of what is written “below the line” is crude, bigoted or just vile. On all news sites where comments appear, too often things are said to journalists and other readers that would be unimaginable face to face – the Guardian is no exception.

New research into our own comment threads provides the first quantitative evidence for what female journalists have long suspected: that articles written by women attract more abuse and dismissive trolling than those written by men, regardless of what the article is about.

Although the majority of our regular opinion writers are white men, we found that those who experienced the highest levels of abuse and dismissive trolling were not. The 10 regular writers who got the most abuse were eight women (four white and four non-white) and two black men. Two of the women and one of the men were gay. And of the eight women in the “top 10”, one was Muslim and one Jewish.

And the 10 regular writers who got the least abuse? All men.

How should digital news organisations respond to this? Some say it is simple – “Don’t read the comments” or, better still, switch them off altogether. And many have done just that, disabling their comment threads for good because they became too taxing to bother with.

But in so many cases journalism is enriched by responses from its readers. So why disable all comments when only a small minority is a problem?

At the Guardian, we felt it was high time to examine the problem rather than turn away.

We decided to treat the 70m comments that have been left on the Guardian – and in particular the comments that have been blocked by our moderators – as a huge data set to be explored rather than a problem to be brushed under the carpet.

This is what we discovered.

To date, 1.4 million comments (2% of the total) have been blocked by Guardian moderators because they violated the Guardian’s community standards. Most of these are abusive to some degree (they may use insulting language, or be ad hominem attacks) or are so off-topic that they derail the conversation.

1 of 6
To see if men and women were treated differently by commenters, we began by classifying the authors of the articles by gender. While the number of articles published increased over time, the writers’ gender gap stayed pretty much the same, as it has in most media organisations.

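The calculation behind this first chart is essentially a grouped proportion. As a rough illustration only (the file name articles.csv and the columns published and author_gender are hypothetical stand-ins, not the Guardian's actual data), the yearly share of articles written by women could be computed like this:

    # Minimal sketch (hypothetical data): yearly share of articles written by women.
    import pandas as pd

    articles = pd.read_csv("articles.csv", parse_dates=["published"])  # assumed columns: published, author_gender
    articles["year"] = articles["published"].dt.year

    share_by_women = (
        articles.groupby("year")["author_gender"]
        .apply(lambda g: (g == "female").mean())  # fraction of that year's articles written by women
    )
    print(share_by_women)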

2 of 6
This gender gap is bigger in some sections than others. Sport had the smallest proportion of articles written by women, with World news and Technology not far behind. The only section that had significantly more articles written by women was Fashion.


3 of 6
Articles written by women got more blocked (ie abusive or disruptive) comments across almost all sections. But the more male-dominated the section, the more blocked comments the women who wrote there got (look at Sport and Technology). Fashion, where most articles were written by women, was one of the few sections where male authors consistently received more blocked comments.

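The comparison in this chart boils down to the proportion of comments blocked, split by section and by the gender of the article's author. A minimal sketch of that calculation, assuming a hypothetical comments.csv with columns section, author_gender and blocked (the real moderation data will look different):

    # Minimal sketch (hypothetical data): blocked-comment rate by section and author gender.
    import pandas as pd

    comments = pd.read_csv("comments.csv")  # one row per comment; assumed columns: section, author_gender, blocked

    blocked_rate = (
        comments.groupby(["section", "author_gender"])["blocked"]
        .mean()                    # mean of a boolean column = proportion blocked
        .unstack("author_gender")  # rows: sections, columns: author genders
    )
    print(blocked_rate.sort_values("female", ascending=False))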

4 of 6
Another way of looking at this is that, since around 2010, articles written by women have consistently attracted a higher proportion of blocked comments than articles written by men.


5 of 6
Some sections attracted more blocked comments than others. World news, Opinion and Environment had more than the average number of abusive or disruptive comments. And so did Fashion.


6 of 6
We also found that some subjects attracted more abusive or disruptive comments than others. Conversations about crosswords, cricket, horse racing and jazz were respectful; discussions about the Israel/Palestine conflict were not. Articles about feminism attracted very high levels of blocked comments, and so did articles about rape.


We focused on gender in this research partly because we wanted to test the theory that women experience more abuse than men. But both writers and moderators observe that ethnic and religious minorities and LGBT people also appear to experience a disproportionate amount of abuse.

What do we mean by ‘abuse’?: 'Imagine going to work every day and walking through a gauntlet of 100 people saying "You're stupid", "You're terrible", "You suck", "I can't believe you get paid for this". It's a terrible way to go to work'
by Jessica Valenti
Guardian writer


On the Guardian, commenters are asked to abide by our community standards, which aim to keep the conversation respectful and constructive; comments that fall foul of those standards are blocked. The Guardian’s moderators don’t block comments simply because they don’t agree with them.

The Guardian also blocks comments for legal reasons, but this makes up a very small proportion of blocked comments. Spam is not blocked (ie replaced by a standard moderator’s message) but deleted, and is not included in our findings; neither are replies to blocked comments, which are themselves automatically deleted.

The vast majority of blocked comments, therefore, were blocked because they were considered abusive to some degree, or were otherwise disruptive to the conversation (they were off-topic, for example). For the purposes of this research, we used blocked comments as an indicator of abuse and disruptive behaviour. Even allowing for human error in moderation decisions, the large number of comments in this data set gave us confidence in the results.
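In data terms, the indicator described above is just a filter over the moderation records: spam and replies to blocked comments are excluded (they are deleted, not blocked), and whatever moderators blocked is treated as abusive or disruptive. A minimal sketch, with entirely hypothetical field names (is_spam, parent_blocked, moderation_status):

    # Minimal sketch (hypothetical fields): blocked comments as the abuse/disruption indicator.
    import pandas as pd

    comments = pd.read_csv("comments.csv")

    # Spam and replies to blocked comments are deleted outright, so they are out of scope.
    in_scope = comments[~comments["is_spam"] & ~comments["parent_blocked"]]

    # Whatever moderators blocked is the indicator of abuse or disruptive behaviour.
    in_scope = in_scope.assign(abusive_or_disruptive=in_scope["moderation_status"] == "blocked")

    print(in_scope["abusive_or_disruptive"].mean())  # overall blocked rate (about 2% in this research)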

But what do we mean by abuse and disruptive behaviour?

At its most extreme, online abuse takes the form of threats to kill, rape or maim. Thankfully, such abuse was extremely rare on the Guardian – and when it did appear it was immediately blocked and the commenter banned.

Less extreme “author abuse” – demeaning and insulting speech targeted at the writer of the article or another comment – is much more common on all online news sites, and it formed a significant proportion of the comments that were blocked on the Guardian site, too.

Here are some examples: a female journalist reports on a demonstration outside an abortion clinic, and a reader responds, “You are so ugly that if you got pregnant I would drive you to the abortion clinic myself”; a British Muslim writes about her experiences of Islamophobia and is told to “marry an ISIS fighter and then see how you like that!”; a black correspondent is called “a racist who hates white people” when he reports the news that another black American has been shot by the police. We wouldn’t tolerate such insults offline, and at the Guardian we don’t tolerate them online either.

The Guardian also blocked ad hominem attacks (on both readers and journalists): comments such as “You are so unintelligent”, “Call yourself a journalist?” or “Do you get paid for writing this?” are facile and add nothing of value to the debate.

“Dismissive trolling” was blocked too – comments such as “Calm down, dear”, which mocked or otherwise dismissed the author or other readers rather than engaged with the piece itself.

We know that abuse online isn’t always aimed at individuals. Hate speech as defined by law was rarely seen on Guardian comment threads (and when it did appear it was blocked and the commenter banned). But xenophobia, racism, sexism and homophobia were all seen regularly. Take, for example, some of the comments left below an article on the mass drownings of migrant men, women and children in the Mediterranean: “These people contribute nothing to the countries they enter”; “The more corpses floating in the sea, the better”; “LET THEM ALL DROWN!” At the Guardian, comments like these are considered abusive and were blocked from appearing on the site.

The Guardian also blocked comments that would otherwise disrupt or derail the debate: “whataboutery” of various kinds, or remarks that are clearly off-topic. While not abusive in themselves, such comments serve to make a constructive debate impossible, and show a lack of respect to the journalist and to other commenters in the thread.

Sometimes moderation decisions are easy; at other times it can be difficult to know where to draw the line. All are based on the Guardian’s community standards, not moderators’ personal tastes and opinions.

Which comment would you block? Play the moderator role and take our quiz to see how your decisions compare to those of Guardian moderators

1 of 8: In an opinion piece about what makes one a "feminazi"

“Funny how so many journalists are female, and how many are feminists! A disproportionate number pollute journalism. Jusrt shows that men DO tend to do 'harder' jobs than keyboard bashing, while the technology that men designed and built is used to provide these harpies with a medium from which to spout their biased, sexist, hateful misandry.”


***

What harm is done?: 'Even if I tell myself that somebody calling me a nigger or a faggot doesn't mean anything, it has a toll on me: it has an emotional effect, it takes a physical toll. And over time it builds up'
by Steven Thrasher
Guardian writer


At the Guardian, readers and journalists can report abusive or off-topic comments, and moderators will quickly block them if they break the community standards. Moderation minimises the harm done by abuse that is posted on the site.

But for journalists, abuse is rarely confined to the site on which their work appears, and on some sites and social media platforms it can be very hard to get abusive posts removed. So for them, the abuse they receive below the article they have written is not experienced in isolation: each snarky comment, each spiteful tweet, is (as Zoe Quinn once put it) just one snowflake in an avalanche.

And avalanches happen easily online. Anonymity disinhibits people, making some of them more likely to be abusive. Mobs can form quickly: once one abusive comment is posted, others will often pile in, competing to see who can be the most cruel. This abuse can move across platforms at great speed – from Twitter, to Facebook, to blogposts – and it can be viewed on multiple devices – the desktop at work, the mobile phone at home. To the person targeted, it can feel like the perpetrator is everywhere: at home, in the office, on the bus, in the street.

People who find themselves abused online are often told to ignore it – it’s only words; it isn’t real life. But in extreme cases, that distinction breaks down completely, such as when a person is doxed, or SWATed, when nude photos are posted of the person without consent, or when a stalker assumes the person’s identity on an online dating site and a string of all-too-real men appear at their door expecting sex. As one woman who had this experience said: “Virtual reality can become reality, and it ruins your life.”

But in addition to the psychological and professional harm online abuse and harassment can cause to individuals, there are social harms, too. Recent research by the Pew Research Center found that not only had 40% of adults experienced harassment online but 73% had witnessed others being harassed. This must surely have a chilling effect, silencing people who might otherwise contribute to public debates – particularly women, LGBT people and people from racial or religious minorities, who see others like themselves being racially and sexually abused.

Is that the kind of culture we want to live in?

Is that the web we want?

***

How can we create the web we want?: ‘I think it is a worthy venture to keep comments open, even if you don't like what readers are saying or how they are saying it. Journalists need to be challenged’
by Nesrine Malik
writer and commentator


Even five years ago, online abuse and harassment were dismissed as no big deal. That is not true now. There is widespread public concern, and more support for anti-harassment proposals. But no one is pretending that this is an easy problem to fix – not on the Guardian’s comment threads, where most commenters are respectful, and where there is already a high level of moderation, and certainly not elsewhere on the web as a whole, where there are sometimes no safeguards at all.

The Guardian is committed to tackling the problem. This research is a part of that: an attempt to be open, and to share publicly what has been discovered. We hope to do more research to dig deeper into the problem, and to discover not only what can cause online conversations to go awry, but also what media organisations can do to help make those conversations better, and more inclusive.

The Guardian has already taken the decision to cut down the number of places where comments are open on stories relating to a few particularly contentious subjects, such as migration and race. This allows moderators to keep a closer watch on conversations that we know are more likely to attract abuse.

However, unlike many news sites, the Guardian has no plans to close comments altogether. For the most part, Guardian readers enrich the journalism. Only 2% of comments are blocked (a further 2% are deleted because they are spam or replies to blocked comments); the majority are respectful and many are wonderful. A good comment thread is a joy to read – and more common than the “don’t read the comments” detractors believe.

As Prof Danielle Keats Citron argues in her book, Hate Crimes in Cyberspace, abusive behaviour is neither normal nor inevitable. Where it exists, it is a cultural problem that, collectively, we must try to solve using all the means at our disposal: technological and social.

Which is where you come in. We want to hear from Guardian readers: when it comes to providing a space where everyone feels able to participate, what is the Guardian doing right, and how could we improve? Please take a moment to tell us here.