By Caitlin Dewey
June 10, 2015
1. No identifying information
2. No dissent / No being fat
3. Keep the peace
4. No links to other parts of Reddit
5. Absolutely NO FAT SYMPATHY
The posted rules for “Fat People Hate,” the largest forum Reddit banned today. (Internet Archive)
Reddit, the so-called “front page” of the Internet that 172 million people use monthly, has developed a reputation for allowing pretty much anything.
Creepshots of little kids? Check.
Even at a national memorial, no one is safe from ‘creepshots’
By Caitlin Dewey
October 10, 2014
A man who photographed women sitting on the steps of the Lincoln Memorial recently got away with it. (Photo by Jonathan Newton/The Washington Post)
Creepshots, one of the Internet’s many bizarre sexual scourges, are “repellent and disturbing,” a D.C. judge ruled Thursday — but they are not technically illegal.
In case you aren’t familiar with the term (and you might not want to be, FYI), creepshots are essentially just what they sound like: sneaky, surreptitious photos of a person’s, usually a woman’s, private areas, taken without her consent — and often, without her knowledge that the parts in question are even visible. They’re widely considered a genre of so-called “nonconsensual porn” — explicit images or videos traded without permission from the people they depict.
Online, these images enjoy a thriving trade on sites like Reddit, 4chan, AnonIB and even Twitter, where handles like @CreepBK, @SexySights and @Creep_daddy keep up a steady stream of skin-crawling photos. Both Reddit and AnonIB have forums dedicated to these types of photos, specifically; on AnonIB’s creepshot forum, users are currently salivating over a photo of a high-school student bending over to push a Wal-Mart cart as she shops. It’s obvious, from the number and location of the photos, that someone followed her around the store to take them.
“There were some pics of a similar looking girl kicking around the interweb a few years back,” one user wrote. “She was leaning over a barrier at her swimming pool talking to her swim coach … Anyone know the whereabouts?”
Someone else in the forum promptly supplied the photo: It is indeed of a (very young) girl at a swim meet, wearing what look to be swim shorts, watching people in the pool swim.
To many a reasonable observer, the existence of such a photo, and the intentions with which it was taken, seem self-evidently wrong. These are private people, going about their private lives. Why should a quick breeze or a bathing suit expose them to this kind of sustained, humiliating attention?
In the D.C. case, at least, the answer lay in a legal technicality. Voyeurism, a misdemeanor, has a very specific definition: You can’t secretly record someone using the bathroom, changing, having sex, or doing anything else where she has “a reasonable expectation of privacy.”
That reasonable expectation is what many cases hinge on. In this one, the creepshot-er in question, Christopher Hunt Cleveland, took his photos at the Lincoln Memorial as women sat on the steps. He didn’t sneak up under them, or use a peephole, or deploy any other similarly sneaky tricks. Instead, like the photographer who followed the woman in Wal-Mart, or the one who took pictures of the girl at the swim meet, he took pictures of public things, in a public place.
Never mind that leaning over for a moment, or wearing short shorts, does not in any way constitute a consent to be photographed. Never mind that, as U.S. Attorney Akhi Johnson argued, women have a reasonable expectation of privacy just by virtue of wearing clothes.
In purely legal terms, Cleveland’s creepshots don’t meet that standard. And so all charges against him were dropped, even as the stakes for women who visit the memorial were raised. Consider the implications of this for a second: If you were to visit the National Mall this weekend, and someone began taking photos of you or your children, there would be nothing you could do about it.
It is “repellent and disturbing,” but it is his right.
There are attempts to change that, of course. In March, after a Massachusetts court ruled that upskirt photos were legal in the state, the legislature quickly pushed through a bill to criminalize them. Creepshots in the Bay State are now punishable by up to five years in jail or fines of up to $5,000. Texas also recently passed a law against photographs taken “with the intent to arouse or gratify” sexual desire, though that was later struck down on First Amendment grounds.
Which means that — in Texas, as in D.C., as in much of the country — women can be photographed, objectified, and have the photos passed around the Web simply for the crime of leaving their houses. Repellent and disturbing, indeed.
Caitlin Dewey is The Post’s digital culture critic. Follow her on Twitter @caitlindewey or subscribe to her daily newsletter on all things Internet. (tinyletter.com/cdewey)
The “most violently racist content” on the Internet? Sure!
by Keegan Hankes
The most violently racist Internet content isn't found on sites like Stormfront and VNN anymore.
One section of the Web forum is dedicated to watching black men die, while another is called “CoonTown” and features users wondering if there are any states left that are “nigger free.” One conversation focuses on the state of being “Negro Free,” while another is about how best to bring attention to the assertion that black people are more prone to commit sexual assaults than whites.
But these discussions aren’t happening on Stormfront, which since its founding in 1995 by a former Alabama Klan leader has been the largest hate forum on the Web. They’re taking place on Reddit, a huge online bulletin board recently spun off into its own independent entity from Advance Publications, the parent company of Condé Nast. Reddit has been hailed as the last bastion of free speech on the Internet, an unregulated and vibrant community of users who post whatever they want and rely on the community around them to police their content.
The world of online hate, long dominated by website forums like Stormfront and its smaller neo-Nazi rival Vanguard News Network (VNN), has found a new — and wildly popular — home on the Internet. Reddit boasts the 9th highest Alexa Internet traffic ranking in the United States and the 36th worldwide. Many of Reddit’s racist subreddits are among its most popular.
Reddit is a news site that hosts user-submitted links and discussion, organized into specific communities of interest called “subreddits,” whose posts are ranked by votes from users. If a reader believes content is a constructive contribution, he or she can “upvote” it, pushing the content further up the page. Conversely, if a user thinks that content is either off-topic or not constructive, it can be “downvoted,” causing it to sink further down the page.
Content on Reddit is “moderated based on quality, not opinion,” according to the working document that dictates community guidelines, called “Reddiquette.” This idea of user-policed communities that contain high-quality, diverse content is part of the ethos Reddit has worked hard to project. “We power awesome communities,” reads the graphic atop its “about” page.
But awesome communities for whom?
Along with countless others with entirely different interests, Reddit increasingly is providing a home for anti-black racists — and some of the most virulent and violent propaganda around. In November 2013, a hyper-racist subreddit called “GreatApes” was formed. Users posted epithet-strewn links to “news” stories of dubious origin that riffed on long established stereotypes about the black community. GreatApes was wildly popular and grew quickly, expanding into a much larger Reddit network called “the Chimpire,” which was organized by a user known only by his or her posting name of “Jewish_NeoCon2.”
“We feel it’s time to expand our sphere of influence and lebensraum [the Nazi term for “living space”] on reddit. Thus we have decided to create ‘the Chimpire,’ a network of nigger related subreddits,” Jewish_NeoCon2 wrote at the time. “Want to read people’s experiences with niggers? There now is an affiliated subreddit for it. Want to watch chimp nature documentaries? We got it. Nigger hate facts? IT’S THERE. … Oh yes you bet we got videos of ghetto niggers fighting each other. Nigger drama on reddit? There’s a sub. Sheboons? Gibsmedat.”
Within a year, the Chimpire network had grown to include 46 active subreddits spanning an alarming range of racist topics, including “Teenapers,” “ApeWrangling,” “Detoilet,” and “Chicongo,” along with subreddits for both “TrayvonMartin” and “ferguson,” each of them dealing with the controversial and highly publicized shooting deaths of unarmed black teenagers.
Then, last November, Reddit’s most racist community evolved once again, adding the subreddit called CoonTown in the aftermath of a dispute between several top moderators at GreatApes. In just four days, CoonTown had reached 1,000 subscribers. And its popularity continues to grow.
According to Reddit Metrics, as of Jan. 6, there were 552,829 subreddits. CoonTown, with its 3,287 subscribers, ranked 6,279th, placing it in the top 2% of subreddits. It is the 680th fastest-growing subreddit on the site despite — or because of — violently racist material including a large number of threads dedicated to videos of black-on-black violence.
These gruesome videos show black men being hit in the head repeatedly with a hammer, burned alive, and killed in a variety of other ways. The subreddit’s banner features a cartoon of a black man hanging, complete with a Klansman in the background. One fairly typical user, “Bustatruggalo,” applauded the graphic violence as “[v]ery educational and entertaining.” He or she continued on a separate thread: “I almost feel bad for letting an image like this fill me with an overwhelming amount of joy. Almost….”
Others, like user “natchil,” were looking for still more. “Where is watchjewsdie?” this user wondered.
'Remember the Human'
There are some limits. “No calls for violence,” the CoonTown subreddit’s description reads. “It’s prohibited by Reddit’s site-wide rules.”
Everything up to violence, however, is very much there, including the horrific content found on other Chimpire subreddits like “WatchNiggersDie” — content which is rarely, if ever, matched on forums like Stormfront and VNN, which worry about being shut down or driving off potential allies.
That’s despite the Reddiquette section’s first rule, which implores Reddit users to “Remember the human.” “When you communicate online, all you see is a computer screen,” it says. “When talking to someone you might want to ask yourself ‘Would I say it to the person’s face?’ or ‘Would I get jumped if I said this to a buddy?’”
If Reddit’s rules seem relaxed, that’s because they are meant to be. Still, although users are asked to “remember the human,” there is little humanity in the way the subjects of subreddits like CoonTown are treated.
In June 2013, however, after an extended, public controversy, Reddit did ban the subreddit “Niggers” when large numbers of its denizens began overrunning another subreddit, “BlackGirls,” with racist posts that were apparently not being policed by its moderators. “Brigading” — when large groups of people from one subreddit gang up to downvote comments on another subreddit that they don’t normally visit — is prohibited by Reddit. Users of the Niggers subreddit also engaged in “vote manipulation,” which falsely raises the popularity of a post by soliciting like-minded users to blindly upvote it. After repeated warnings and “shadow-banning,” or making a user’s posts invisible to everyone but the author, the subreddit was finally banned. According to Jewish_NeoCon2, more than a few former members of the Niggers subreddit have now taken up residence at CoonTown.
A Reluctance to Intervene
Reddit was recently spun off into its own independent entity from Advance Publications, the parent company of mass media giant Condé Nast, which also owns Vanity Fair, The New Yorker and 20 other print and online publications that reach an estimated 95 million consumers. (Advance Publications is still a majority shareholder in Reddit.) The site’s goal, according to then-CEO Yishan Wong, is to pay its own way, primarily through ads, a premium subscription option, and the Reddit gift exchange.
Racist websites and organizations do sometimes benefit from racist subreddits like the Chimpire. That’s because subreddit users often post links to other racist sites, and those links drive traffic to those other sites, which in turn typically sell merchandise in addition to pushing racist ideology and recruiting.
It’s hard to dispute that Reddit does offer a venue for remarkably lively and unbridled conversation, and that dissident commentary that might not be tolerated elsewhere finds a welcome home there. Richard Spencer, a racist ideologue who heads the National Policy Institute, held an “AMA” (Ask Me Anything) session on Reddit last November, and although his views are widely regarded as loathsome, he was calm and understated in his discussion of far-right European politics. Unlike in WatchNiggersDie, there were no links to videos of brutal killings or other visual images meant to degrade the humanity of minorities.
Reddit is often hailed as one of the last bastions of truly free speech, and its owners’ hesitance to jeopardize that status is understandable given the loyal following it has inspired. Reddit has removed content that has been illegally appropriated from commercial interests, such as the revelations that emerged from the November hack of Sony Pictures Entertainment.
The Internet is awash in racist, anti-Semitic, misogynistic and other hateful content, but much of it is relatively tame. Subreddits such as the Chimpire offer a window onto just how awful some of the darkest corners of the Web really are.
“We will not ban questionable subreddits,” Reddit’s then-CEO, Yishan Wong, wrote mere months ago. “You choose what to post. You choose what to read. You choose what kind of subreddit to create.”
48 hours inside the Internet’s ‘most toxic’ community
By Caitlin Dewey
March 26, 2015
A screenshot from a Reddit 404 page. (Reddit)
Reddit, the front page of the Internet, hasn’t exactly had a banner year.
In August, it served as an early incubator for the Gamergate movement, which would go on to wreck the lives of several innocent women and baffle America’s non-gaming populace. Not long after, it became the main distribution center for a trove of controversial stolen celebrity nude photos, including at least one of the gymnast McKayla Maroney when she was underage. To further salt the site’s wounds, a widely publicized report released earlier this month accused the site of hosting hate groups. Reddit is, the Southern Poverty Law Center claimed, “the most hateful space on the Internet.”
… and the Internet can be a pretty hateful place.
As hardcore Redditors (and the site’s corporate owners) have pointed out, this criticism isn’t always entirely fair. Reddit is like a microcosm of the Internet itself: It’s so vast and labyrinthine and lawless that pretty much anything, good or bad, can make its home there. (“Reddit is the Mos Eisley spaceport of the Internet,” Slate’s Jacob Brogan wrote Wednesday. “A hive of scum and villainy that can carry you to the stars, if you ask around in the right places.”)
So two weeks ago, Ben Bell — a data scientist at the language-processing firm Idibon — set out to quantify exactly which Reddit communities were the proverbial worst. Using both language-processing software and a team of human annotators, whom you can read about in more depth here, he identified the forums where personal attacks and bigoted language were the most frequent.
At the top of the pack, ranked No. 1 for toxicity, was /r/S***RedditSays: a forum with some 64,000 members, devoted, counter-intuitively, to shaming racism, misogyny, homophobia and “toxic privilege” in the larger Reddit community.
“Take a second to think about how unwelcoming this site is for some groups,” the community’s moderators explain in its FAQ. “SRS lets those groups know that there is in fact a faction of vocal dissenters and that they aren’t alone.”
Determined to see how dark the so-called “darkest depth of the Interwebs” could possibly be, I spent 48 hours lurking in SRS and logging every conversation that bubbled up in it. The community is pretty strictly regulated: You can only post literal quotes from other Redditors, and only for the purpose of making fun of them. So the average SRS thread consists of an unsavory quote from elsewhere on Reddit, and then a long string of negative responses. Like:
Reddit: Rape victims lie frequently for “monetary gain.”
SRS: “How stupid do you have to be?”
Reddit: drops a casual racial slur while talking about ISIS.
SRS: “Terrorism is meant to polarize groups, and Reddit happily helps out with that.”
Reddit: “Grad school made me racist … There’s a reason [racial] stereotypes exist.”
SRS: “If you self-identify as racist, you must have a sad existence.”
As should be fairly clear, SRS isn’t the actual source of bigotry or vitriol on Reddit: It’s just a mirror of it, a concentrated reaction to the casual bias and stereotyping that play out in other corners of the site. In fact, when SRS began, it was intended more lightheartedly: a place to gather silly or stupid comments, the same way other variants of the “s*** people say” meme do. But over time, the Guardian reported earlier this month, it became an “enclave within the site for people who have deep concerns about the main community.”
Those concerns, judging by the comments SRS has flagged in the past week, most frequently relate to sexism, racism and religious bias. And SRS has not hesitated to voice its concerns forcefully: “we have found that fighting fire with fire is substantially more gratifying” than discussion, they wrote. (Bell says his report did control for context, so SRS was rated toxic for the tone of its discussion, and not the controversial topics.) A common refrain, in response to virtually any kind of post, is “f*** Reddit” or “f*** Redditors.”
Bigotry on Reddit
The more time I spent in SRS, however, the more I realized that this is not Reddit’s fight — a fact that one moderator acknowledged explicitly in a recent interview. SRS is the most toxic place on the Internet only insofar as this debate over inclusion, diversity and “social justice” remains the most toxic debate in our culture; a debate that has, in the past year, wound through #YesAllWomen, erupted into the inferno that was Gamergate, and reared its head with every new police shooting and rape allegation.
SRS may as well change its name to “s*** society says,” because that’s essentially what it documents: “the casual racism and sexism that is so popular,” and so insurmountable, even in mainstream, offline venues.
By the end of my allotted 48 hours, I was more than ready to log out of SRS permanently. Not because of the combativeness, necessarily, or the shaming or the “toxicity.” But I got worn down by the parade of human nastiness — and the futility of even trying to fight it.
“We are not here to ‘change reddit,'” the forum’s moderators write. “We don’t expect reddit to change. We know most redditors don’t really [care].”
They aren’t just talking about Reddit, though. And that’s a toxic problem, right there.
But in an apparent reversal of that policy, and in an unprecedented effort to clean up its long-suffering image, Reddit has just banned five “questionable subreddits.”
The site permanently removed the forums Wednesday afternoon for harassing specific, named individuals, a spokesperson said. Of the five, two were dedicated to fat-shaming, one to transphobia, one to racism and one to harassing members of a progressive video game site.
Inside the battle for the soul of Reddit
By Caitlin Dewey
May 16, 2014
Think of Reddit, the Internet’s self-proclaimed front page, as the plankton of the digital information ecosystem. The vast, labyrinthine network of forums, founded in 2005, is the site where all other sites go to feed: on memes, on news stories, on ideas or whiffs of them.
But contrary to the view from 10,000 feet, Reddit does not surface stories on the force of the crowd alone. Behind the Internet’s great trend-machine sits a complex, faceless hierarchy of volunteer moderators, called “mods.” Casual users never see them, and even avid Redditors — as the site’s denizens call themselves — have limited power to challenge them.
That has provoked something of an existential struggle in the Internet’s largest news forum, though few have articulated it that way. Is “the front page of the Internet” a democracy that is crowdsourced by virtual millions? Or is it a series of allied feudal kingdoms, steeped in abstract politics?
“The system has its flaws,” admitted Erik Martin, Reddit’s general manager. “But it’s a powerful system that for the vast majority of [the Web site] works great.”
“Works great” is, of course, a relative assessment. While a whopping 110 million people visited Reddit last month — by comparison, the mammoth CNN.com averaged 67 million monthly visitors in 2013 — segments of the site are rife with accusations of moderator censorship, neglect or abuse.
Never has that been clearer than in the past four months, when two of the site’s most popular forums, r/news and r/worldnews, repeatedly deleted a major scoop about British intelligence by Glenn Greenwald. Less than three weeks later, a user in another major forum, r/technology, reported that mods systematically blocked terms like “NSA” and “net neutrality.”
“This is real bad news,” concluded a Redditor codenamed “creq” in his post uncovering r/technology’s blocked terms. “This place is heavily censored.”
In reality, the forum wasn’t so much censored as poorly moderated. But in either case, the incident exposed a more troubling and more systemic drawback of the site: When you hand such profound power to anonymous moderators, the Internet is essentially at their whim.
A screenshot of the Reddit frontpage, as of this writing. (Reddit)
That’s particularly true on a site like Reddit, where the politics and power dynamics are opaque, even obtuse, to people on the inside. Essentially, Reddit consists of a series of forums, called “subreddits,” which anyone can create. Within each subreddit, users can submit posts (links, photos, questions, etc.) and either up- or down-vote the posts other users have submitted. Unlike Twitter and Facebook, with their ever-evolving, user-friendly interfaces, Reddit’s bare-bones design — sky-blue header, white background, plain Verdana text — has hardly changed since 2005. Each subreddit is a list of links. Each post includes a series of nested comments.
Posts that have been up-voted by large numbers of Redditors move up the subreddit’s “hot” list, thereby gaining visibility. Enough votes can push a post to Reddit’s front page, where it’s exposed to millions of people.
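The voting mechanics described above once had a publicly visible implementation: Reddit open-sourced its site code for years, and its classic “hot” sort has been widely documented. The sketch below is an approximation based on that public code, not the exact function running today; the epoch constant and sample vote counts are illustrative assumptions.

```python
from datetime import datetime, timezone
from math import log10

# Epoch used by Reddit's historically open-sourced ranking code (an assumption here).
REDDIT_EPOCH = datetime(2005, 12, 8, 7, 46, 43, tzinfo=timezone.utc)

def hot_score(ups: int, downs: int, posted: datetime) -> float:
    """Approximate Reddit's classic 'hot' ranking.

    Net votes are weighted logarithmically (the first 10 votes matter as
    much as the next 90), while every post earns a steady time bonus, so
    an older post needs exponentially more votes to outrank a fresher one.
    """
    score = ups - downs
    order = log10(max(abs(score), 1))          # diminishing returns on votes
    sign = 1 if score > 0 else -1 if score < 0 else 0
    seconds = (posted - REDDIT_EPOCH).total_seconds()
    return round(sign * order + seconds / 45000, 7)

# Example: a day-old post with 500 net votes vs. a fresh one with 50.
old = hot_score(500, 0, datetime(2014, 5, 15, 12, 0, tzinfo=timezone.utc))
new = hot_score(50, 0, datetime(2014, 5, 16, 12, 0, tzinfo=timezone.utc))
```

Because 24 hours of age is worth roughly two orders of magnitude of votes under this weighting, the fresher post wins here, which is why a burst of early upvotes, not total votes, is what pushes a post toward the front page.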
That up- and down-voting system would appear to signal a certain degree of democracy — and it does. As Martin explains, the great benefit of Reddit’s infrastructure is that it allows communities to create spaces entirely of their choosing.
But neither the Reddit front page, nor the network of subreddits behind it, operates without controls. Each subreddit is operated by a moderator, who can make and enforce any editorial decision he wants (including whether or not to appoint other mods). Most of the time, those decisions make sense and help the community run smoothly: deleting spam, blocking disruptive members, that kind of thing. But mods also have the power to delete posts they don’t like or whose politics they disagree with. Alternatively, they can slack off on their moderation duties to the point that the forum fills with junk or spam.
In either case, corporate Reddit tries not to intervene. “We don’t want to be referees,” Martin said. If the company’s three community managers must show their disapproval, they do it subtly — rearranging the highly visible default subreddits to which all users automatically belong.
In Reddit’s nine-year history, overseers have only had to demote default subreddits a handful of times. Last year, they bumped r/politics and r/atheism in favor of more active communities, like r/television and r/gifs. Then, just last month, another long-time default fell from grace: the embattled r/technology, which — with its 5 million members — is one of the largest forums on the site.
The battle for r/technology
A screenshot from creq’s post of filtered words in r/technology. (Reddit)
By comparison, the user who brought down r/technology was a small fish in the vast Reddit waters. Creq, a Redditor of only seven months, didn’t moderate any big-time forums or hoard any considerable store of karma — Reddit shorthand for street cred — when he noticed that certain topical words seemed to appear in the forum much less than one would expect. On April 13, he posted a list of “banned” keywords to r/technology, alleging that mods censored posts that contained those terms. The list included everything from “NSA,” “Snowden” and “spying” to “Dogecoin” and “Flappy.”
“Can we create a … spinoff, to get away from the anal mods?” one user commented.
“They’re not anal,” another fired back. “They’re corrupt.”
In point of fact, it seems, the r/technology mods were just complacent: with too many users, and too few active mods, they created a bot to automatically torpedo links on potentially spammy or politicized subjects. But that means that, for a window of several months, stories on those subjects — net neutrality, the NSA scandal, cryptocurrencies — disappeared from one of the Internet’s most important technology forums.
To further complicate things, many of r/technology’s mods were so-called “super-users” — senior Redditors who had accumulated massive amounts of karma and who often ran several major forums on the site. One of them also moderated r/bestof, r/food and r/history, all default subreddits. Another oversaw more than 360 other subreddits, including the wildly popular r/EarthPorn. (None of the moderators contacted by The Post responded to requests for comment.)
Critics wanted the top mods to resign, explain their policies, or some combination of the two. The ensuing back-and-forth, which lasted days, essentially brought the forum to a standstill. On April 17, Reddit pulled r/technology from the default subreddits banner, citing rampant dysfunction there.
“They had no critical mass to moderate the community,” Martin said. “We saw this as an opportunity to kind of say — hey, let’s see what happens. Maybe other subreddits will attract that audience.”
The war for Reddit
That principle, what Martin and others at Reddit HQ like to call “market fluidity,” is actually one of the fundamental underpinnings of the laissez-faire Reddit system. If moderators act out, the logic goes, Redditors can just take their business elsewhere — maybe to r/tech, which has grown markedly in recent weeks, or r/futurology, upon which Reddit recently bestowed default status. Market fluidity is, in theory, a check on moderators’ influence: If they become too powerful, or too irresponsible, users can simply leave.
Unfortunately, Martin admits, the system often doesn’t work that way in practice. A subreddit with a critical mass is not easily toppled, particularly when it’s held by a powerful mod with control of multiple subreddits on the site. Spin-off forums with odd names can be hard to find by search. Plus, those super-mods, some users have complained, belong to expansive political networks that can stifle dissent across subreddits or punish users who act out.
One particularly melodramatic Redditor, bemoaning the inequity during the battle for r/technology, compared the situation to the French Revolution: “When French peasant stormed the Bastille, pretty much every royal in Europe started hand-wringing and condemning popular dissent … see if you can apply the analogy to these facts.”
That metaphor’s a little dramatic, of course, but it expresses the essential, existential question of Reddit as it grows up. Can there be such a thing as pure democracy online? Or does the web require something else?
In the weeks since the great r/technology coup, Reddit has failed to answer that question. While the community has new moderators and a new transparency policy, it still suffers from intermittent in-fighting. Creq, the r/technology user who first spotted the censorship, is now a mod of the forum himself. The majority of his last thousand posts are retorts to Redditors accusing him of abuse.
The mods come and go, it seems; the system stays. And still, behind a curtain many Internet-users don’t even acknowledge, a cabal of faceless, nameless wizards work controls that we can’t see.
All five subreddits were warned previously, the company said. And administrators will watch the site carefully to make sure those five subreddits don’t pop up again.
“We want to be open about our involvement: We will ban subreddits that allow their communities to use the subreddit as a platform to harass individuals when moderators don’t take action,” the company said in a statement. “We’re banning behavior, not ideas.”
What kind of behavior, you ask? These are the five subreddits that were banned.
(Via Internet Archive)
In their own words: “Absolutely NO FAT SYMPATHY.”
Chief offense: A clearinghouse for lifted photos of overweight people from around the Web. The only rule for stealing and posting these photos — besides the aforementioned ban on “sympathy” — was that submissions include no identifying information.
Founded: October 2013
In their own words: “We are a Pro-Health sub! No ifs, buts, or coconuts. With that being said, if you’re a delusional lardmuffin this sub may be a bit offensive.”
Chief offense: Posting pictures of overweight people, frequently from Facebook, Flickr and similar photo-sharing sites, and relentlessly making fun of them in vicious comment threads. One recent photo showed a smiling couple standing outside with the caption “what a happy little hamily.”
(Via Internet Archive)
In their own words: “Tired of transgender people and their degeneracy? Disgusted by trans things? Hate the intolerant and whiny transgender community always playing the oppression card? This sub is for you. We aim to ridicule and mock the transgender community because they deserve to be laughed at.”
Chief offense: A running “tranny of the day” feature that pulled photos of individuals from Reddit’s pro-trans subs for the purpose of harassing them. “Mocking photos of [trans people] is okay,” the subreddit’s rules said, “but use imgur instead of linking to their submission if its on Reddit.” The purpose of that work-around, of course, is to avoid anyone discovering it. r/Transf–s also hosted threads on topics like “the best way to tell a [trans person] to kill themselves.”
Founded: December 2013
In their own words: “We generally allow anything. Anyone is welcome to post here and talk s—, or have a serious discussion. Here freedom of expression is sacred — not a lousy principle worked around to protect untenable ideologies and crybabies.”
Chief offense: Unlike the other banned subreddits, which targeted broad groups of people, r/Neof– had a narrower purpose: Harassing members of NeoGAF, a “civil, inclusive” gaming site, and its founder, Tyler Malka. Among other things, members posted pictures of GAF moderators and mocked their appearance.
Founded: July 2012
In their own words: “Listen to stuff n—–s say, both on Reddit and anywhere else on the web.”
Chief offense: SNS was a member of “The Chimpire,” a disgusting and wide-ranging network of racist subreddits that the Southern Poverty Law Center named the Web’s worst earlier this year. While the others are regrettably still around, SNS seems to have been banned for copy/pasting things black Redditors said in other forums and then going after them.
Notably, none of these forums were in violation of the Reddit rules even three or four weeks ago: The ban on “attacks and harassment of individuals” was just instituted by CEO Ellen Pao in mid-May. These five subreddits were the first to be axed, a spokesman said, because of the volume of user complaints.
Unsurprisingly, a vocal contingent of Redditors isn’t taking the changes well: “Reddit increases censorship,” read one post on r/freespeech, while forums like r/mensrights and r/opieandanthony theorized they would be next.
But as I wrote in February when the Reddit-knockoff Voat began flying the “anything goes” flag, that attitude — and this crackdown — is actually pretty indicative of the state of “free speech” on the Web. A number of sites that started out as absolutists have realized — particularly as they grow more mainstream — that they also have other corporate and moral responsibilities. If you restrict absolutely nothing, you get child porn. If you define “abuse” too broadly, you watch users leave in droves. Even Christopher Poole, the founder of 4chan, cracked down on his cesspool towards the end (before leaving the site in January, totally exhausted).
This is what happens when you create an online community without any rules
By Caitlin Dewey
January 13, 2015
8chan, the more-lawless, more-libertarian, more “free” follow-up to 4chan, disappeared from the Internet under predictable circumstances Monday: Multiple people complained to 8chan’s registrar that the message board hosted child porn.
8chan has since resurfaced at a new URL, 8ch.net, and purportedly recovered its original domain. But that doesn’t erase the inevitable lesson of the matter: When you create an Internet community with virtually no rules, things are bound to go down the drain.
That is not, needless to say, the philosophy of 8chan’s members — nor its polarizing, lionized overseer, Fredrick Brennan. Brennan, a 4chan user since age 12, started 8chan in October 2013 after taking mushrooms and dreaming up a “free speech friendly 4chan alternative.” Like 4chan, Brennan’s forums would be anonymous communities where users could post text and images in nested, themed comment threads. But unlike 4chan, Brennan promised, his Internet utopia would allow anything and everything — provided only that it didn’t violate U.S. law.
8channers discuss the best way to terrorize “social justice warriors,” or SJWs. (8chan)
To advocates of free speech and a free Internet, Brennan’s vision was refreshing — liberating, even. 8chan gained a small, loyal following on its launch in 2013 and blew up a year later when 4chan clamped down on Gamergate-related threads. Thousands of angry users fled to 8chan, quickly making it the second-most popular imageboard site on the Web.
“Imageboards are the most important medium for free speech on the Internet,” Brennan told Know Your Meme in the midst of that exodus. “Imageboards are a haven for [terrible things] … and that’s exactly what makes them such wonderful places. I wouldn’t change a thing.”
The problem with terrible things, of course, is that they tend to take on lives of their own. From the principled safety of 8chan, Gamergate supporters launched a number of campaigns against female journalists and videogame makers — some of which the FBI is purportedly investigating.
The rules for a popular doxing board. (8chan)
Meanwhile, Brennan has welcomed forums dedicated to pedophilia, suicide and concerted harassment or trolling. He does not personally police those forums for illegal content, per 8chan rules; instead, he trusts the creators of those forums and a “team of volunteers” to do it themselves.
It is no wonder that 8chan hosts, in the words of Gizmodo’s Chris Mills, “some of the nastiest s*** on the Internet.” Not explicitly illegal stuff, mind you, but stuff in the gray area, nonetheless: think threatening “dox” files on unsuspecting victims and softcore photos of children wearing thongs.
Brennan and his supporters — of whom there are many — point out that this is, legally, their right. Under Section 230 of the Communications Decency Act, a ’90s law that basically paved the way for free speech online, Web site administrators are not legally responsible for what their users post, no matter how gross those posts get.
A post on a popular pedophilia board. (8chan)
There are only two exceptions — copyrighted content and child porn — and 8chan claims to police those things closely. It’s worth noting, however, that when a number of people reported 8chan’s active pedophilia boards to Cloudflare, a company that protects the site from malicious traffic, Brennan took screenshots of their names and e-mail addresses … and tweeted them publicly.
Previously, asked what he thought about the pedophilia boards on his site, Brennan called them “simply the cost of free speech.”
Of course, free speech has another cost, as 8chan is learning. Sure, you can preach your absolutism from the rooftops, and promise a principled haven for even the most destructive of things. But maybe don’t be super-surprised when your domain gets seized.
Racist and hateful and harassing speech won’t disappear with these subreddits, of course. Already, a number of them have made the leap to Voat.
What they don’t realize, however, is that if Voat grows more popular, it will also need to begin cleaning house. And then, in the same tired cycle, someone else will deservedly kick them out.
Caitlin Dewey is The Post’s digital culture critic. Follow her on Twitter @caitlindewey or subscribe to her daily newsletter on all things Internet. (tinyletter.com/cdewey)