Why Reddit Is Losing Its Battle with Online Hate

New research shows how the message board keeps giving bigotry a home.



Reddit’s dark corners can seem like a dangerous and seedy mess. With over 330 million users, the message board platform is vast and filled with posters obsessed with sports, hobbies, and hyper-specific public transit memes. But its far-right communities have peddled false-flag conspiracy theories, spread Islamophobic and antisemitic content, and encouraged violence. While the company has banned some of its worst message boards, it has often ignored or been slow to take action on other hateful communities.

Over the past several years, Reddit has moved to ban toxic, racist boards like r/EuropeanNationalism and r/MillionDollarExtreme, while allowing other communities, like r/CringeAnarchy and r/The_Donald, to stay online even as their users posted racist comments in the wake of mass shootings and defended such violence. And according to the authors of a June academic paper on the company’s moderation practices, unless Reddit makes significant changes to how it enforces its policies, the company will always be at least one step behind in its battle against hateful content.

The study is based on the work of computer science researchers from the University of Iowa and the Lahore University of Management Sciences who analyzed over three thousand popular Reddit communities, or subreddits, to examine how Reddit polices the worst content on its platform. Their comprehensive dataset—which included hundreds of hateful, banned subreddits—accounted for 6 percent of all posts and 11 percent of all comments made on Reddit between 2015 and 2018. 

The researchers report two major findings. The first is that users’ affiliations can be used to predict, often months in advance, which Reddit communities will turn so hateful and dangerous that they will earn a ban. While that indicates Reddit could proactively stop subreddits before they spiral into cesspools, the researchers’ second conclusion is that the company often ignores such signals, and instead only haphazardly enforces its platform rules, often in the wake of media attention on particularly problematic subreddits.

The result is that users of banned subreddits can migrate over to other dangerous communities that have been allowed to stay online, where hate and harassment can continue unchecked. For example, when Reddit banned the fascist and violence-advocating r/Physical_Removal and the sexist and racist r/MillionDollarExtreme, many users likely moved to other similar subreddits like r/KotakuInAction and r/CringeAnarchy (which wasn’t banned until April) where they could post the same kinds of content that got their previous subreddits banned.

In each of these cases, the communities had developed strong reputations for being home to hateful, policy-violating content, garnering attention from mainstream news sites as Reddit let them stay up for months before taking action. One, r/KotakuInAction, a hub of the misogynist GamerGate movement, remains online thanks to Reddit’s intervention. After the subreddit’s creator decided to shut it down for, in his words, becoming “infested with racism and sexism,” a company representative stepped in to prevent it from closing, according to The Verge.

“Reddit isn’t very consistent with the way that it enforces its policy,” says Rishab Nithyanand, an author of the paper, pointing to the team’s research suggesting that Reddit, rather than impose clear and consistent moderation standards, usually only takes action after media stories highlight a particularly egregious subreddit.

“While community A and B are discussing the same terrible things, A gets banned because it gets picked up by the press, but B carries on,” says Nithyanand, a computer science professor at the University of Iowa. Community B, he explained, then becomes a safe haven for displaced members of A, allowing the spread of hate speech and bigotry to continue almost unchecked.

There have been recent high-profile examples of such a cycle. After the Christchurch mosque shooting in New Zealand, users of two communities, r/CringeAnarchy and r/The_Donald, posted bigoted content justifying the killings, almost certainly in violation of Reddit’s terms of service. But the company did not ban the communities at the time, in spite of the abhorrent content and both communities’ history of hosting bigotry. It did finally ban r/CringeAnarchy a month later, and in June eventually “quarantined” r/The_Donald, its term for putting a subreddit on probation and making it harder for users to find its content. The step is often followed by a complete ban.

The authors found that a subreddit’s descent into hate can be fairly reliably predicted by how many of its members also belong to already banned communities or to other hateful communities that have not yet been banned. Using a model based on data recording users’ participation in multiple subreddits, the group says it was able to use early data to predict which subreddits would eventually earn a ban with 70 to 90 percent accuracy.

“This suggests that administrator and community moderation tools which rely on measuring the connectivity of subreddits to known hate or banned subreddits can be used to pre-emptively identify which subreddits require careful monitoring or even administrator/moderator interventions,” the authors wrote.
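To make that intuition concrete, here is a minimal, hypothetical sketch of the general approach: treating a subreddit’s overlap with banned or known hateful communities as features in a simple classifier. This is not the authors’ actual model, and the feature names and toy data below are invented for illustration; the paper’s analysis draws on far richer participation data across thousands of real subreddits.

```python
# Hypothetical sketch: predict whether a subreddit will eventually be banned
# from the share of its active users who also post in (a) already-banned and
# (b) hateful-but-not-yet-banned communities. All data here is synthetic and
# purely illustrative; it is not the dataset or model from the paper.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n = 500  # toy number of subreddits

# Feature 1: fraction of a subreddit's users also active in banned communities.
# Feature 2: fraction also active in hateful, not-yet-banned communities.
overlap_banned = rng.uniform(0, 0.5, n)
overlap_hateful = rng.uniform(0, 0.5, n)
X = np.column_stack([overlap_banned, overlap_hateful])

# Toy ground truth: heavier overlap makes an eventual ban more likely.
logits = 8 * overlap_banned + 5 * overlap_hateful - 3
y = (rng.uniform(size=n) < 1 / (1 + np.exp(-logits))).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0
)
model = LogisticRegression().fit(X_train, y_train)
print("toy accuracy:", accuracy_score(y_test, model.predict(X_test)))
```

In this toy setup the classifier recovers the pattern easily; the point is only that membership overlap is the kind of early signal the researchers say could flag subreddits for closer monitoring before they earn a ban.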

Nithyanand thinks that his and his colleagues’ findings strongly indicate that Reddit needs to radically reform its moderation of hateful content, from the application of its policies to the tools it uses, including incorporating more machine learning and AI to help human moderators spot and handle content that potentially violates the platform’s rules. While Reddit has grappled with how to handle hate speech, the company has boasted of using such “proactive” systems in its work to detect, even before users complain, attempts to manipulate the popularity of content on the platform.

Nithyanand believes that if Reddit consistently banned all communities violating its policies, the company would make it harder for the site’s worst users to find new homes and keep spreading bigoted, homophobic, and sexist messages, promoting violence, and otherwise breaking Reddit’s rules. This would be a significant change from Reddit’s usual practice of taking action, if at all, on communities at inconsistent points in their evolution.

“The power and influence they have when they have to go and create a whole new community is significantly lower. The number of users that carry on posting in a new community is far lower than the original,” Nithyanand said. “But if they’re already in two well-established homes, but one got banned, that doesn’t change their level of participation.”

Reddit declined to comment on the record about the paper or related critiques of the company’s moderation policies.

Nithyanand’s research stands in contrast to past work suggesting that banning individual subreddits reduces users’ hateful behavior, including a 2017 paper by academics at the Georgia Institute of Technology, Emory, and the University of Michigan. Those researchers found that after several communities were banned, members of those boards who went on to post in different subreddits used 80 percent less hate speech in their new posts. While that paper was based on data from only two banned subreddits—r/FatPeopleHate and r/CoonTown—Nithyanand’s more recent analysis included over 3,000 subreddits, including hundreds of offensive, banned, or quarantined communities, and more closely examined users’ behavioral changes, both when they first join a community and when it is banned.

Despite its role in playing host to and nurturing hateful content, Reddit has largely avoided the public controversies that have dogged larger and more widely known platforms like YouTube and Facebook. But the company has played a pivotal role in online hate, acting as a bridge between right-wing communities on those mainstream platforms and hateful, far-right communities on 8chan and elsewhere. In stepping up its enforcement efforts, Reddit could interrupt a key pathway to bigotry.
