Reddit released its 2022 transparency report Wednesday, showing in part that the site has been hitting the “ban” button hard since cracking down on non-consensual intimate media (NCIM) following a rules change in March of last year. NCIM includes any kind of intimate image posted without a person’s consent or awareness, such as revenge porn, voyeurism, or an “accidental nip-slip moment.”

Reddit had previously banned any kind of porn posted “without permission.” In March 2022, the company changed the term “involuntary pornography” to “non-consensual intimate media” and marked it in the same policy category as posting somebody else’s personal or confidential information.

After the change and once the banhammer started up, the site saw a 473% increase in subreddit removals and a 244% increase in account bans for violating rules about non-consensual material. Reddit said this reflected the change in NCIM policy and an “increased effectiveness in detecting and removing” that content.

At the same time, a little under 174,000 non-consensual intimate media posts and comments got deleted from Reddit in 2022 compared to 2021’s more than 187,000 instances of “involuntary pornography.” While less overall content was nixed from the site, Reddit has been much more willing to completely remove subreddits and ban accounts. There were 368 subreddit removals for NCIM in 2021 compared to 2,109 in 2022. Over 54,000 accounts were permabanned in 2022 for sharing non-consensual material versus 15,847 in 2021.

In total, the site said it removed 316.7 million pieces of content in 2022 compared to nearly 297 million in 2021. Permabans, meanwhile, were up nearly fivefold compared to 2021, with most suspensions due to people creating new accounts to get around existing bans. NCIM still remains a small part of the reason for bans. For example, over 134,000 accounts were permanently suspended for harassment and nearly 80,000 for spreading “hateful content.” Most bans are issued by Reddit’s automod or bots rather than manually by site moderators.

Reddit has previously been sued for allegedly allowing child sexual abuse material (CSAM) to sit idle on the platform. The site’s transparency report notes it uses automated tools as well as human reviewers to find CSAM on the platform. In 2022, Reddit said it removed 80,888 pieces of CSAM content, most of those in the latter half of the year. In 2021, the National Center for Missing & Exploited Children reported it received a little over 10,000 notices of CSAM on Reddit. The company said it filed a total of 52,592 reports to NCMEC in 2022.

Along with the new report, Reddit showed off the new Transparency Center, which includes earlier transparency reports as well as more on Reddit’s policies.

Reddit claimed it received 51% more requests from governments and law enforcement agencies to take down pages and posts compared to 2021, though nearly 60% were requests to remove non-consensual porn. Russia’s Roskomnadzor agency, in charge of the country’s mass media, tried to get Reddit to remove several posts commenting on the Russian invasion of Ukraine. Of the 42 pieces of content flagged, Reddit said it found 32 of the posts didn’t violate the site’s rules.

At the same time, India hit Reddit with close to 50 takedown requests regarding 276 pieces of content or communities. Many of these requests were to both take down content and share information about the user. Reddit said it removed 92% of that content on request. The site also geo-restricted several subreddits that contained non-NCIM porn upon request from India and Pakistan.
