Pornhub is scrambling to clean up its act, it says, in the aftermath of New York Times opinion columnist Nicholas Kristof’s piece “The Children of Pornhub,” which is about what it sounds like. In the days after publication, Senator Josh Hawley (R-MO) threatened to introduce legislation (as he does), and Visa and Mastercard announced that they’re reassessing whether they want to do business with Pornhub’s owner MindGeek. The company announced today in a blog post that it’s stopping unverified users from uploading videos, a change that would, ideally, help curb nonconsensual porn, documentation of assault, and child sex abuse material (CSAM).
Until sometime next year, Pornhub says it’ll only allow members of its model program and content partners to upload to its site. Model program members have to submit a government ID, and content partners (producers promoted by Pornhub) are vetted by Pornhub. The company declined to describe the content partner verification process to Gizmodo, and that category doesn’t mean much unless Pornhub commits to change; last year, the company failed to remove a notorious content partner that had allegedly lied to women and coerced them into making videos until its operators faced a federal sex trafficking indictment. Historically, Pornhub hasn’t responded to personal requests or to the countless sex workers and activists without Times bylines.
The company has also banned downloads, but it’s pretty easy to break that rule.
If Pornhub is serious about seismic change, this could mean an overhaul for porn tubes. Pornhub told Gizmodo that the change applies to all sites owned by MindGeek, whose properties represent the vast majority of major amateur porn sites. And by many accounts, the company seriously needs to improve its moderation so that it acts before legal threats or bad press force the issue.
The Times reported that Pornhub would not comment on the size of its moderation team, but a moderator told the paper that only around 80 moderators work for the entire MindGeek network, compared to YouTube’s 10,000. Both sites rank among the top 10 most visited in the U.S. To put it another way, Pornhub’s moderators would have to review roughly 6.8 million new videos a year between them, about 85,000 apiece. In its announcement, Pornhub says that it has an “extensive team of human moderators,” but that count includes users who flag material.
Pornhub now seems to be modeling its moderation infrastructure on Facebook’s and Twitter’s, with Transparency Reports set to roll out in 2021. Pornhub also says that it uses various automated tools, such as YouTube’s child sex abuse detection technology. The company added that it engaged a law firm in April 2020 to conduct an independent review.
In an email to Gizmodo yesterday, Pornhub pointed us to a report citing the child sex abuse material watchdog the Internet Watch Foundation, which said it identified only 118 pieces of CSAM on Pornhub between 2017 and 2019. (The Internet Watch Foundation was unavailable for comment.) The company couched this as a low number compared to the hundreds of thousands, even millions, of CSAM violations self-reported by Facebook and Twitter. But that figure covers just Pornhub, not MindGeek’s vast porn dominion, and Facebook and Twitter’s rules about what counts as a violation are much stricter.
MindGeek was not immediately available for comment.