Last week, the New York Times published an op-ed titled “The Children of Pornhub,” detailing how the site turned a blind eye to videos of rape, both of children and adults, and other abuses. The article exploded on social media and on Tuesday, Pornhub issued a response about its “commitment to trust and safety.”

Effective immediately, Pornhub will only allow content partners and members of its Model Program to upload videos. The company is also banning downloads (with the exception of paid downloads through the Model Program) and expanding content moderation. The site has also partnered with the National Center for Missing & Exploited Children (NCMEC) this year to compile a report of child sexual abuse material (CSAM) cases.

The New York Times piece is a devastating read, and unfortunately far from the first. 

In the statement, Pornhub claims that “nothing is more important than the safety of our community,” yet it took the company months to remove videos from GirlsDoPorn, a site that tricked women into having sex on camera and then harassed them when they sued.

The same happened two years ago with deepfake videos — Pornhub promised to remove them, but only did so when it received media attention. 

“The Children of Pornhub” didn’t suddenly wake the site up to the disgusting, illegal videos available in seconds with the right search term. There are numerous cases from this year alone where teen girls had to fight with Pornhub to get videos of themselves removed. 

“Traffickinghub,” a campaign to shut down Pornhub, had a million signatures as of June; it now has more than double that. It’s worth noting that Exodus Cry, the group behind the Traffickinghub campaign, is ultraconservative, but nonconsensual deepfakes, GirlsDoPorn, and “The Children of Pornhub” are real instances of exploitation that Pornhub knew about and ignored until they received widespread coverage.

The new protocols released Tuesday aren’t the first time Pornhub has implemented moderation features to curb illegal and nonconsensual videos. The company claimed that victims could easily remove such videos through “fingerprinting,” but VICE found that the system didn’t even work.

VICE titled that article “Pornhub Doesn’t Care,” and it’s difficult to believe the company does now. It remains to be seen whether the new rules will actually have an impact. Early next year, the site and NCMEC will release the total number of CSAM reports. Whatever that number is, it’ll be appalling, and a result of Pornhub’s negligence.

If you have had intimate images shared without your consent, call the Cyber Civil Rights Initiative’s 24/7 hotline at 844-878-2274 for free, confidential support. The CCRI website also includes helpful information as well as a list of international resources.