Going forward, Pornhub will only allow properly identified users to upload content. It has banned downloads. It has made some key expansions to its moderation process, and it recently launched a Trusted Flagger Program with dozens of non-profit organizations. Earlier this year, it also partnered with the National Center for Missing & Exploited Children, and next year Pornhub will issue its first transparency report. Full details on the expanded policies can be found below.
“At Pornhub, nothing is more important than the safety of our community,” Pornhub said. “Our core values such as inclusivity, freedom of expression and privacy are only possible when our platform is trusted by our users. This is why we have always been committed to eliminating illegal content, including non-consensual material and child sexual abuse material (CSAM). Every online platform has the moral responsibility to join this fight, and it requires collective action and constant vigilance. Over the years, we have put in place robust measures to protect our platform from non-consensual content, and we are constantly improving our trust and safety policy to better flag, remove, review and report illegal material. We hope that we have demonstrated our dedication to leading by example.”
Verified Uploaders Only
Effective immediately, only content partners and people within the Model Program will be able to upload content to Pornhub. In the new year, Pornhub will implement a verification process so that any user can upload content upon successful completion of an identification protocol. Pornhub has retroactively removed any content that was not uploaded by a content partner or a member of the Model Program, pending verification.
Banning Downloads
Effective immediately, Pornhub has removed the ability for users to download content from the platform, with the exception of paid downloads within the verified Model Program. In tandem with Pornhub’s fingerprinting technology, this will help prevent content that has already been removed from the platform from returning.
Expanded Moderation
Pornhub has worked to create comprehensive measures that help protect its community from illegal content. In recent months, Pornhub deployed an additional layer of moderation. The newly established “Red Team” will be dedicated solely to self-auditing the platform for potentially illegal material. The Red Team provides an extra layer of protection on top of the existing protocol, proactively sweeping content already uploaded for potential violations and identifying any breakdowns in the moderation process that could allow a piece of content that violates the Terms of Service to slip through. Additionally, while the list of banned keywords on Pornhub is already extensive, Pornhub will continue to identify additional keywords for removal on an ongoing basis. Pornhub will also regularly monitor search terms within the platform for increases in phrasings that attempt to bypass the safeguards in place. Pornhub’s current content moderation includes an extensive team of human moderators dedicated to manually reviewing every single upload, a thorough system for flagging, reviewing and removing illegal material, robust parental controls, and a variety of automated detection technologies. These technologies include:
CSAI Match, YouTube’s proprietary technology for combating Child Sexual Abuse Imagery online
Content Safety API, Google’s artificial intelligence tool that helps detect illegal imagery
PhotoDNA, Microsoft’s technology that aids in finding and removing known images of child exploitation
Vobile, fingerprinting software that scans new uploads for potential matches to unauthorized material, protecting against banned videos being re-uploaded to the platform.
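To illustrate the general idea behind fingerprint-based re-upload detection, here is a minimal sketch in Python. It is not Pornhub’s, Vobile’s, or PhotoDNA’s actual implementation; the chunk size, similarity threshold, and function names are assumptions chosen for clarity, and production systems use perceptual fingerprints that survive re-encoding rather than raw byte hashes.

```python
import hashlib

# Illustrative assumptions only; real fingerprinting systems differ substantially.
CHUNK_SIZE = 4096       # assumed chunk size for this toy fingerprint
MATCH_THRESHOLD = 0.8   # assumed similarity required to flag an upload


def fingerprint(data: bytes) -> set[str]:
    """Build a crude content fingerprint as a set of chunk hashes."""
    return {
        hashlib.sha256(data[i:i + CHUNK_SIZE]).hexdigest()
        for i in range(0, len(data), CHUNK_SIZE)
    }


def similarity(fp_a: set[str], fp_b: set[str]) -> float:
    """Jaccard similarity between two fingerprints (0.0 to 1.0)."""
    if not fp_a or not fp_b:
        return 0.0
    return len(fp_a & fp_b) / len(fp_a | fp_b)


def should_block(upload: bytes, banned_fingerprints: list[set[str]]) -> bool:
    """Flag an upload that closely matches any previously removed item."""
    fp = fingerprint(upload)
    return any(similarity(fp, banned) >= MATCH_THRESHOLD
               for banned in banned_fingerprints)


if __name__ == "__main__":
    removed_video = b"bytes of previously removed content" * 500
    banned_db = [fingerprint(removed_video)]

    print(should_block(removed_video, banned_db))                   # True: re-upload caught
    print(should_block(b"unrelated new upload" * 500, banned_db))   # False: no match
```

The design point the sketch captures is that matching happens against fingerprints of already-removed content at upload time, which is how a download ban combined with fingerprinting helps keep removed material from returning.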
If users encounter a piece of content they think may violate the Terms of Service, they should immediately flag the content or fill out the Content Removal Request Form, which is linked on every page.
Pornhub's policy is to immediately disable for review any content reported through the Content Removal Request Form.
Trusted Flagger Program
Pornhub recently launched a Trusted Flagger Program, a new initiative empowering non-profit partners to alert the platform to content they think may violate the Terms of Service. The Trusted Flagger Program consists of more than 40 leading non-profit organizations in the internet and child safety space. Partners have a direct line of access to Pornhub’s moderation team, and any content identified by a Trusted Flagger is immediately disabled. Partners include: Cyber Civil Rights Initiative (United States of America), National Center for Missing & Exploited Children (United States of America),