Pornhub Is Inspired By Facebook And Twitter To Deal With Its Neglect Problem, But Can It Be Trusted?
Pornhub has a serious content problem, particularly with how nonconsensual videos of young girls and sexual violence have been allowed to be uploaded to and downloaded from the platform. At this time, Pornhub does not reveal any data about accounts or content it has actioned for violating its policies.

Adult video streaming platform Pornhub has announced a new set of policies, including stricter content moderation, disabling downloads of porn videos, restricting uploads to members and content partners, and introducing a verification process for regular users. This comes after a New York Times report detailed how unidentified users had uploaded nonconsensual videos to the platform, often involving underage girls. It has forced Pornhub to change the very model it was built on, by no longer accepting video uploads from unidentified users or non-professional uploaders. The announcement comes just days after payment companies Mastercard and Visa said they would investigate their financial agreements with MindGeek, the company that owns Pornhub.

In an official statement, Pornhub says, “Today, we are taking major steps to further protect our community. Going forward, we will only allow properly identified users to upload content. We have banned downloads. We have made some key expansions to our moderation process, and we recently launched a Trusted Flagger Program with dozens of non-profit organizations. Earlier this year, we also partnered with the National Center for Missing & Exploited Children, and next year we will issue our first transparency report.” The first transparency report comes in 2021, and Pornhub says that much like social media platforms such as Facebook, Instagram and Twitter, they will be fully transparent about the content that should and should not appear on the platform.

Considering the changes seem inspired by social media platforms such as Facebook, Instagram and Twitter, it is worth looking at their numbers as far as content moderation is concerned. In its latest transparency report, covering July – September 2020, Facebook says it removed 12.4 million pieces of content for violating its child nudity and sexual exploitation guidelines. Facebook-owned Instagram removed as many as 1 million pieces of content in the same period. Instagram also removed 13.1 million pieces of content marked as Adult Nudity and Sexual Activity, while Facebook removed 36.7 million. Twitter, as per its latest data, reports that 264,652 accounts were removed from its platform for Child Sexual Exploitation in the period of July – December 2019.

It is worth noting that one of the current methods of becoming a Pornhub verified user is to send the platform a photo of yourself holding up a piece of paper with your username written on it.

As of now, Pornhub will only allow uploads of new content from what it calls content partners and people within the Model Program. This means random, unverified users can no longer upload videos to the platform. That will change next year, but only after a proper verification process is put in place. “In the new year, we will implement a verification process so that any user can upload content upon successful completion of identification protocol,” says Pornhub.

There is also an online petition to “Shut Down Pornhub and Hold Its Executives Accountable for Aiding Trafficking” which has already clocked 2,156,984 signatures at the time of writing this. In his scathing article in the New York Times, Nicholas Kristof details how videos of young girls and sexual violence have been uploaded on Pornhub. Even after they were reported, they continued to be circulated because copies of these videos had been downloaded by users, and in some cases, these were uploaded on other platforms as well.

Pornhub is also implementing what it calls fingerprinting technology to ensure, it claims, that content previously removed from the platform cannot be re-uploaded and listed again. Video downloads from Pornhub will also be restricted, except for paid downloads within the verified Model Program.
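Pornhub has not published how its fingerprinting works; production systems typically use perceptual fingerprints that survive re-encoding and cropping. As a minimal sketch of the underlying idea only, the hypothetical code below keeps a registry of digests of removed videos and rejects any byte-identical re-upload (an exact hash, unlike a real perceptual fingerprint, will not catch a re-encoded copy):

```python
import hashlib

# Registry of fingerprints for content already removed from the platform.
removed_fingerprints: set[str] = set()

def fingerprint(data: bytes) -> str:
    """Return a SHA-256 digest acting as an exact-match fingerprint."""
    return hashlib.sha256(data).hexdigest()

def register_removed(data: bytes) -> None:
    """Record the fingerprint of a video taken down for a policy violation."""
    removed_fingerprints.add(fingerprint(data))

def is_blocked(upload: bytes) -> bool:
    """Reject an upload whose fingerprint matches previously removed content."""
    return fingerprint(upload) in removed_fingerprints

# Illustrative use with placeholder byte strings standing in for video files.
register_removed(b"removed-video-bytes")
print(is_blocked(b"removed-video-bytes"))  # identical re-upload is caught
print(is_blocked(b"fresh-video-bytes"))    # unseen content passes through
```

The function names and the byte-string stand-ins are assumptions for illustration; the real system would hash decoded video features, not raw file bytes.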

A new “Red Team” will also be put in place as an additional layer of content moderation. “The Red Team provides an extra layer of protection on top of the existing protocol, proactively sweeping content already uploaded for potential violations and identifying any breakdowns in the moderation process that could allow a piece of content that violates the Terms of Service to slip through. Additionally, while the list of banned keywords on Pornhub is already extensive, we will continue to identify additional keywords for removal on an ongoing basis,” says the platform.

Nonconsensual and unauthorized uploads can be a massive problem, particularly for underage girls, considering Pornhub’s numbers are huge. According to the platform’s own figures released at the end of last year, Pornhub clocked 115 million visits per day globally, with as many as 42 billion visits over the whole of 2019. That is more traffic than the likes of Netflix and Amazon Prime Video. Pornhub also said that as many as 6.83 million new videos were uploaded to the platform in 2019.
