What are some great alternatives to Pornhub?

Pornhub isn't the only porn platform that has benefited from sexual exploitation

A debate about Pornhub has been raging for weeks: after allegations that the pornography platform was not doing enough to prevent the spread of abusive content, the payment providers Visa and Mastercard decided to end their cooperation with the company. Pornhub responded promptly and deleted all videos from unverified uploaders - a large share of the platform's content, some ten million videos, was removed in one fell swoop. Going forward, only verified content is to be allowed.

The impetus was a column in the "New York Times" describing the fates of several victims whose abuse could be viewed on the platform. Years later, they still struggle to get the material taken down, because it has been downloaded and redistributed.

A fine line

The newspaper criticized that it was far too easy to use Pornhub's search field to find content showing child abuse or rape. Moreover, moderators often cannot even tell whether the people shown are 16, 17, or already of legal age.

With rape videos, it is often a fine line to judge whether a clip is staged or depicts real abuse. Revenge pornography and covert recordings also stay online far too long - by the time they are removed, they have often been viewed hundreds or thousands of times, downloaded, and redistributed on other sites.

Problem with all porn sites

But Pornhub is far from alone with this problem. Like social media, the site has until now allowed any user to upload content without going through a verification process first. Most other major pornography platforms are no different - they too accept all uploads. Pornhub belongs to Mindgeek, a billion-dollar corporation whose brands also include Youporn, Xtube and Redtube - several of the most popular pornography platforms, operated in the same way as Pornhub and therefore struggling with the same problems.

XHamster, also one of the industry's biggest players, has been criticized repeatedly in the past: at the end of 2019, for example, women were secretly filmed in a sauna and the recordings ended up on the porn platform. Some stayed online for two weeks despite being reported; XHamster initially pointed to its own terms of use. In the following months, covert recordings of unsuspecting women in portable toilets at festivals, including the Fusion Festival, surfaced on the site.

USA responds with registration requirements

The US Senate responded to the Pornhub allegations with a plan to introduce an ID requirement for uploaders. In addition, under the bill introduced by two US senators, signed consent declarations from all performers, showing their full names, would have to be obtained and kept on file by the respective platform. So far, this has been required only of pornography producers, not of the hosting platforms to which content is uploaded.

Such an approach would require an assessment of its impact on the general public, warns fundamental rights activist Thomas Lohninger of the NGO Epicenter Works. "The burden of justification lies with the legislator," he tells the STANDARD - lawmakers would have to explain why such a restriction is acceptable for the general public. Moreover, pornographic content is not always shared for financial reasons; some users simply live out their sexual preferences this way as a hobby. They would be exposed for no reason.

Maximilian Schubert, secretary general of the internet providers' association ISPA, which in turn runs the Stopline.at hotline for reporting child abuse, neo-Nazi re-engagement and extremism, also points out that a very high level of data security would have to be guaranteed, since performers in the pornography industry in particular are often targeted by stalkers, for example.

The question of data security could become crucial, because pornography platforms' handling of user data is already questionable: the Institute for Digitization and Innovation in Law at the University of Vienna concludes in its podcast Ars Aequi that Pornhub disregards basic GDPR rules in its terms of use. Pornhub is based in Cyprus and would therefore have to comply with the General Data Protection Regulation (GDPR) and ensure a level of protection similar to, for example, Facebook's. But while social media are repeatedly criticized and held responsible on this point, the same does not happen with pornography sites. In its privacy policy, for instance, collected data is not treated as personal information - which is simply not true.

Opaque moderation

Fundamentally, the fact that such content stays online is likely a content moderation problem: porn platforms have so far been very opaque about their practices - only recently, XHamster came under criticism because the review of user content is partly outsourced to volunteers. In response to the criticism of recent weeks, Pornhub has announced that it will strengthen moderation and document this in regular transparency reports - an independent law firm is to be hired to review how illegal content is handled.

Platforms generally rely on two components to police user-generated content: human moderators on the one hand and automated systems on the other. The latter store digital fingerprints (hashes) of content that has already been removed in a database; new uploads are then compared against these entries. If there is a match, the content is blocked.
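To make this concrete, here is a minimal sketch in Python of such a fingerprint check, under the assumption that the platform keeps a simple in-memory set of hashes of previously removed files. Production systems typically rely on perceptual hashes (such as PhotoDNA or PDQ) that survive re-encoding and cropping; the exact SHA-256 digest used here only catches byte-identical re-uploads and is purely illustrative.

```python
import hashlib
from pathlib import Path

# Hypothetical fingerprint database: hashes of content already removed.
# Real systems use perceptual hashes (e.g. PhotoDNA, PDQ) that tolerate
# re-encoding; plain SHA-256 only catches byte-identical re-uploads.
KNOWN_BAD_HASHES: set[str] = set()

def fingerprint(path: Path) -> str:
    """Compute a SHA-256 digest of an uploaded file, read in 1 MiB chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def screen_upload(path: Path) -> bool:
    """Return True if the upload may be published, False if it must be blocked."""
    return fingerprint(path) not in KNOWN_BAD_HASHES
```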

Cooperation with authorities

Such systems are primarily used to combat child abuse imagery. Large social media companies in particular cooperate voluntarily with authorities such as Interpol, which maintain hash databases of already identified content for comparison. The major pornography platforms are likely no different - after all, they have a financial interest of their own in removing such material. Pornhub confirms this in response to a STANDARD inquiry and also points to partnerships with dozens of non-profit organizations.

According to the press office of the Federal Criminal Police Office, there is occasional contact with pornography platforms, but no dedicated collaborations. By comparison, the authorities have a better working relationship with the major US social media companies.

No special set of rules

But while child abuse in particular is strictly prosecuted - not least because of rigorous pressure from the authorities - the handling of other sexual abuse, such as rape or human trafficking, remains an open question. There are no specific rules for moderation.

In Austria, the implementation of the EU Audiovisual Media Services Directive has been in force since January 1st. Among other things, it requires video-sharing platforms to delete certain illegal content, including pornographic depictions of minors, as Gabriela Staber, a lawyer at the law firm CMS, explains. Such content must be removed immediately after notification. The rules apply as soon as any company in a corporate group is established in an EU member state - this is meant to prevent large platforms from escaping the directive's scope through a complex group structure. In practice, however, large companies tend to move their headquarters to the member states that implement regulations most laxly.

The legislative package against online hate, passed by the turquoise-green coalition and in force since the beginning of the year, does not apply - partly because it does not cover videos, and partly because the legal validity of the Communication Platforms Act is questionable anyway due to possible conflicts with the E-Commerce Directive.

Digital Services Act is pending

A potential set of rules could, however, come from the European Union's planned Digital Services Act (DSA): the comprehensive proposal also sets requirements for handling user content. These apply above all to very large platforms, namely those with more than 45 million users in the EU. Pornhub is likely to fall under the rules - after all, the platform records around 3.5 billion visits a month, more than Amazon or Netflix, for example - and its parent company Mindgeek would in any case, given its numerous high-traffic subsidiaries.

A mandatory reporting system is planned that lets users easily flag illegal content. To avoid over-blocking, i.e. the removal of permissible content, the creators of deleted posts are to be able to appeal if they disagree with a removal. In addition, so-called "trusted flaggers" - which can be authorities or organizations - are to be designated, and their reports handled with priority. Furthermore, platforms must appoint legal representatives for service of official documents in the respective countries.
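As an illustration of how such prioritization might look in practice, the sketch below models a moderation queue in which reports from trusted flaggers are reviewed before those from ordinary users. The function names, priority values, and example identifiers are assumptions made for this sketch; the DSA draft itself does not prescribe any particular implementation.

```python
import heapq
import itertools

# Minimal sketch of a moderation queue: reports from trusted flaggers
# (priority 0) are reviewed before reports from regular users (priority 1).
# All identifiers below are illustrative, not taken from the regulation.
_counter = itertools.count()  # tie-breaker keeps insertion order stable

def submit_report(queue: list, content_id: str, reporter: str, trusted: bool) -> None:
    priority = 0 if trusted else 1  # trusted flaggers jump the line
    heapq.heappush(queue, (priority, next(_counter), content_id, reporter))

def next_report(queue: list):
    """Return the next report moderators should review, or None if the queue is empty."""
    if not queue:
        return None
    priority, _, content_id, reporter = heapq.heappop(queue)
    return content_id, reporter, ("trusted" if priority == 0 else "regular")

# Example: a later trusted-flagger report is reviewed before an earlier user report.
queue: list = []
submit_report(queue, "video/123", "user@example.org", trusted=False)
submit_report(queue, "video/456", "trusted.flagger@example.org", trusted=True)
print(next_report(queue))  # ('video/456', 'trusted.flagger@example.org', 'trusted')
```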

Transparency obligation

The DSA also provides special rules for particularly large platforms: for example, they must report to their supervisory authorities every six months. Their practices and compliance with EU requirements are also to be verified through independent audits. Auditors are to have access to "all relevant data" and pass their findings directly to the Commission, which is also to be granted access. It will take some time before the rules actually enter into force, since the Commission, Council and EU Parliament must first reach an agreement, and extensive lobbying is to be expected.

So far, the government only wants general restrictions

In contrast to social media such as Facebook or Youtube, there has been hardly any debate in Austria in recent years, especially on the political side, about regulating problematic content on pornography platforms. Previous efforts, often motivated by religious concerns, dealt with generally restricting access to pornographic content, not with moderation requirements.

ÖVP human rights spokeswoman Gudrun Kugler is one of the strongest advocates of a pornography filter that would block access to such websites by default. Such systems were planned under the previous turquoise-blue government; the turquoise-green government program, by contrast, provides only for voluntary protective filters for parents. (Muzayen Al-Youssef, 3.1.2021)