New EU Rules for ‘Very Large’ Platforms Raise Concerns for Adult Sector

BRUSSELS — The European Commission — the executive of the EU — yesterday adopted the first designation decisions under the Digital Services Act (DSA), identifying 19 major platforms and search engines to be targeted for compliance with the controversial legislation.

Among the DSA clauses to which these “Very Large” platforms and search engines (per the EC’s new nomenclature) will be subject, language requiring them to “mitigate risk” concerning “gender-based violence online and the protection of minors online and their mental health” has raised the concern of legal experts and digital rights activists.

As XBIZ reported, the DSA has been widely criticized over privacy concerns as the EC attempts to tackle the issue of “illegal and harmful” content, including CSAM.

Last year, the European Commission justified its plan, contending that “the current system based on voluntary detection and reporting by companies has proven to be insufficient to adequately protect children and, in any case, will no longer be possible once the interim solution currently in place expires.”

The new rules, the EC noted at the time, “will oblige providers to detect, report and remove CSAM on their services. Providers will need to assess and mitigate the risk of misuse of their services and the measures taken must be proportionate to that risk and subject to robust conditions and safeguards.”

Targeting ‘Very Large’ Platforms and Search Engines

Yesterday’s decision officially designated 17 Very Large Online Platforms (VLOPs) and two Very Large Online Search Engines (VLOSEs) that, according to the EC, reach at least 45 million monthly active users. These are:

The VLOPs are: Alibaba AliExpress, Amazon Store, Apple AppStore, Booking.com, Facebook, Google Play, Google Maps, Google Shopping, Instagram, LinkedIn, Pinterest, Snapchat, TikTok, Twitter, Wikipedia, YouTube and German retailer Zalando.

The two VLOSEs are Bing and Google Search.

Following their designation, an EC statement explained, these companies “will now have to comply, within four months, with the full set of new obligations under the DSA.”

The new obligations, the EC declared, “aim at empowering and protecting users online, including minors, by requiring the designated services to assess and mitigate their systemic risks and to provide robust content moderation tools.”

Under the subheading “Strong protection of minors,” the EC explained that “platforms will have to redesign their systems to ensure a high level of privacy, security, and safety of minors; targeted advertising based on profiling towards children is no longer permitted; special risk assessments including for negative effects on mental health will have to be provided to the Commission four months after designation and made public at the latest a year later; and platforms will have to redesign their services, including their interfaces, recommender systems, terms and conditions, to mitigate these risks.”

As part of their risk assessment, the targeted platforms will now have to “identify, analyze and mitigate a wide array of systemic risks ranging from how illegal content and disinformation can be amplified on their services, to the impact on the freedom of expression and media freedom. Similarly, specific risks around gender-based violence online and the protection of minors online and their mental health must be assessed and mitigated.”

The risk mitigation plans of designated platforms and search engines, the EC noted, “will be subject to an independent audit and oversight by the Commission.”

Industry Attorneys Monitoring Developments

According to industry attorney Corey Silverstein of Silverstein Legal, the impact of the new designations and consequent obligations “could be substantial because many of the platforms that have been designated as VLOPs and VLOSEs are frequently utilized by the adult entertainment industry.”

Assuming these platforms decide to comply with the DSA, Silverstein told XBIZ, there may be major changes coming to what these platforms allow on their services within the EU.

“This could end up leading to major content moderation and outright blocking of adult content in the EU, including the blocking of websites that display adult entertainment from being listed in search results,” Silverstein warned. “Unfortunately, there is no definitive answer as to how these platforms will react but the industry will need to closely monitor this development.”

Free speech law expert Lawrence Walters, from the Walters Law Group, told XBIZ that the impact of the new designations on adult content creators “will depend on how the platforms and search engines implement the DSA requirements related to safety and security of minors, reporting of allegedly illegal content, recommendation systems, and advertising procedures.”

The new European requirements, Walters added, are likely to cause “increased friction between adult content creator accounts and these large platforms and search engines.”

Walters advised keeping an eye on the results of the first required annual Risk Assessment by the designated service providers, which will “provide valuable information on how they are responding to these compliance obligations and how that compliance impacts free expression.”

The attorney also noted that, as the larger adult platforms continue to grow, some may pass the EC’s 45-million-active-users benchmark and thus “face the potential for future designation under the DSA which could have more direct impact on their users and creators.”