FSC Europe Starts Petition to Protect Rights of Sexuality Professionals

BERLIN — FSC Europe, the recently founded European chapter of industry trade group Free Speech Coalition (FSC), has released a petition and a statement calling on the European Commission to “protect the rights of sexuality professionals, artists and educators” while drafting the E.U.’s new Digital Services Act.

As the European Union prepares to implement a new set of rules to govern digital services platforms and protect users’ online rights, the Free Speech Coalition is petitioning for sex workers and sexuality professionals, artists and educators to no longer be excluded or discriminated against. The organization proposes 10 steps toward a safer digital space that protects the rights of those working within the sphere of sexuality — already some of society’s most marginalized people — and demands transparency from the platforms, which, as the European Commission itself states, “have become integral parts of our daily lives, economies, societies and democracies.”

Sexual Expression is Being Banned Online

Sex in almost all its guises is being repressed in the public online sphere and on social media like never before. Accounts focused on sexuality — from sexuality professionals, adult performers and sex workers to artists, activists, LGBTQI folks, publications and organizations — are being deleted without warning or explanation, with little regulation of the private companies that can currently enforce discriminatory changes to their terms and conditions without explanation or accountability to those affected. Additionally, in many cases it is impossible for users to have their accounts reinstated — accounts that are often vitally linked to their ability to generate income, network, organize and share information.

Unpacking the Digital Services Act (DSA)

At the same time as sexual expression is being erased from digital spaces, new legislation is being passed in the European Union to safeguard internet users’ online rights. The European Commission’s Digital Services Act and Digital Markets Act encompass upgraded rules governing digital services, focused in part on building a safer and more open digital space. These rules will apply to online intermediary services used by millions every day, including major platforms such as Facebook, Instagram and Twitter. Among other things, they advocate for greater transparency from platforms, better-protected consumers and empowered users.

With the DSA promising to “shape Europe’s digital future” and “to create a safer digital space in which the fundamental rights of all users of digital services are protected”, it’s time to demand a future that includes those working, creating, organizing and educating in the realm of sexuality. As we consider what a safer digital space can and should look like, it’s also time to challenge the pervasive and frankly puritanical notion that sexuality — a normal and healthy part of our lives — is somehow harmful, shameful or hateful.

How the DSA Can Get It Right

The DSA is advocating for “effective safeguards for users, including the possibility to challenge platforms’ content moderation decisions.” In addition to this, the Free Speech Coalition Europe demands the following:

  • Platforms need to put in place anti-discrimination policies and train their content moderators so as to avoid discrimination on the basis of gender, sexual orientation, race or profession — the same community guidelines need to apply as much to an A-list celebrity or mainstream media outlet as they do to a stripper or queer collective;
  • Platforms must provide the user with a reason when a post is deleted or an account is restricted or deleted. Shadowbanning is an underhanded means of suppressing users’ voices. Users should have the right to be informed when they are shadowbanned and to challenge the decision;
  • Platforms must allow the user to request a review of a content moderation decision, and must ensure moderation takes place in the user’s own jurisdiction rather than in an arbitrary jurisdiction that may have different laws or customs; e.g., a user in Germany cannot be banned on the basis of reports and moderation in the Middle East, and their case must be reviewed by the European moderation team;
  • Decision-making on notices of reported content as specified in Article 14 of the DSA should not be handled by automated software, as such systems have proven to delete content indiscriminately. A human should make the final judgment;
  • The notice of content as described in Article 14.2 of the DSA should not immediately hold a platform liable for that content as stated in Article 14.3, since such liability will entice platforms to delete reported content indiscriminately in order to avoid it, enabling organized hate groups to mass-report and take down users;
  • Platforms must provide a department (or, at the very least, a dedicated contact person) within the company to handle complaints regarding discrimination or censorship;
  • Platforms must provide a means for users to indicate whether they are over the age of 18, as well as a means for adults to hide their profiles and content from children (e.g., marking profiles as 18+); platforms must also give the option to mark certain content as “sensitive”;
  • Platforms must not reduce the features available to those who mark themselves as adult or adult-oriented (i.e., those who have marked their profiles as 18+ or their content as “sensitive”). These profiles should appear as 18+ or “sensitive” when accessed without logging in or without a set age, but should not be excluded from search results or appear as “non-existing”;
  • Platforms must set clear, consistent and transparent guidelines about what content is acceptable; however, these guidelines cannot outright ban users focused on adult themes. E.g., a platform could ban highly explicit pornography (such as sexual intercourse videos that show penetration), but a user would still be able to post an edited video that doesn’t show penetration;
  • Platforms cannot outright ban content intended for adult audiences, unless a platform is specifically for children or more than 50% of its active users are children.