Imagine seeing a video that appears to feature you engaging in intimate sexual acts. Yet it's not you. Your face has been meticulously mapped onto someone else's body in a disturbingly realistic fashion. Welcome to the realm of deepfake porn, a malicious fusion of technology and exploitation that is emblematic of a larger issue: online sexual abuse.

Artificial intelligence has taken this problem to new heights, expanding the tools available to perpetrators. According to a recent UNESCO study, 58% of young women and girls around the globe have experienced online harassment on social media platforms. Victims often endure immense distress and anxiety, reputational damage, trauma, and, in some cases, further harassment or blackmail.

Amid this alarming surge, digital platforms—often the battleground where such content thrives—have fallen short. So have regulators. While new laws toughen the rules, enforcement remains spotty. Victims find themselves at the mercy of algorithms that fabricate, disseminate, and amplify their compromising material.

Terminology is important. Although commonly labeled "revenge porn," the dissemination of private sexual imagery extends far beyond cases of revenge and is not always porn. It includes the non-consensual sharing of intimate imagery that was meant to remain private. This could happen to you, to me, to anyone, not just porn actors.

Companies do make efforts to detect and remove this abusive content, but the sheer volume overwhelms them. Meta, owner of Facebook, Instagram, and WhatsApp, and other companies including X, Snapchat, Telegram, and even porn platforms, are reactive rather than proactive.

Encrypted messaging apps such as Telegram and WhatsApp present additional challenges. These apps allow large private groups to form. Their privacy protections hinder effective monitoring. The upshot? The apps become breeding grounds for the distribution of private sexual images.

Porn sites present a particular danger. The ease of sharing images and videos exposes individuals to exploitation, allowing YouPorn and others to become conduits for non-consensual sexual imagery.

Robust regulation is required. Progress is difficult but possible. I know firsthand. Four years ago, I led a political campaign called #IntimitàViolata in Italy advocating the criminalization of online sexual abuse. In a country with a deeply patriarchal and sexist tradition, many objected. Yet, at the end of 2019, Italy passed legislation punishing “anyone who sends, delivers, gives, publishes or disseminates images or videos of sexual organs or sexually explicit content, intended to remain private, without the consent of the persons represented.” Offenders face prison terms of up to six years and a fine of up to €15,000.

Other nations are also acting. At least 48 US states, along with Washington, DC, Canada, Australia, Mexico, the United Kingdom, Germany, Spain, France, and Malta, have criminalized the creation and distribution of non-consensual intimate imagery.

However, these legislative advances remain insufficient. While the EU recently passed revolutionary digital content rules in its Digital Services Act, it missed an opportunity to mandate the removal of sexual videos uploaded without consent. The just-approved European AI Act likewise fails to give deepfake porn victims tools to seek justice for harm caused by generative AI technology.

Regulators have erred by concentrating on the large social media platforms while ignoring porn sites. A coalition of digital rights activists, sex workers, and survivors of gender-based violence recently signed an open letter calling on European regulators to add major porn platforms to the Digital Services Act's list of Very Large Online Platforms, which face the most severe scrutiny. Penalties for breaching the regime can reach up to 6% of global annual turnover.

Alongside tough laws and enforcement, educational and awareness campaigns must take center stage. Victims must be equipped with the skills to recognize and report image-based sexual abuse. The longer we delay, the more challenging it will become to address the repercussions of this violence. The time to act is now.

Silvia Semenzin holds a PhD in Digital Sociology and works at the Complutense University of Madrid. She is an advocate against image-based sexual abuse (IBSA), a podcaster, and the author of the book "'Women are all Whores': Revenge Porn and Hegemonic Masculinities" (2021).

Bandwidth is CEPA’s online journal dedicated to advancing transatlantic cooperation on tech policy. All opinions are those of the author and do not necessarily represent the position or views of the institutions they represent or the Center for European Policy Analysis.
