Dark patterns have emerged as a major issue on both sides of the Atlantic. In Europe, the European Parliament has called for a ban in the Digital Services Act (DSA), while in the US, lawmakers at the state and federal levels are experimenting with proposals to end the practice.

But these efforts face a major challenge: how to define dark patterns. European and US policymakers generally agree that deceptive user interfaces should be outlawed. Yet where is the line between legitimate marketing and a dark pattern? No clear answer exists. The challenge underscores a broader issue in tech regulation: developing rules for rapidly evolving technologies.

Consider another example: how to count a service’s “active users,” the metric proposed to determine the level of compliance required under Europe’s upcoming DSA. Is a Facebook user scrolling through their News Feed the same as an Amazon customer purchasing a product? Should shoppers who browse but buy nothing be counted? How can overcounting be avoided when the same person logs in from a computer, a tablet, and a mobile phone? Officials struggle to provide answers.

On dark patterns, everyone agrees that psychological tricks, deceits, and manipulations should be outlawed. The question is what makes a trick illegitimate. Imagine subscribing to a publication online at a special price, only to find it almost impossible to cancel. Or signing up for a free trial, only to face unexpected charges when the trial ends. Is Amazon’s One-Click purchase system a dark pattern, or a legitimate customer benefit? Is linking your Facebook profile to create an Uber Eats account a genuine shortcut, or a way of handing the companies additional data that serves their interests?

Recent research underlines this definitional challenge. “Dark patterns tend to exhibit disparities in the evidence of their prevalence,” concludes an Organization for Economic Co-operation and Development (OECD) report. Regulators, it concludes, need to “gather further evidence.”

On both sides of the Atlantic, privacy legislation already attempts to tackle the issue. California’s Privacy Rights Act (CPRA) bans the sale or sharing of personal information obtained through manipulative user interfaces. At the federal level, the proposed Deceptive Experiences To Online Users Reduction (DETOUR) Act would forbid large online platforms from using dark patterns to obtain consumer data.

Across the Atlantic, Europe’s landmark General Data Protection Regulation (GDPR) is designed to protect consumers from misuse of their personal data, but it lacks an explicit focus on dark patterns. National regulators have pursued individual cases: early this year, the French Data Protection Authority fined Facebook, YouTube, and Google more than €200 million for tricking consumers into accepting cookies.

The final DSA, still under negotiation, looks set to include a broad ban on dark patterns. But negotiations have yet to produce a clear definition. A so-called delegated act, to be elaborated in coming years, may be required.

Action should focus on a few clear requirements: platforms should make it easy to opt out, delete an account, or limit how much data they collect. It would help, too, if similar rules were imposed on both sides of the Atlantic. Dark patterns should be on the agenda of the upcoming Paris meeting of the Trade and Technology Council.

Maricarmen Martinez is a Digital Policy intern at CEPA. Previously, she worked at the US Embassy in Belgium and the US Mission to the European Union.

Gabriel Delsol is a Program Assistant with the Digital Innovation Initiative at CEPA.