The negotiations were heated, dragging on into the early morning hours. Behind closed doors, European leaders added sensitive, last-minute provisions designed to force Internet platforms to combat illegal content. The goal is noble, but online freedoms risk being damaged in the process.

Human rights and civil society groups such as EFF, CDT, and Access Now have warned about the rush to seal a DSA deal in record time. In particular, they worry about a last-minute amendment calling for an “emergency mechanism” that would give the European Commission broad powers to police how online platforms handle content in times of serious crisis.

Under the agreed DSA, large Internet platforms are subject to a series of vague obligations enforced without sufficient safeguards. Probably the most dangerous, in terms of its impact on free speech, is the requirement that platforms properly assess and mitigate “systemic risks.”

The definition of systemic risks needs clarification: it could include anything from a health crisis to a bloody war, or almost anything in between. Platforms must define and adopt “reasonable, proportionate and effective” measures to mitigate these vague systemic risks.

Given the task’s complexity, these general principles offer little guidance. How should platforms comply? To avoid liability, they may be pushed to over-remove legal content. Platforms will be put in the position of deciding, under regulatory pressure, which tools best address these negative yet ill-defined risks, and they will be incentivized to restrict certain forms of speech.

Enforcement remains problematic. National “digital services coordinators” will be responsible, alongside the European Commission, for ensuring compliance. Yet no one knows how these coordinators will be designated or how they will operate. Will they have the legitimacy or experience to make comprehensive judgments about the desirable plurality of public discourse, the fairness of the electoral process, or the protection of public security? These matters stand at the core of our democracies and should be decided in open civic debate.

In the European model, political bodies have no right to circumscribe free expression; only an independent institution has that authority. Yet the DSA gives the unelected, politicized European Commission significant powers over Internet speech.

Two recent events bring these concerns into sharp relief.

First, Europe’s top court recently validated the controversial Article 17 of the EU’s new Copyright Directive, which obliges online service providers to review content that users wish to upload before publication, to prevent copyrighted material from appearing without consent. The Polish government considered this requirement incompatible with free expression. Polish Deputy Foreign Minister Konrad Szymanski said the provision posed a “serious threat to freedom of expression” and could even lead to “the adoption of regulations similar to preventive censorship.”

But the court dismissed the Polish argument, judging that because online service providers receive information on copyrighted material from rightsholders, they can match and block that content at upload without endangering lawful speech.

Second, consider Elon Musk’s proposed takeover of Twitter. Although it is difficult to anticipate the plans and practical, day-to-day decisions of the new billionaire owner, Musk has announced his intention to eliminate most internal content policies as the best way to guarantee unhindered speech on the platform.

European leaders have reminded Musk that the new DSA will apply to Twitter in Europe. They argue, with reason, that free speech is facilitated through content moderation; without it, harmful speech proliferates, turning platforms into potential cesspools inundated with spam, disinformation, harassment, misleading ads, and other legal-but-harmful content. Such a platform is not an attractive or safe space for online free expression.

Europe maintains that the DSA does not violate the right to freedom of expression but is a necessary instrument to promote online pluralism and free access to information. A similar, though not identical, understanding can be found in the United States under Section 230. Court rulings interpreting it have so far given platforms wide discretion to moderate; just look at the recent dismissal of President Donald Trump’s lawsuit against Twitter over the cancellation of his account.

Admittedly, there’s much good about the DSA. It represents a valiant attempt to deal with dangerous content online, reinforcing important and necessary duties of transparency, accountability, and redress. But we must make sure its enforcers do not overstep and curtail legal expression. While protecting platforms’ ability to set their own content policies is a basic precondition for a safe online environment, the legal imposition of certain types of content moderation may create serious perils for speech.

Joan Barata is an international expert in freedom of expression, media freedom, and media regulation. He has been the Principal Adviser to the Representative on Freedom of the Media at the Organization for Security and Cooperation in Europe. Dr. Barata is an Intermediary Liability Fellow of the Program on Platform Regulation at the Stanford Cyber Policy Center.