Russia’s new law tasking social media companies with finding and removing “illegal content” creates a dangerous precedent for other authoritarian — and democratic — regimes.

Although the Kremlin’s efforts to establish a sovereign internet and its clashes with international technology companies over adherence to local laws have garnered international attention, the country’s social media self-censorship law, which came into force on February 1, 2021, has gone almost unnoticed. This is a dangerous oversight.

Fierce debates are underway over regulatory initiatives aimed at social media companies in democratic countries, such as the European Union’s Digital Services Act and the UK Online Safety Bill. The Russian example makes clear how placing responsibility for policing online content on platforms creates fertile ground for violations of citizens’ rights.

Russia’s new social media law requires platforms with more than 500,000 daily users in Russia to proactively identify and remove content prohibited under Russian law. This includes materials such as child pornography and other information “inciting minors to illegal or dangerous activity.” It also encompasses content deemed “harmful or obscene,” or disrespectful of the government, state symbols, and government officials. Platforms are expected to find and block “calls to mass disorder, extremism, terrorism, and participation in unsanctioned public events.”

If social media platforms fail to comply, they risk massive fines (up to one-tenth of their annual revenue) and being blocked. Until the new law came into effect, Russian internet regulator Roskomnadzor submitted removal requests to the platforms and had to provide evidence that the content in question violated the law. The new model bypasses this extra step, placing the burden of determining whether a post is potentially illegal on the platforms themselves.

The shift is problematic. It delegates the censorship function to platforms while reinforcing state control over online expression. The definitions of what counts as “illegal” are vague, especially when it comes to political speech, which puts social media companies in an untenable position: full compliance with the new rules is virtually impossible.

In February 2021, for instance, Roskomnadzor demanded the removal of news stories covering flashlight and candle vigils in support of jailed opposition leader Alexey Navalny, claiming these reports “contained calls to mass disorder and unsanctioned mass rallies.”

According to Google’s Transparency Report, Russia accounted for 60 percent of all content removal requests to the corporation over the past decade, with government bodies submitting a record 123,000 requests to remove over 950,000 pieces of content, mostly from Google Search and YouTube. Facebook and Twitter have seen a similar dynamic.

Given this flood of requests, platforms can anticipate having to remove significant amounts of content in Russia. Combined with the vague criteria for what qualifies as illegal or harmful, this will encourage platforms to over-censor. Ignoring take-down demands, as platforms have done in the past, no longer appears to be a viable strategy.

Moderating content containing humor, sarcasm, or irony (as is often the case with political speech) is notoriously difficult. Platforms have acknowledged that this task becomes even more challenging when linguistic and cultural barriers are at play. Social media companies also have a poor track record of taking content moderation seriously. Ultimately, Russian social media users, who have few rights of redress when their content or accounts are blocked, stand to lose.

If we accept that Russia’s ultimate goal, as part of its sovereign internet strategy, is to push foreign social media companies out of the country, the new legislation provides the pretext to do so. These political decisions pose a threat not only to Facebook, YouTube, and Twitter but also to smaller platforms that are emerging as spaces for political discourse, such as Telegram and TikTok.

The delegation of responsibility for policing ‘harmful’ speech to platforms is problematic, and not just in Russia. While the new law gives the Kremlin more leverage over foreign platforms, their market share in Russia is limited. But in countries where these platforms are dominant, they wield significant influence. What would the consequences be if governments pressured them to moderate content and held them liable for mistakes, regardless of scale, ambiguity, or conflicts with international human rights norms?

Platforms shouldn’t be the sole arbiters of free speech and online expression; these norms should be set through an inclusive process of consultation. Yet neither should they toe the state line under government pressure (as happened when Facebook censored “anti-state” content in Vietnam). A growing number of incidents demonstrate that platforms seek to “minimize political fallout” rather than prioritize free expression and transparency. The threat of large fines further incentivizes restricting free speech.

Platforms wield too much power in shaping the norms of online expression. But we must also hold legislators and regulators to account, in both autocracies and democracies, to ensure that the legal frameworks governing online spaces are “clear and accessible to everyone” and empower citizens instead of serving as political tools. These norms should allow users to flag harmful speech, but should also provide clear remedies that ensure a balance of power between citizens, platforms, and states, regardless of the regime.

Tanya Lokot is an Associate Professor in Digital Media and Society at the School of Communications, Dublin City University. She researches threats to digital rights, networked authoritarianism, internet freedom, and internet governance in Eastern Europe.

Mariëlle Wijermars is Assistant Professor in Cyber-Security and Politics in the Faculty of Arts and Social Sciences, Maastricht University. She conducts research on algorithmic governance, media freedom, and the human rights implications of Internet policy.