Beheadings. Babies murdered. Bodies desecrated. Hashtags on X promoting violence. Videos and videogame footage taken out of context or recycled from other conflicts. Even a fake White House press release.

The outbreak of violence in Israel and the Gaza Strip unleashed a torrent of hateful and invented content on social media. Conflicts always challenge social media content moderators. What is new this time is a European law called the Digital Services Act (DSA), which came into effect in August and represents the strongest attempt in the democratic West to regulate social media. It obligates platforms to remove illegal content as well as online disinformation and other “societal risks.”

Mideast turmoil represents the DSA’s first big test. Although it is far from clear whether European regulators will succeed in stemming hate, they are unleashing their new legal weapon. European Commissioner Thierry Breton has opened an investigation into Elon Musk’s X and reprimanded TikTok and Meta. Under the DSA, a company that fails to comply faces penalties of up to 6% of its global revenue.

Musk’s X faces particular scrutiny. Since changing ownership last year, it has slashed content and safety policy jobs and begun relying on volunteers to fact-check. Musk’s own recommendation of two accounts for following the conflict caused a stir, as both appear to have peddled disinformation. Experts are worried. “Deliberate decisions by @elonmusk have harmed war reporting at a critical moment. People on the ground will pay the price,” tweeted the Atlantic Council’s Emerson Brooking.


In his defense, Musk says he is promoting free speech. After receiving the European Commission warning letter, Musk responded by asking Breton to “please list the violations.” You are “well aware of your users’ — and authorities’ — reports on fake content and glorification of violence,” the Commissioner retorted. “Up to you to demonstrate that you walk the talk.”

The European Commissioner gave X’s owner 24 hours to “ensure a prompt, accurate, and complete response to this request.” X’s CEO Linda Yaccarino responded with a four-page note outlining how the company had been taking down new Hamas accounts, moderating in both Hebrew and Arabic, and removing or labeling tens of thousands of problematic posts.

Will European pressure succeed? The question remains open. If the Commission fines X, expect lengthy litigation.

The EU’s enforcement record is spotty. After it imposed the revolutionary General Data Protection Regulation (GDPR) in 2018, regulators failed to levy a significant penalty until this year, when Meta was fined €1.2 billion, a decision the company plans to appeal.

The DSA itself poses specific enforcement challenges. Its text makes repeated reference to “illegal content,” “misinformation” (or “disinformation”), and “societal risk.” When it invokes this vocabulary to demand content moderation, the Commission faces the tricky job of defining and explaining these terms.

News outlets have pointed to specific examples of disinformation proliferating on X in the wake of the attack. Breton’s public letters, by contrast, cited no specific evidence. Even the announcement of a formal investigation failed to provide details, mentioning only complaints about “illegal content, complaint handling, risk assessment and measures to mitigate the risks identified.”

If content is to be moderated on large platforms, the guidelines need to be clear and the enforcement swift. Given the size of the problem, it’s a tough task. Expect conflicts between regulators and social media platforms to multiply. Even as Brussels targeted Mideast violence, the Polish government was warning about fake messages circulating ahead of the country’s upcoming election.

Clara Riedenstein is an intern at CEPA’s Digital Innovation Initiative. Bill Echikson is a non-resident CEPA Senior Fellow and editor of Bandwidth.

Bandwidth is CEPA’s online journal dedicated to advancing transatlantic cooperation on tech policy. All opinions are those of the author and do not necessarily represent the position or views of the institutions they represent or the Center for European Policy Analysis.
