At the beginning of the COVID crisis, a Facebook page played a positive role in helping my synagogue's community survive when in-person services were canceled. Then the conversations turned toxic.
When a whistleblower handed over documents showing that Facebook monetized misinformation and anger, my thoughts turned to my synagogue. Two decades ago, my wife and I created the International Jewish Center in Brussels. Since then, it has grown to more than 200 members, with a minimum of discord. We concentrated on growing Progressive Judaism in Belgium and avoided politics, particularly Mideast politics.
Then the community created a Facebook page.
After the latest Gaza war erupted earlier this year, the page turned vitriolic. Left-wingers called Israel an apartheid state. Zionist members shot back comments about self-hating Jews. Some disgusted members quit. The board shut down the page, and the left-wingers departed to create their own community.
The split highlighted to me how Facebook conversations often veer into destructive, anger-driven conflict, and it helps explain why regulators are pushing for reform. Among other changes, regulators are considering demanding that the social media company open up its algorithms, increase its responsibility, and limit its incentive to spread hate.
The immediate changes look set to take place in Europe, starting with my synagogue's hometown of Brussels. Although whistleblower Frances Haugen first testified to the US Congress, she has already spoken with European Commissioner Thierry Breton and plans to visit Europe. She is appearing before the European Parliament on November 8.
European lawmakers are listening. Haugen’s revelations are feeding into two pieces of potentially landmark legislation, the Digital Services and Digital Markets Acts, which are being fast-tracked through the Brussels law-making machinery.
Just as the US boasts Section 230 of the Communications Decency Act, Europe has its e-commerce directive. Both laws were enacted at the birth of the Internet. They set clear limits on liability for digital platforms. Platforms aren't held responsible for content uploaded to their sites. Instead, they are responsible only for taking down illegal material when notified.
The Internet is no longer a baby in need of nurturing, and both the US and Europe are updating these foundational limited-liability rules. Europe's proposed DSA and DMA are now making their way through the Brussels approval process. Together, the new European rules could set a global precedent; regulators everywhere from Tokyo to New Delhi are watching their progress. Much as Europe's GDPR provides global standards for data protection, European policymakers aim to set global standards on illegal content.
As the debate over Facebook demonstrates, it's a tough task. Rules reining in hate speech must be balanced against concerns about hindering free expression. In the European Parliament, parliamentarians have focused on pressing e-commerce marketplaces to curb illegal counterfeits and dangerous products, while tiptoeing around speech platforms.
After the Facebook Files revelations, this could change. The DSA would force platforms hosting user-generated content to share details about their algorithms and content moderation. Facebook and others would be required to conduct annual risk assessments of their spread of misinformation and hate. Violators could face large fines.
The DMA is meant to rein in the largest platforms, including Facebook. While debate remains open on whether the social media giant is dominant, regulators argue that network effects lock in the dominance of certain platforms, which can act as gatekeepers. This creates hurdles for competitors trying to enter the same market. Under the DMA proposal, the gatekeepers would have their actions and acquisitions restricted, their inner workings opened up through a multitude of transparency obligations, and their algorithms examined in detail.
Would these restrictions have helped prevent my synagogue's implosion? I'm not sure. At the beginning of the COVID crisis, the Facebook page played a positive role in helping the community survive when in-person services were canceled. But when the conversations turned toxic, the community's board decided to shutter the site.
Recently, just as the whistleblower's explosive files emerged, the page went back online. This time, it is moderated. Politics is forbidden. And guess what? Almost no one uses or reads it.
Bill Echikson edits CEPA’s Bandwidth section. Before joining CEPA, he worked for Google as a communications senior manager and for Dow Jones as Brussels bureau chief.
November 8, 2021