As the work of the new Trade and Technology Council (TTC) gets underway to foster trade and extend technology ties, the U.S. and the European Union (EU) should use the summer to realize that they are following different paths to the same objectives. They must not get bogged down by distractions but should build on common ground and shared goals.
One glaring difference between these two great trading powers lies in their contrasting conceptual approaches in key areas. Take data governance and technology platforms, an issue covered by one of the TTC’s 10 working groups. The U.S. and EU approaches differ on three main issues: liability for third-party content, rules for content removal, and democratic oversight.
The U.S. governs its online space according to what may be termed a free-speech maximalist approach rooted in the First Amendment, which bars the government from enacting laws that curtail freedom of speech. Under Section 230 of the Communications Decency Act, online platforms enjoy broad immunity for content posted by third-party users (with exceptions for intellectual-property claims and for material that violates federal criminal statutes, such as those on child sexual abuse or sex trafficking) and for the good-faith removal of objectionable content.
In the EU, online intermediary services must remove illegal content as soon as they learn of it — otherwise, their liability shield may fall. The EU Digital Services Act (DSA), tabled last December, introduces a clear mechanism for doing this but leaves the definition of illegality to EU member states unless it is already covered by other EU laws. One could compare the DSA to safety rules in a concert hall: you need enough emergency exits and fire extinguishers, but the rules don’t tell you what type of music to play. Despite concerns about a negative impact on freedom of expression online, some scholars have pointed to the DSA approach as a possible template for U.S. reforms.
So how can the DSA meet its goals of safety and freedom without harming free speech or limiting innovation?
First, the DSA sets rules on the transparency of content moderation decisions and gives users tools to challenge decisions that affect them. Online platforms will need to introduce easy-to-use mechanisms allowing anyone to notify them of illegal content, and they must explain why they did or did not remove it. Nor will they be able simply to remove content or suspend accounts unless they clearly define such restrictions in their terms and conditions — and even then, such decisions will have to be balanced against the right to freedom of expression. Users will be able to challenge these decisions: they can complain directly to the platform, turn to an independent arbiter, or go to court.
Second, the DSA gives independent researchers access to platforms’ data so that they can assess the risks the platforms’ services pose to society and fundamental rights.
Third, the DSA requires the biggest platforms — those de facto public spaces with more than 45 million monthly users — to regularly assess the risk of misuse of their services. They will face much greater public scrutiny and will be subject to independent audits and to regulatory oversight by national supervisory authorities and, under certain conditions, by the European Commission, supported by a Digital Services Board of member state supervisors.
The DSA will be born into a difficult digital world. Several EU countries have already adopted their own rules (and others may follow suit), fragmenting the EU single market and undermining opportunities to scale up. As the European Parliament reviews the DSA and EU member states debate it, it will be important that the proposed system maintain a consistent set of rules under which no one can bend freedom of speech as they please.
In the U.S., discussions about reforming the current regime also continue. Many of the 40 bills introduced in Congress since the beginning of 2020 might, if enacted, produce unintended consequences for human rights and freedom of expression. For example, a wholesale elimination of platforms’ safe harbor from liability for user content, as some bills envisage, would be at odds with the DSA, which keeps liability exemptions relatively broad and clarifies obligations for different services in a proportionate manner.
What, then, can be done? The DSA could be a good starting point for the TTC working group to identify areas of common interest in tech governance. Discussions could focus on transparency of content moderation and accountability for its outcomes, to improve the user experience online and increase public scrutiny. The U.S. and the EU could consider working together on functional aspects such as notice-and-action systems (i.e., mechanisms that allow users to report illegal content) and targeted access to data, for the benefit of users everywhere, not only in their own jurisdictions.
Ultimately, both the U.S. and the EU want an open, global, and safe internet rather than the state-controlled internet sought by China, Russia, or Turkey. Both protect freedom of speech and want users empowered in the online space. Neither wants foreign interference in its democratic processes, the spread of disinformation, or the sale of illegal products. As we enter a new post-pandemic era of digitalization, the U.S. and EU should work together to offer the world a human-centric alternative to autocratic and repressive regimes — and to the way those regimes use technology to pursue their agendas.