Hewitt is the President and Executive Director of the Lawyers’ Committee for Civil Rights Under Law. He spoke with CEPA after the meeting. The transcript below has been edited for clarity.
What happened at the listening session? What was your perception of it? What do you think of the principles that came out of it, and how did the Lawyers’ Committee influence that work?
Civil rights should not abate just because you enter an online environment. Rights that apply in in-person interactions should also apply online. That is our basic principle.
At the White House, I lifted up a couple of pieces. The first is that we believe privacy rights are civil rights. In recent years, we have seen too many instances of people of color being surveilled, whether by the state, white supremacists, or anti-abortion protestors. The Internet, and certain tech platforms in particular, function as a virtual “superhighway” for connecting, transferring, and spreading white supremacist ideology.
An internal memorandum shows that Facebook (Meta) understands its own platforms cause a problem by using algorithms to connect people with white supremacist ideology. We also know that YouTube, which is owned by Google, has become a haven for white supremacist content. Platforms must stop being a superhighway for connecting white supremacists with each other and connecting others to their racist ideology.
How were these messages received? What do you think is the evolving perception of the Biden administration on these issues?
I was the only civil rights voice in the room. There were people from big tech, but forward-looking big tech, as well as other experts and advocates.
Number one, I sensed a real curiosity in understanding and diagnosing problems. I was able to lift up some of the commonly reported things we’ve seen, like crime prediction algorithms that overpredict the “dangerousness” of Black and Latinx people and underpredict that of white people – it’s like magnifying and weaponizing private biases and stereotypes, even if it’s unintentional. They were also interested in what the policy prescriptions could be, and I referenced the American Data Privacy and Protection Act (ADPPA) currently being debated in Congress.
We were very pleased with the principles that came out of the session. There was a strong emphasis on privacy, and while the ADPPA was not explicitly named, administration representatives mentioned that there is strong bipartisan interest in privacy legislation.
A few days later at the White House’s United We Stand Summit, President Biden’s remarks went exactly where we want to hear him go. He talked about tech platform accountability as critical to not only a whole-of-government response, but a whole-of-society response to stopping white supremacist ideology.
The question is whether we can get past the tipping point and achieve meaningful policy adoption and implementation. It is a test of political will and muscle at a time when the nation and Congress are polarized – a time in which there is nearly bipartisan consensus, yet still relative stalemate because of so many competing priorities.
How would you reform the Section 230 rules, which give tech companies broad immunity from liability?
Section 230 as it stands does not – and should not – shield tech platforms from all accountability. We believe that decisions the platforms themselves make, such as which ads to serve through an algorithm and which content to amplify, are not protected by Section 230. It only shields a platform from liability when the platform is being held liable for the actions of a third party.
As for the SAFE TECH Act and the elimination and abrogation of Section 230, I know it’s a complicated thing. The Act seeks to bar platforms from evading responsibility for claims made online that result in real-world harms. Some have raised fears that Black Lives Matter activists could somehow find themselves subject to liability if the Act passes.
What is important is to balance the interests. In my view, freedom and privacy are not inherently fraught with tension. The right to free speech should not trump civil rights. We need to strike a balance between freedom and privacy, but the balance doesn’t mean 50/50. That balance, to me, means that anyone’s speech – individuals, civil society, corporations – cannot trump civil rights and liberties.
In Europe, there are similar concerns such as moderating hate speech in Germany and the Digital Services Act removing some forms of intermediary liability. Are you taking inspiration from them?
There should be a multilateral approach because the Internet knows no international borders. Nor does hate. If the US has been bombarded by Russian bots to influence racial justice movements as well as elections, the sources of racially targeted content are not always domestic.
We are growing in appreciation for the parallels. I have been interested in looking at how, after Brexit, the EU standard for GDPR may not track the UK standard.
I’m interested in what the iterations and variations may be, because if there can be some common baselines globally, that is to everyone’s benefit.
Do you have one last message for US policymakers?
If you believe that our democracy is under attack or at stake from things like voter suppression, the January 6th insurrection, and mass shootings, you should also be worried about what is happening on the Internet and at least be open to more regulation and clear guardrails.
Damon T. Hewitt is the President and Executive Director of the Lawyers’ Committee for Civil Rights Under Law. Hewitt has more than 20 years of civil rights litigation and policy experience in the fight for racial justice, including prior leadership roles in the nonprofit, philanthropic, and public sectors.
Bandwidth is CEPA’s online journal dedicated to advancing transatlantic cooperation on tech policy. All opinions are those of the author and do not necessarily represent the position or views of the institutions they represent or the Center for European Policy Analysis.