It’s a tough question: how much leeway should governments have to invade individuals’ privacy and read their messages in order to prevent crime?
In the physical world, the limits are clear: no democratic government is permitted to monitor citizens in their homes without a court order, even to prevent domestic violence or child sexual abuse. In the digital world, though, the answer remains unresolved. Child safety advocates believe that governments must be able to unlock private messages, while tech companies and privacy activists see a smokescreen for mass government surveillance.
The standoff is hottest in the UK, which is debating a new Online Safety Bill designed to make tech platforms responsible for the content they host. Section 122 would authorize the digital regulator Ofcom to force messaging apps to scan communications.
The tech industry protested. Messaging services Signal and WhatsApp threatened to pull their services from the country. They warned the UK’s decision would set a dangerous precedent, suggesting that “other jurisdictions will just copy-paste.”
In response, Arts and Heritage Minister Lord Stephen Parkinson recently reassured private messaging apps that they will not have to scan content until it becomes “technically feasible.” Experts suggest it could take years, if ever possible, for messages to be scanned without breaking encryption.
Tech officials applauded — to a point. Meredith Whittaker, the president of Signal, described the government’s move as “very big and very good,” but not a total victory. Will Cathcart, head of WhatsApp, said the company “remains vigilant against threats” to encryption.
Balancing online privacy and child safety frustrates regulators around the globe. In the US, the Senate’s proposed EARN IT Act would force social media platforms to break encryption. In the European Union, the Child Sexual Abuse Regulation would create a similar obligation.
As in the UK, both laws have generated a torrent of opposition. The EARN IT Act has been introduced three times, each time to no avail. In the EU, skeptical parliamentarians are proposing hundreds of amendments to the Child Sexual Abuse Regulation.
Consumers expect end-to-end encryption. It allows them to keep digital conversations private, with the assurance that no third party listens in. As of 2022, two billion people depend on encryption every day. The tech industry is moving toward more, not less, encryption. Messaging app officials insist that it is key to remaining competitive.
While everyone agrees that children ought to be protected, company officials insist it is impossible to decrypt messages without infringing on civil liberties. In 2021, Apple launched a project to scan content in iCloud, only to scrap it over fears of compromising privacy.
Child protection groups insist that the tech companies are exaggerating. They argue that it is possible to scan for child abuse material while keeping the remainder of the content private. A YouGov poll suggests that a majority of UK adults would only support end-to-end encryption “if and when child safety can be ensured.”
A possible compromise is to keep encryption while reinforcing efforts to protect children. Instead of risking privacy, policing should focus on prevention. Technological University Dublin, for example, is leveraging AI to reveal the grooming techniques used by adults to lure and coerce children into sexually exploitative acts.
The UK decision to postpone — not bury — breaking encryption will fail to resolve the fierce global debate. It is merely “kicking the can down the road,” warns Matthew Hodgson, CEO of UK-based Element, which supplies end-to-end encrypted messaging to militaries and governments.
Clara Riedl-Riedenstein is an intern at CEPA’s Digital Innovation Initiative. She is an Oxford University graduate and an incoming MA student in political science at Columbia University. Bill Echikson is a non-resident CEPA Senior Fellow and editor of Bandwidth.
Bandwidth is CEPA’s online journal dedicated to advancing transatlantic cooperation on tech policy. All opinions are those of the author and do not necessarily represent the position or views of the institutions they represent or the Center for European Policy Analysis.