The UK Post Office contracted the Japanese multinational Fujitsu to provide accounting software in 1999. Pervasive bugs produced faulty accounts, prosecutors pursued fraud charges against thousands of sub-postmasters who ran local branches, and hundreds were wrongfully imprisoned. The Post Office and Fujitsu kept the bugs from the courts.

Fujitsu and Post Office leadership's decision to ignore, and even suppress, these facts represents a potentially criminal failure that requires urgent remedy. But punishing Fujitsu and the Post Office is not enough. New policies are needed to require companies to report bugs and to submit their findings to independent third-party checks.

Current software disclosure requirements focus on cybersecurity threats, not bugs. The US Securities and Exchange Commission requires public companies to disclose cybersecurity risks, but its rules do not extend to disclosing bugs. In the European Union, the Cyber Resilience Act under consideration concentrates on the disclosure of cybersecurity vulnerabilities.

The US, EU, and UK have adopted "bug bounty" programs for some government-run systems, paying security researchers to identify and disclose flaws in government technologies. But bug bounty programs are agreements between contractor and client, so nothing requires disclosure to a third party. In the Post Office scandal, no one disclosed the bugs.

Media outlets reported on the failures of the Fujitsu system and were ignored. An independent investigation by the forensic accounting firm Second Sight, conducted from 2012 to 2015, exposed software failures. It too was ignored. In testimony to the public inquiry, corroborated by internal emails, Fujitsu developer Gerald Barnes described a range of issues, including a bug that caused accounting errors in the end-of-day tabulation. Sub-postmasters could not spot this error, which explains the accounting discrepancies for which many were investigated and prosecuted.

Bugs are a normal and inevitable feature of software development. Although the companies that develop these technologies have a responsibility to limit them, bug-free software is unrealistic. Contractors provide ongoing servicing to address bugs as they are spotted. Similarly, cybersecurity threats evolve, and companies must keep checking for new problems.

That is what makes transparency important. Companies often do not want to admit to bugs; a bug is a technical failure. While many contracts mandate the reporting of bugs (as they should), companies worry that disclosing or drawing attention to them will create a poor perception of their work and harm their relationships with clients.


Fujitsu and the Post Office violated their legal obligations to provide evidence to the sub-postmasters' defense counsel. That is a moral and legal failure, not a policy failure. But policy can be improved too. Companies should be incentivized to debug their software and to ensure transparent, third-party testing of their work.

Incentivizing debugging isn't just about motivating contractors to address bugs. It's also about incentivizing them to acknowledge what they have fixed. In many cases, companies act in good faith, find bugs, and decide to fix them quietly. They assume the ideal approach is to catch and fix the problem before the client notices. If they do, and the client never knows, then no harm is done, right?

While a tidy rationale, it's wrong. Even if Fujitsu had quietly fixed some of the bugs without disclosing them to the Post Office, sub-postmasters whose accounts already showed shortfalls would still have faced prosecution. Fujitsu and the Post Office had remote access and could alter the software and even the figures in the system, while the sub-postmasters remained in the dark.

Laws must require the reporting of bugs and visible third-party audits. The European Union Agency for Cybersecurity (ENISA) has recently proposed mandatory checks, but only ones handled internally rather than by a third party. Independent third-party auditing should become mandatory.

Policymakers should be careful in implementing fixes. Tech companies and industry associations worry that the EU's proposed Cyber Resilience Act, by requiring vulnerability disclosures, could widen access to information about unpatched flaws, increasing security risks rather than reducing them.

However, these concerns center on cybersecurity risks rather than bugs. That's a problem. Hackers didn't break into the software Fujitsu built for the UK Post Office; the failure was internal, and it landed innocent people in jail. Companies should be forced to come clean.

Joshua Stein recently completed a postdoctoral fellowship at the Georgetown Institute for the Study of Markets and Ethics. His work focuses on ethics, technology, and economics.

Bandwidth is CEPA’s online journal dedicated to advancing transatlantic cooperation on tech policy. All opinions are those of the author and do not necessarily represent the position or views of the institutions they represent or the Center for European Policy Analysis.
