add my name to the list

At the tail end of last year, Crispin Robinson and Ian Levy of GCHQ published a co-authored essay on “suggested” ways around the “going dark” problem that strong encryption in messaging poses to agencies such as GCHQ and its (foreign) national equivalents. In that essay, the authors were at pains to state that they were not in favour of weakening strong encryption; indeed, they said:

The U.K. government strongly supports commodity encryption. The Director of GCHQ has publicly stated that we have no intention of undermining the security of the commodity services that billions of people depend upon and, in August, the U.K. signed up to the Five Country statement on access to evidence and encryption, committing us to support strong encryption while seeking access to data. That statement urged signatories to pursue the best implementations within their jurisdictions. This is where details matter, so with colleagues from across government, we have created some core principles that will be used to set expectations of our engagements with industry and constrain any exceptional access solution. We believe these U.K. principles will enable solutions that provide for responsible law enforcement access with service provider assistance without undermining user privacy or security.

They went on to outline what they called six “principles” to inform the debate on “exceptional access” (to encrypted data).

These principles are:

  • Privacy and security protections are critical to public confidence. Therefore, we will only seek exceptional access to data where there’s a legitimate need, that access is the least intrusive way of proceeding and there is appropriate legal authorisation.
  • Investigative tradecraft has to evolve with technology.
  • Even when we have a legitimate need, we can’t expect 100 percent access 100 percent of the time.
  • Targeted exceptional access capabilities should not give governments unfettered access to user data.
  • Any exceptional access solution should not fundamentally change the trust relationship between a service provider and its users.
  • Transparency is essential.

(I particularly like that last one.)

On first reading, the paper seems reasonable and unexceptional (which is probably what it was designed to do). It argues against direct attacks on end-to-end encryption itself and instead advocates the insertion of an additional “end” to the encrypted conversation. So when Bob talks to Alice over his “secure” device, he would actually be talking to Alice and Charlie, where Charlie had been added to the conversation by the device manufacturer or service provider, and the notification to Bob (or Alice) of that addition would be suppressed so that neither would know of the eavesdropping.
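
To make that concrete, here is a minimal toy sketch of why the ghost works without touching the cryptography. Everything in it is hypothetical (the KeyServer and Device classes, the silent flag, and the placeholder “encryption” are illustrations, not any real provider’s protocol): the point is simply that the provider controls the list of devices a message gets encrypted to, so adding one more “end” and skipping the notification is a policy change, not a crypto change.

```python
# Toy model of the "ghost" change -- hypothetical names, placeholder
# "encryption". The structure, not the cipher, is what matters here.
import secrets


class Device:
    def __init__(self, owner):
        self.owner = owner
        self.public_key = secrets.token_hex(8)  # stand-in for a real identity key
        self.inbox = []

    def notify(self, text):
        print(f"[{self.owner}'s screen] {text}")


class KeyServer:
    """The provider's identity / key-distribution service."""

    def __init__(self):
        self.conversations = {}  # conversation id -> list of Devices

    def members(self, conv_id):
        return self.conversations[conv_id]

    def add_member(self, conv_id, device, silent=False):
        self.conversations[conv_id].append(device)
        if not silent:  # the ghost proposal: suppress exactly this notification
            for d in self.conversations[conv_id]:
                d.notify(f"{device.owner} was added to the chat")


def send(server, conv_id, sender, plaintext):
    # The end-to-end encryption itself is untouched: the sender encrypts one
    # copy per device in whatever recipient list the provider hands back.
    for device in server.members(conv_id):
        ciphertext = ("from", sender.owner, "encrypted-for", device.public_key, plaintext)
        device.inbox.append(ciphertext)


alice, bob = Device("Alice"), Device("Bob")
charlie = Device("Law enforcement")

server = KeyServer()
server.conversations["chat-1"] = [alice, bob]

server.add_member("chat-1", charlie, silent=True)  # no notification to Alice or Bob
send(server, "chat-1", alice, "meet at 6")

print(len(charlie.inbox))  # 1 -- Charlie gets a copy, still "end-to-end encrypted"
```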

This is what they said:

So, to some detail. For over 100 years, the basic concept of voice intercept hasn’t changed much: crocodile clips on telephone lines. Sure, it’s evolved from real crocodile clips in early systems through to virtual crocodile clips in today’s digital exchanges that copy the call data. But the basic concept has remained the same. Many of the early digital exchanges enacted lawful intercept through the use of conference calling functionality.

In a world of encrypted services, a potential solution could be to go back a few decades. It’s relatively easy for a service provider to silently add a law enforcement participant to a group chat or call. The service provider usually controls the identity system and so really decides who’s who and which devices are involved – they’re usually involved in introducing the parties to a chat or call. You end up with everything still being end-to-end encrypted, but there’s an extra ‘end’ on this particular communication. This sort of solution seems to be no more intrusive than the virtual crocodile clips that our democratically elected representatives and judiciary authorise today in traditional voice intercept solutions and certainly doesn’t give any government power they shouldn’t have.

We’re not talking about weakening encryption or defeating the end-to-end nature of the service. In a solution like this, we’re normally talking about suppressing a notification on a target’s device, and only on the device of the target and possibly those they communicate with. That’s a very different proposition to discuss and you don’t even have to touch the encryption.

Neat huh? No need to go to all the bother of a crypto attack, key escrow, or any of the “magic thinking” around weakened encryption. Who could possibly object to that?

Well, lots of people could, and many did just that.

The Open Technology Institute worked to coordinate a response from an international coalition of 47 signatories, including 23 civil society organizations that work to protect civil liberties, human rights and innovation online; seven tech companies and trade associations, including providers that offer leading encrypted messaging services; and 17 individual experts in digital security and policy. Those signatories included Big Brother Watch, the Center for Democracy & Technology, the Electronic Frontier Foundation, the Freedom of the Press Foundation, Human Rights Watch, Liberty, the Open Rights Group, Privacy International, Apple, Google, Microsoft, WhatsApp, Steven M. Bellovin, Peter G. Neumann of SRI International, Bruce Schneier, Richard Stallman and Phil Zimmermann, amongst others.

On May 30th 2019, they published an open letter to GCHQ setting out their concerns about the proposals. In that letter they outlined:

how the “ghost proposal” would work in practice, the ways in which tech companies that offer encrypted messaging services would need to change their systems, and the dangers that this would present. In particular, the letter outlines how the ghost proposal, if implemented, would “undermine the authentication process that enables users to verify that they are communicating with the right people, introduce potential unintentional vulnerabilities, and increase risks that communications systems could be abused or misused.” If users cannot trust that they know who is on the other end of their communications, it will not matter that their conversations are protected by strong encryption while in transit. These communications will not be secure, threatening users’ rights to privacy and free expression. (my emphasis)
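
That authentication point is worth making concrete. Apps such as Signal and WhatsApp let users verify who they are talking to by comparing “safety numbers” or “security codes” derived from the participants’ identity keys. The rough sketch below (hypothetical names, and a deliberately simplified single fingerprint over every key in the conversation rather than the per-pair codes real apps use) shows why a silently added key cannot stay hidden from an honest client: the ghost only works if the client is changed to misreport who holds the keys.

```python
# Simplified sketch of key verification -- not any real app's scheme.
import hashlib


def safety_number(identity_keys):
    """Fingerprint derived from every identity key in the conversation;
    loosely what users compare when they 'verify' a contact or chat."""
    material = "|".join(sorted(identity_keys)).encode()
    return hashlib.sha256(material).hexdigest()[:12]  # shortened for display


alice, bob, ghost = "alice-identity-key", "bob-identity-key", "ghost-identity-key"

before = safety_number([alice, bob])
after = safety_number([alice, bob, ghost])

print(before == after)  # False: an honest client would show that something changed
# So the ghost stays hidden only if the client is modified to keep displaying
# the old number (or to skip the ghost key when computing it) -- that is, the
# verification mechanism itself has to be made to lie to the user.
```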

They went on to say:

  • The Proposal Creates Serious Risks to Cybersecurity and Human Rights.
  • The Proposal Would Violate the Principle That User Trust Must be Protected.
  • The Ghost Proposal Would Violate the Principle That Transparency is Essential.

They concluded that GCHQ should:

abide by the six principles they have announced, abandon the ghost proposal, and avoid any alternate approaches that would similarly threaten digital security and human rights.

Additionally, Jon Callas at the ACLU has published a series of four essays that break down the fatal flaws in the proposal. Those essays are well worth reading in themselves, but so are all the additional papers (by people such as Steven Bellovin, Matt Blaze, Susan Landau, Whitfield Diffie, Seth Schoen, Nate Cardozo and many others) that they point to.

So: back in your box Levy, no-one wants your shitty little stick.

Permanent link to this article: https://baldric.net/2019/07/10/add-my-name-to-the-list/