When law enforcement argues it needs a “backdoor” into encryption services, the counterargument has typically been that it would be impossible to limit such access to one person or organization. If you leave a key under the doormat, a seminal 2015 paper argues, a burglar eventually finds it. And now recent events suggest an even simpler rebuttal: Why entrust a key to someone who gets robbed frequently?
This aptly describes US intelligence services of late. In March, WikiLeaks released nearly 9,000 documents exposing the CIA’s hacking arsenal. More so-called Vault 7 secrets trickled out as recently as this week. And then there’s the mysterious group or individual known as the Shadow Brokers, which began sharing purported NSA secrets last fall. April 14 marked its biggest drop yet, a suite of hacking tools that target Windows PCs and servers to devastating effect.
The fallout from the Shadow Brokers has proven more concrete than that of Vault 7; one of its leaked exploits, EternalBlue, facilitated last month’s WannaCry ransomware meltdown. A few weeks later, EternalBlue and two other pilfered NSA tools helped advance the spread of Petya, a ransomware outbreak that looks more and more like an act of cyberwar against Ukraine.
Petya would have caused damage even without EternalBlue, and the Vault 7 dump hasn’t yet resulted in a high-profile hack. But the fact that all of this has fallen into public hands shifts the nature of the encryption debate from hypothetical concern that someone could reverse-engineer a backdoor, to acute awareness that someone could just steal it. In fact, it should end any debate altogether.
“The government asking for backdoor access to our assets is ridiculous,” says Jake Williams, founder of Rendition Infosec, “if they can’t first secure their own classified hacking tools.”
If you think about the encryption debate at all, it’s likely in the context of the 2016 showdown between the FBI and Apple. The former wanted access to San Bernardino shooter Syed Rizwan Farook’s locked iPhone; the latter argued that writing special code to break its own security measures would set a dangerous precedent.
That case ended in something like a draw. The FBI paid an outside company to break into the iPhone, dropping the court case before either side got a definitive ruling.