In an Oct. 10 speech at the U.S. Naval Academy in Annapolis, Maryland, Deputy Attorney General Rod Rosenstein made a case to step back from what the tech industry generally sees as an advance in security: “warrant-proof” encryption on devices that even court-authorized investigators can’t unlock.
Instead, he urged tech firms to adopt “responsible encryption”—as in, the kind “that allows access only with judicial authorization.”
As examples, Rosenstein pointed to “the central management of security keys and operating system updates” and “key recovery when a user forgets the password to decrypt a laptop.”
But granting that seemingly innocuous request could start to carve giant holes into your phone’s security.
You’ve seen this movie before
Rosenstein’s plea did not represent a new development. Past officials at Justice have said much the same thing, and President Obama used similar language last March at the SXSW conference.

The standoff between Apple (AAPL) and the FBI last year over an iPhone 5c used by one of the San Bernardino attackers remains the primary exhibit here: Police fear that if they can’t unlock an encrypted device, they will miss important evidence.
Vendors like Apple and Google (GOOG, GOOGL), however, have customers who want secure devices, and keeping a backup key on the shelf for police and prosecutors thwarts that.
So, iOS and Android now encrypt a phone’s storage with a key that never leaves the device. Apple, Google, Facebook (FB) and others also offer messaging apps that can encrypt a conversation from end to end with keys confined to individual devices.
Advocates for preserving law-enforcement access generally don’t demand a particular back door into an encrypted system — unlike almost 25 years ago, when the Clinton administration tried to mandate a backdoored government “crypto” standard. They simply ask that the industry do something, anything, to let police do their job.
Two possible, problematic solutions
It’s easy to mock that vague demand as a case of Washington begging tech firms to “nerd harder.” But let’s consider two specific solutions.

The most common suggestion is a form of the centralized “mobile device management” systems that organizations like the Federal Bureau of Investigation employ to control employee devices.
But consumer markets are far larger.
“Nobody has ever done this at anywhere near the scale we’re discussing,” explained Johns Hopkins University cryptography professor Matthew Green. “Apple alone has a billion active devices, and they have a *minority* of the smartphone market share.”
In a post last summer on the Lawfare blog, Matt Tait — formerly a technical specialist at the U.K.’s Government Communications Headquarters intelligence agency, now a cybersecurity fellow with the University of Texas at Austin — outlined a decentralized approach.
Imprison a decryption key on a phone, he wrote, in a series of encrypted envelopes each secured with a different key—perhaps one from the FBI, one from Apple, one from a digital-liberties group like the Electronic Frontier Foundation. All would have to cooperate to open the device.
“I think this remains the safest technical solution,” Tait said.
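Tait’s layered-envelope idea can be sketched in a few lines of Python. The party names below are hypothetical examples, and for simplicity the sketch uses one-time-pad XOR in place of a real cipher; it only illustrates the structural point that every key-holder must cooperate to unwrap the device key.

```python
import secrets

def xor(a: bytes, b: bytes) -> bytes:
    """XOR two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

# The phone's decryption key that investigators would want to recover.
device_key = secrets.token_bytes(32)

# Each party holds its own independent secret (here, a one-time pad).
# Hypothetical parties, following Tait's example: the FBI, Apple,
# and a digital-liberties group.
party_keys = {name: secrets.token_bytes(32) for name in ("fbi", "apple", "eff")}

# Wrap the device key in nested "envelopes": one layer per party.
envelope = device_key
for pad in party_keys.values():
    envelope = xor(envelope, pad)

# Recovery works only when every party contributes its key...
recovered = envelope
for pad in party_keys.values():
    recovered = xor(recovered, pad)
assert recovered == device_key

# ...and fails if any single party withholds its key.
partial = envelope
for name, pad in party_keys.items():
    if name != "eff":
        partial = xor(partial, pad)
assert partial != device_key
```

A production design would use an authenticated cipher rather than raw XOR, but the escrow property is the same: the envelope alone reveals nothing, and no subset of the parties can open it.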
But Rich Mogull, CEO of the security firm Securosis, says any such system would be either brittle or complex.
“We are talking about either a small number of keys used for billions of devices, or some complex and fragile registry for a high number of keys,” he wrote. And as investigators inevitably demand their use in “criminal and civil cases big and small,” some will get compromised.
Expect more reruns of this debate
And yet device encryption is the easier scenario. As Tait wrote in another Lawfare post, two U.S. companies dominate mobile operating systems, and almost nobody installs alternate software.

The market for messaging apps, however, goes far beyond big-name choices like Facebook Messenger. Many competitors are open-source software, leaving no single developer to target with a law or a warrant.
“End-to-end encryption is another beast entirely,” Tait said.
Meanwhile, iOS and Android phones continue to ship in massive numbers with encryption enabled. Why would Apple or Google backtrack on security on their own?
The government asking nicely won’t help, not when much of the tech industry fears the Trump administration for its stances on issues like immigration and net neutrality — and while increasing warrantless searches of devices at U.S. borders remind them of the importance of encryption.
It would take an act of Congress, literally, to compel these companies to do such a thing. But if there’s one thing Congress can’t do today, it’s enact major changes to tech policy.
That leaves one last option, one the FBI tried in the San Bernardino iPhone case: Compel Apple or Google to push a software update to a phone to disable its security.
That remains a possibility, and it has to: Somebody must control software updates for them to be secure at all. But government exploitation of that single point of failure worries security experts too.
“Just one instance like that would essentially ruin all the educational efforts folks like me engage in to improve domestic cybersecurity,” said Joseph Lorenzo Hall, chief technologist with the Center for Democracy & Technology. Some people would then distrust bug-fix updates — and, by declining them, get stuck with less-secure devices.
It’s a fear we’ll apparently have plenty of time to ponder as this argument grinds ever onward.
“It never ends,” Mogull said. “This is a massive social debate that will likely not be fully resolved in our lifetimes.”