Director Calls For ‘Responsible’ Solution That’s Not a Backdoor
FBI Director Christopher Wray says the agency was unable to access nearly 7,800 devices in fiscal 2017 because of encryption, which he alleges will pose ever-increasing complications for law enforcement.
The agency had both the legal authority and the technical tools at hand to try to extract the information, but failed. While Wray says the FBI supports strong encryption, he maintains that it shouldn't undermine lawful access to data.
“Each one of those nearly 7,800 devices is tied to a specific subject, a specific defendant, a specific victim, a specific threat,” says Wray, who spoke Monday at the International Conference on Cyber Security at Fordham University in New York. His prepared statement was posted on the FBI’s website.
Wray's comments are in line with those of his predecessor, former director James Comey, who was dismissed by President Donald Trump last May.
“While the FBI and law enforcement happen to be on the front lines of this problem, this is an urgent public safety issue for all of us,” Wray says. “Because as horrifying as 7,800 in one year sounds, it’s going to be a lot worse in just a couple of years if we don’t find a responsible solution.”
What the FBI would like is a way for either technology companies or third parties to be able to decrypt the information. It’s a concept that’s anathema to cybersecurity professionals, who contend that it would create new hacking windows for cybercriminals and nation states.
“Shame on you, Wray,” writes Timo Laaksonen, F-Secure’s head of operator sales in North America, on Twitter. “If anything, breakable encryption would be a serious public safety issue.”
The ‘Not A Backdoor’ Backdoor
The U.S. government sought in the 1990s to incorporate a so-called “Clipper chip” into telecommunications equipment, which was essentially a backdoor. Encryption keys used to secure transmissions would be retained in escrow, either by the government or another third party.
The plan was met with widespread skepticism over its security. Those suspicions were confirmed around 1994 when Matt Blaze – now an associate professor in the computer and information science department at the University of Pennsylvania – showed a key Clipper chip protocol was flawed, according to a 2015 paper by the Open Technology Institute. By the late 1990s, the Clipper chip was dead.
Wray maintains that an innovative solution could be developed: "Surely we should be able to design devices that both provide data security and permit lawful access with a court order," he says.
“We’re not looking for a ‘back door’ – which I understand to mean some type of secret, insecure means of access,” he says. “What we’re asking for is the ability to access the device once we’ve obtained a warrant from an independent judge, who has said we have probable cause.”
In 2016, the FBI dropped a legal challenge against Apple, which refused to create a special version of its iOS mobile operating system to let investigators access an iPhone used by one of the San Bernardino shooters (see Legal Issues Persist as FBI Backs Off in iPhone Case).
The FBI dropped its challenge after it obtained access to the device, leaving the legal question of whether the government could compel software developers to undermine their security defenses undetermined.
Technology companies such as Apple, Google and Facebook have sought to strengthen how their products encrypt content and wherever possible ensure that service providers themselves hold no encryption keys. That impetus was sparked in part by increasing cybercriminal activity as well as former National Security Agency contractor Edward Snowden’s disclosures in 2013, which revealed the intelligence agency’s large-scale surveillance efforts (see New Snowden Leak Details NSA Collection Program).
As a result, many law enforcement agencies must either guess passwords, obtain those passwords from a suspect, or as a last resort use technical methods such as software exploits to gain access to data.
But Wray cites an interesting example of how private companies have handled one chat application, Symphony, to address regulatory concerns. Symphony communications are encrypted, with the keys held by the entity running the application.
Wray says that New York's Department of Financial Services reached an agreement with four banks to log chats and communications of employees using the app for seven years. The banks also store copies of decryption keys with an independent custodian to ensure that regulators can access the data.
“So the data in Symphony was still secure and encrypted – but also accessible to regulators, so they could do their jobs,” Wray says.
That's key escrow. But there's a big difference between what the banks implemented and requiring the same of software applications deployed to the general population.
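The escrow arrangement Wray describes can be sketched in a few lines. This is a hypothetical toy illustration, not real cryptography (the XOR "cipher" below would never be used in practice): communications are encrypted under a data key, and a copy of that key is deposited with an independent custodian who can release it to regulators with lawful access.

```python
import os
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    # Toy keystream built by repeated hashing -- illustration only,
    # NOT a secure cipher.
    out, block = b"", key
    while len(out) < length:
        block = hashlib.sha256(block).digest()
        out += block
    return out[:length]

def xor_crypt(key: bytes, data: bytes) -> bytes:
    # XOR with the keystream; applying it twice recovers the plaintext.
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

# The bank encrypts an employee chat log under a data key.
data_key = os.urandom(32)
ciphertext = xor_crypt(data_key, b"chat transcript")

# Key escrow: the bank keeps the key, and an independent custodian
# holds a second copy for regulators.
custodian_copy = data_key

# A regulator with lawful access obtains the escrowed key and decrypts.
recovered = xor_crypt(custodian_copy, ciphertext)
assert recovered == b"chat transcript"
```

The security debate centers on that custodian copy: critics argue any repository of escrowed keys becomes a high-value target, which is what distinguishes this closed, regulated deployment from a scheme imposed on the general public.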
Employers generally have the right to monitor employees' communications on company networks. But requiring key escrow for the general public raises a host of privacy concerns, including the risk of repeated, unchecked governmental overreach of the type Snowden's whistleblowing revealed.
Or as Kurt Opsahl, executive director and general counsel of the Electronic Frontier Foundation, said via Twitter: “This statement is true – strong encryption does help make the public safer. Director Wray’s reasoning and call for weaker crypto, however, is exactly wrong.”