The FBI’s apparent desire to require all communications service providers to design a means for law enforcement to access encrypted communications in plain text could have negative effects on personal privacy and industry innovation. Computer scientists, however, concentrate on a different concern: the danger that such design mandates could harm cybersecurity. In an ironic twist, the FBI’s well-intentioned attempt to ease its own access to suspicious phone calls might well provide criminals and others easy access to computer networks and the rich data they process.
Whenever communications providers of any description build a back door for law enforcement entry, there is a danger that other parties will sneak through without permission. Many countries, for example, have laws requiring that phone companies provide law enforcement access to the data that crosses their networks. In the United States, that law is the Communications Assistance for Law Enforcement Act, or CALEA. However, the law enforcement entryways built by equipment manufacturers to satisfy the government sometimes spring open when clever hackers knock in the right way.
In 2004, a group of unknown individuals exploited the wiretapping back door in one of the major Greek cellular phone carriers. The exploiters spent months listening in on the calls of the prime minister of Greece, other senior officials, and the staff of the U.S. embassy in Athens, among others. Although the breach was eventually uncovered, and only by accident, the hackers were never caught. Bruce Schneier, a leading expert on cryptography and security who has followed the case closely, notes that this is just one example demonstrating that the “risk is not theoretical.”
Now, the FBI seems to be proposing to extend these back doors to many more services that are not currently covered by CALEA.
CALEA applies only to telecommunications companies, broadband access providers, and certain types of VoIP (Voice over Internet Protocol) providers. The FBI now wants to require not just networks, but also communications application providers to turn over plain text to law enforcement on request. This would add back door design requirements to all the applications that run on top of those networks, such as Facebook, Skype, and email. Such a requirement would present many more tempting targets to bad actors.
To understand why, picture CALEA as an anti-money-laundering statute that requires banks to build a special back door directly into their vaults so that the FBI, if it wants to, can walk in and look through your safety deposit box for unexplained cash. This extra door is both a target for criminals and a temptation for bank employees (it makes it easier to grab your money and walk out), but at least it’s in a bank – the owners can try to mitigate these problems by hiring better security.
Suppose now the FBI is unsatisfied and wants to crack down harder, so it starts looking for better ways to track money. Then, the FBI asks wallet manufacturers to build in tiny cameras that keep track of the money you carry at all times so that it can take a peek remotely (when it has a court order, of course). The problem is that while information about what’s in your wallet might be valuable to the FBI, it’s even more valuable to criminals, who will know exactly who is carrying lots of cash and therefore makes a tempting target. Because securing thousands of wallets is much harder than securing a bank vault, access to those cameras will be very hard to protect.
There are other, less risky answers to the FBI’s problem. Instead of putting a camera into everyone’s wallet, they should do what they have always done in the past – get a warrant to go grab the wallet of the person of interest. In the communications context, this does not necessarily require placing hands on the computer or cell phone of the target. Instead, it may mean getting a warrant to place a bug on that computer or cell phone – exploiting the device. Such a solution would give law enforcement access only to the devices it needs without increasing the vulnerability of all devices.
It is important to note that the marketplace for communications software and hardware is global. If back doors are built into American products, the keys to those doors will also be shared with law enforcement in other countries. Those other countries may then use those capabilities not for legitimate law enforcement purposes, and not with court approval, but to stifle democratic movements, free of any checks and balances. In addition, we might find these back door capabilities turned back against our own companies, our own government, and our own citizens.
Finally, the U.S. is one of the leading nations in cyber-exploitation, with a high level of expertise in gaining access to communications systems without the need for special back doors. By building back doors into common communications services, we reduce our advantage in the global cybersecurity context, opening our own communications to nations that would otherwise be unable to read them.
This article was written by Joshua Gruenspecht of the Center for Democracy & Technology.