Weapons of Digital Destruction? The Grey Areas of Regulating Smartphone Encryption

By Connie Kwong

Source: CBS SF

For most of us, typing in a password to unlock and use our smartphones is nothing more than second nature. We don’t think about the complex encryption programs that ensure the texts, photos, call history, contacts, browsing history, and other information on our phones stay safe and private. And we’re far less likely to consider how this seemingly simple action comes with substantial political implications. But new legislation introduced in California might change that.

On Jan. 20, California State Assembly Bill 1681 was introduced. Under AB 1681, any smartphone manufactured on or after Jan. 1, 2017 and sold in California would have to be capable of being decrypted and unlocked by its manufacturer or operating system provider. Any seller or lessor who fails to comply would face a $2,500 fine for each noncompliant smartphone, and the bill would also prohibit sellers or lessors from passing any portion of that fine on to purchasers through higher prices.

According to the bill’s author, Democratic Assemblyman Jim Cooper, encryption poses an insurmountable roadblock in the legal process. “You have a murder victim and they are dead, and their phone is locked. Of course as a detective I would want to know who they talked to last, but I can’t access that phone?”

Similar concerns have been raised about how encryption can complicate human trafficking, child pornography, terrorism, and other criminal cases. A similar smartphone encryption ban is making the rounds in the New York state legislature. Meanwhile, U.S. Senator Dianne Feinstein plans to introduce legislation requiring companies to provide access to encrypted data when presented with a court order.

Cooper’s bill emerges in the aftermath of last December’s tragic San Bernardino shooting. The FBI is still unable to unlock the shooters’ phones to search for possible evidence regarding their motivations. It’s true that terrorists and other criminals can take advantage of encryption tools to plan their heinous acts. Indeed, in a perfect (and simpler) world, obtaining a warrant for a suspect’s phone would be the only thing to check off the list before searching it. But Cooper’s bill also emerges just three years after Edward Snowden blew the whistle on the NSA’s surveillance programs. Against that backdrop, AB 1681 is perhaps too simplistic in its understanding of the digital space and privacy. Andrew Crocker, staff attorney for the digital civil liberties non-profit Electronic Frontier Foundation, points out that while politicians like Cooper have good motives, “They aren’t taking in the whole set of facts or they wouldn’t introduce bills like this.”

So what is the “whole set of facts”? The question probably warrants an extremely long answer, but let’s at least consider some of the most important talking points on encryption. For starters, even if AB 1681 passes, that doesn’t guarantee it will survive legal challenges, much less be easily enforced. U.S. Congressman Ted Lieu (D-CA) has authored the Ensuring National Constitutional Rights for Your Private Telecommunications (ENCRYPT) Act, which would bar states from mandating that smartphone manufacturers, developers, or sellers alter their devices’ security features or decrypt them. Lieu’s bill invokes the Constitution’s Commerce Clause, asserting that individual states have no business regulating devices that can be bought and used anywhere in the country.

Also, Assemblyman Cooper’s bill doesn’t even specify what kind of encryption would be banned, a glaring flaw that would likely cause inefficiencies in enforcement. There are important differences between the three main cryptographic techniques usually lumped together as “encryption”: 1) hashing (strictly a one-way function), 2) symmetric encryption, and 3) asymmetric encryption. Judging from many legislators’ voiced concerns, end-to-end encryption, which is built on asymmetric encryption, seems to be the problem child. End-to-end encryption works such that only the intended recipient of a message can decrypt it, and no one in between. The mechanism is analogous to a lockbox left on the doorstep for the UPS delivery man: the delivery man (or anyone, theoretically) can use the public key to lock the item inside, but only the recipient holds the private key to open the box. On phones and computers, these “keys” are mathematically generated values that programs use to encrypt and decrypt data.
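
To make the lockbox analogy concrete, here is a minimal sketch of asymmetric encryption in Python using the third-party cryptography package (an illustration only; the key size, padding choices, and message are arbitrary, and this is not how any particular messaging app implements end-to-end encryption). Anyone holding the public key can lock a message up, but only the holder of the private key can read it.

```python
# Minimal asymmetric-encryption sketch (illustrative only, not any vendor's
# actual end-to-end protocol). Requires: pip install cryptography
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

# The recipient generates a key pair and publishes only the public key.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# Anyone (the "delivery man") can lock a message with the public key...
ciphertext = public_key.encrypt(b"meet at noon", oaep)

# ...but only the private-key holder can unlock it.
assert private_key.decrypt(ciphertext, oaep) == b"meet at noon"
```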

Some would argue that the current encryption standoff stems from how recently the landscape changed: until just a few years ago, virtually all smartphones could be decrypted by their manufacturers or operating system providers. If law enforcement had a search warrant, the manufacturer or operating system provider could unlock the phone and turn over its contents. But since 2014, many newer smartphones and tablets encrypt their storage by default, so that the data can only be accessed with the password set by the user. Many of these protections can also be added to older models that update to the new operating system software. Apple, for example, assures its customers that iMessage and FaceTime are protected by end-to-end encryption and can’t be accessed without the user’s password; and while iMessage and SMS messages can be backed up to iCloud, FaceTime calls are not stored on servers. As a result, companies can’t unlock the protected material. Even if an officer has a search warrant for the phone, the data can only be accessed if the suspect surrenders the password.
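
A rough sketch of why “only the user’s password unlocks the data” is more than a policy choice: in password-based designs like this, the encryption key is derived from the passcode itself, so there is nothing for the manufacturer to hand over. The passcode, salt, and data below are made up, and real devices also mix in hardware-bound keys, which makes outside access even harder.

```python
# Sketch of passcode-derived encryption (hypothetical values; real devices
# also bind keys to hardware). Requires: pip install cryptography
import base64, os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC
from cryptography.fernet import Fernet

passcode = b"123456"     # known only to the user
salt = os.urandom(16)    # stored on the device; not secret

# Derive the actual encryption key from the passcode.
kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32, salt=salt,
                 iterations=600_000)
key = base64.urlsafe_b64encode(kdf.derive(passcode))

token = Fernet(key).encrypt(b"texts, photos, contacts ...")

# Without the passcode, neither the manufacturer nor anyone else can
# re-derive `key`, so `token` stays opaque even under a search warrant.
assert Fernet(key).decrypt(token) == b"texts, photos, contacts ..."
```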

The consequence is an intelligence gap that FBI Director James Comey calls “Going Dark.” As Comey put it in an October 2014 speech, “We have the legal authority to intercept and access communications and information pursuant to court order, but we often lack the technical ability to do so.” But even if it’s true that the law hasn’t kept up with technology (policy lag is an inevitable problem in many areas, after all), we’d be mistaken to think that encryption is a 21st-century problem that emerged only because people started buying smartphones.

In fact, data encryption has been a contentious topic since the 1990s, when computers became fast enough to make routine encryption possible, triggering the “Crypto Wars” between the federal government and the growing Internet industry. The government sought to limit the public’s and foreign nations’ access to cryptography strong enough to resist decryption by national intelligence agencies like the NSA, but those efforts ended in defeat for the feds and victory for today’s Internet economy. That’s why every major web browser and mobile operating system comes with built-in cryptography tools. That’s why we’re supposed to be assured that when we use the Internet, our activities are safe from eavesdroppers and hackers. And that’s why we should be able to trust that even as technology changes, our Fourth Amendment rights aren’t violated.

None of this is to say that policymakers should abandon legitimate personal and national safety concerns for the sake of the Fourth Amendment. What this really boils down to is passing pragmatic and appropriate policy. A report by the Berkman Center for Internet and Society at Harvard University points out that banning encryption protections won’t close intelligence gaps once and for all. In the age of Big Data, it’s unlikely that companies will uniformly adopt end-to-end encryption, because many of them rely on access to user data to inform business decisions. The rise of connected devices and the Internet economy also provides avenues for surveillance that don’t rely on removing encryption protections. Metadata, such as location data, call records, and email information, is not protected by end-to-end encryption, because it must remain readable for many smartphone functions, like routing a call or delivering a message, to work at all. And there’s no such thing as perfect end-to-end encryption: attackers can still break into the devices at either end to steal cryptographic keys or simply read messages after they’ve been decrypted.
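
To illustrate the metadata point, here is a hypothetical message envelope (the field names are invented, and Fernet merely stands in for whatever cipher an app actually uses): even when the body is encrypted so servers can’t read it, the routing fields that count as metadata stay in the clear, because the network needs them to deliver the message at all.

```python
# Hypothetical message envelope (illustrative field names; Fernet stands in
# for an app's real cipher). Requires: pip install cryptography
import time
from cryptography.fernet import Fernet

recipient_key = Fernet.generate_key()   # stand-in for the recipient's key material

envelope = {
    "from": "+1-555-0100",              # metadata: visible to the carrier
    "to": "+1-555-0199",                # metadata: needed to route the message
    "sent_at": int(time.time()),        # metadata: becomes a call/message record
    "body": Fernet(recipient_key).encrypt(b"the actual message"),  # opaque to servers
}

# A server (or an eavesdropper) still sees who talked to whom and when,
# even though the body itself is unreadable without the key.
print({k: ("<ciphertext>" if k == "body" else v) for k, v in envelope.items()})
```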

Cooper’s bill rightfully strives to improve safety, but in its current state, it could enable wrongful violations of privacy under the banner of security. And perhaps he’s aware that it needs tweaks; he told reporters, “I’m OK with sending that phone to Apple with a copy of the search warrant and Apple sending the information back to law enforcement so there is no privacy issue.”

The power of information security is truly make-or-break, but the reality is that legislators aren’t necessarily seeking a return to pre-2014 conditions, in which backdoors in a smartphone’s operating system also left it vulnerable to hackers. Several politicians have instead called on the tech industry to come up with a “front door” solution: a key that works only for a specific smartphone and can access only its contents. Many civil liberties activists and technologists are skeptical, arguing that such a scheme is essentially impossible and that strong encryption is simply here to stay. But considering that a free iPhone app already exists that successfully encrypts the contents of voice calls (even as the records of who called whom remain unencrypted metadata, as noted above), it’s safe to say that nothing is impossible when it comes to technological innovation. We still have plenty to learn about balancing safety with technological innovation, and that’s unlikely to change any time soon.
