
Friday, October 13, 2017

Deputy Attorney General Rosenstein’s “Responsible Encryption” Demand is Bad and He Should Feel Bad

Deputy Attorney General Rod Rosenstein delivered a speech on Tuesday about what he calls “responsible encryption.” It misses the mark, by far.

Rosenstein starts with a fallacy, attempting to convince you that encryption is unprecedented:


Our society has never had a system where evidence of criminal wrongdoing was totally impervious to detection, especially when officers obtain a court-authorized warrant. But that is the world that technology companies are creating.

In fact, we’ve always had (and will always have) a perfectly reliable system whereby criminals can hide their communications with strong security: in-person conversations. Moreover, Rosenstein’s history lesson forgets that, for about 70 years, there was an unpickable lock. In the 1770s, engineer Joseph Bramah created a lock that remained unpickable until 1851. Installed in a safe, it let the owner ensure that no one could get inside, or at least not without destroying the contents in the process.


Billions of instant messages are sent and received each day using mainstream apps employing default end-to-end encryption. The app creators do something that the law does not allow telephone carriers to do: they exempt themselves from complying with court orders.

Here, Rosenstein ignores the fact that Congress exempted those app creators (“electronic messaging services”) from the Communications Assistance for Law Enforcement Act (CALEA). Moreover, CALEA does not require telephone carriers to decrypt communications where users hold the keys. Instead, Section 1002(b)(3) of CALEA provides:


(3) Encryption. A telecommunications carrier shall not be responsible for decrypting, or ensuring the government’s ability to decrypt, any communication encrypted by a subscriber or customer, unless the encryption was provided by the carrier and the carrier possesses the information necessary to decrypt the communication.

By definition, when the customer sends end-to-end encrypted messages—in any kind of reasonably secure implementation—the carrier does not (and should not) possess the information necessary to decrypt them.
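To make the point concrete, here is a minimal sketch of end-to-end encryption in Python using the PyNaCl library (an illustration of the general idea, not any particular app’s actual protocol). The private keys are generated on the users’ devices and never shared with the carrier, so the carrier relaying the ciphertext holds nothing that can decrypt it:

```python
from nacl.public import PrivateKey, Box

# Each user generates a key pair on their own device; the private
# halves never leave those devices.
alice = PrivateKey.generate()
bob = PrivateKey.generate()

# Alice encrypts to Bob's public key. Only Bob's private key
# can decrypt the result.
ciphertext = Box(alice, bob.public_key).encrypt(b"meet at noon")

# The carrier relays `ciphertext`, but it possesses neither private
# key: no "information necessary to decrypt the communication."
plaintext = Box(bob, alice.public_key).decrypt(ciphertext)
assert plaintext == b"meet at noon"
```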

With his faulty premises in place, Rosenstein makes his pitch, coining yet another glib phrase to describe a backdoor.


Responsible encryption is achievable. Responsible encryption can involve effective, secure encryption that allows access only with judicial authorization. Such encryption already exists. Examples include the central management of security keys and operating system updates; the scanning of content, like your e-mails, for advertising purposes; the simulcast of messages to multiple destinations at once; and key recovery when a user forgets the password to decrypt a laptop.

As an initial matter, “the scanning of content, like your e-mails, for advertising purposes” is not an example of encryption, “responsible” or otherwise. Rosenstein’s other examples are just describing systems where the government or another third party holds the keys. This is known as “key escrow,” and, as well explained in the Keys Under Doormats paper, the security and policy problems with key escrow are not only unsolved, but unsolvable.
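By way of contrast, here is a crude sketch of what key escrow amounts to, written as hypothetical Python with PyNaCl (our illustration, not any specific proposal). The provider retains a copy of every user’s private key, so that one database becomes a single point of failure for everyone’s messages:

```python
from nacl.public import PrivateKey, Box

escrow_db = {}  # one high-value target holding every user's private key

def enroll(user):
    key = PrivateKey.generate()
    escrow_db[user] = key  # the "escrow": the provider keeps a copy
    return key

alice, bob = enroll("alice"), enroll("bob")
ciphertext = Box(alice, bob.public_key).encrypt(b"meet at noon")

# Anyone who can read escrow_db (a court order, an insider, or an
# attacker who breaches the provider) can decrypt the traffic:
print(Box(escrow_db["bob"], alice.public_key).decrypt(ciphertext))
```

Whether the database is read under judicial authorization or after a breach, the capability is identical, which is why the Doormats authors treat escrow’s problems as structural rather than fixable.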

Perhaps sensitive to the criticisms of the government’s relentless attempts to rename backdoors, Rosenstein claims: “No one calls any of those functions a ‘back door.’ In fact, those capabilities are marketed and sought out by many users.” In reality, critics of backdoors have fairly consistently called key escrow solutions “backdoors.” And any reasonable reader would call Google’s ability to access your email a backdoor, especially when that backdoor is used by unauthorized parties such as Chinese hackers.


Such a proposal would not require every company to implement the same type of solution. The government need not require the use of a particular chip or algorithm, or require any particular key management technique or escrow. The law need not mandate any particular means in order to achieve the crucial end: when a court issues a search warrant or wiretap order to collect evidence of crime, the provider should be able to help.

This is the new DOJ dodge. In the past, whenever the government tried to specify a “secure” backdoored encryption scheme, researchers found security holes. The Clipper Chip, most famously, was broken quickly and thoroughly.

So now, the government refuses to propose any specific technical solution, choosing to skate around the issue by simply asking technologists to “nerd harder” until the magical dream of secure golden keys is achieved.

Rosenstein attempts to soften his demand with an example of a company holding private keys.


A major hardware provider, for example, reportedly maintains private keys that it can use to sign software updates for each of its devices. That would present a huge potential security problem, if those keys were to leak. But they do not leak, because the company knows how to protect what is important.

This is a fallacy for several reasons. First, perfect security is an unsolved problem. No one, not even the NSA, knows how to protect information with zero chance of leaks. Second, the security challenge of protecting a signing key that is used only to sign software updates is much smaller than the challenge of protecting a system that needs on-demand access to communications keys, at the push of a button, for millions of users around the globe.
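The difference is easy to see in code. Here is a rough sketch using PyNaCl’s Ed25519 signatures (our illustration, not the vendor’s actual scheme): the signing key can sit offline and be touched only when an update ships, while devices carry only the public verify key, which decrypts nothing:

```python
from nacl.signing import SigningKey

# Vendor side: the signing key can live offline (e.g., in an HSM)
# and is used only on the rare occasions an update ships.
signing_key = SigningKey.generate()
signed_update = signing_key.sign(b"firmware v2.1 image")

# Device side: ships with only the public verify key. Compromising
# a device, or even this key, reveals nothing that can decrypt
# anyone's communications.
verify_key = signing_key.verify_key
verify_key.verify(signed_update)  # raises BadSignatureError if tampered
```

A warrant-serving decryption capability, by contrast, would have to stay online and be exercised continuously at scale, which is exactly the exposure the offline signing-key setup avoids.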

Rosenstein then attempts to raise the stakes to near apocalyptic levels:


If companies are permitted to create law-free zones for their customers, citizens should understand the consequences. When police cannot access evidence, crime cannot be solved. Criminals cannot be stopped and punished.

This is a bit much. For a long time, people have had communications that were not constantly available for later government access. For example, when pay phones were ubiquitous, criminals used them anonymously, without a recording of every call. Yet, crime solving did not stop. In any case, law enforcement has been entirely unable to provide solid examples of encryption foiling even a handful of actual criminal prosecutions.

Finally, in his conclusion, Rosenstein misstates the law and misunderstands the Constitution.


Allow me to conclude with this thought: There is no constitutional right to sell warrant-proof encryption. If our society chooses to let businesses sell technologies that shield evidence even from court orders, it should be a fully-informed decision.

This is simply incorrect. Code is speech, and courts have recognized a Constitutional right to distribute encryption code. As the Ninth Circuit Court of Appeals noted:


The availability and use of secure encryption may … reclaim some portion of the privacy we have lost. Gov’t efforts to control encryption thus may well implicate not only the First Amendment rights … but also the constitutional rights of each of us as potential recipients of encryption’s bounty.

Here, Rosenstein focuses on a “right to sell,” so perhaps the DOJ means to distinguish “selling” under the commercial speech doctrine, and argue that First Amendment protections are therefore lower. That would be quite a stretch, as commercial speech is generally understood as speech proposing a commercial transaction. Newspapers, for example, do not face weaker First Amendment protections simply because they sell their newspapers.

The Department of Justice has said that it wants to have an “adult conversation” about encryption. This is not it. The DOJ needs to understand that secure end-to-end encryption is a responsible security measure that helps protect people.

Source: Deputy Attorney General Rosenstein’s “Responsible Encryption” Demand is Bad and He Should Feel Bad | Electronic Frontier Foundation
