What app developers need to know about the DOJ’s formal request for encryption backdoors

When we text via Apple’s iMessage, WhatsApp, Signal, or a host of other messaging services, those messages are protected in ways communications across most other platforms are not. The reason is that these app developers use “end-to-end encryption” (“E2EE”): every message is encrypted before it leaves the sender’s device and can be decrypted only by the recipient’s device. The only way to access and view these messages is on the sender’s or recipient’s unlocked phone with the app open. Importantly, neither the app developers themselves nor the Internet service providers (AT&T, Verizon, etc.) can see the encrypted communications, even if those communications are stored on their servers. What’s more, the government cannot see those messages either, even when it gets a warrant.

The U.S. Department of Justice (“DOJ”), and several foreign governments, want to change this latter proposition. 

On October 11, 2020, U.S. Attorney General William Barr and officials from Canada, the United Kingdom, Australia, New Zealand, India, and Japan issued a joint statement calling on technology companies to “[e]nable law enforcement access to content” when law enforcement has obtained lawfully issued authorization to access that information. In other words, these officials are calling on app developers to provide law enforcement with backdoors to encrypted apps.

This joint statement raises several challenging questions. For instance, what does this mean for end-to-end encryption? How should platforms that already provide end-to-end encryption react? How should platforms thinking about providing end-to-end encryption prepare? We discuss these questions here.

Background of End-to-End Encryption and Government Policy

End-to-end encryption is a form of communication in which only the communicating users can read the messages. The technology prevents would-be eavesdroppers—telecommunications service providers, Internet providers, and the app platforms themselves—from seeing users’ communications or even accessing the cryptographic keys needed to decrypt the conversations.
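To make the mechanics concrete, the following is a minimal sketch of the core idea using the PyNaCl library (chosen here purely for illustration; real messaging apps such as Signal and WhatsApp layer far more elaborate protocols, including key ratcheting and forward secrecy, on top of primitives like these). The point is simply that anything stored or relayed by a server is opaque ciphertext to anyone who does not hold the recipient’s private key.

```python
# Minimal sketch of end-to-end encryption using PyNaCl (libsodium bindings).
# Illustration only: production messengers add authentication, key ratcheting,
# and forward secrecy (e.g., the Signal Protocol) on top of primitives like these.
from nacl.public import PrivateKey, Box

# Each user generates a key pair on their own device; the private key never leaves it.
alice_private = PrivateKey.generate()
bob_private = PrivateKey.generate()

# Only the public halves are shared (typically via the app's key directory).
alice_public = alice_private.public_key
bob_public = bob_private.public_key

# Alice encrypts on her device with her private key and Bob's public key.
sender_box = Box(alice_private, bob_public)
ciphertext = sender_box.encrypt(b"Meet at noon")

# In transit, and on any carrier or app-provider server, the message exists
# only as this opaque ciphertext. Only Bob's device, which holds his private
# key, can decrypt it.
receiver_box = Box(bob_private, alice_public)
plaintext = receiver_box.decrypt(ciphertext)
assert plaintext == b"Meet at noon"
```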

Proponents of E2EE laud its ability to enhance privacy and to protect users’ data from theft while in transit. Moreover, E2EE protects more than just text messages. These tools protect banking, credit, and retail sales data, and are used by businesses and consumers in the transportation, manufacturing, logistics, healthcare, and defense industries, among many others. Indeed, as the signatories themselves recognize, encryption in general plays a vital role in “protecting personal data, privacy, intellectual property, trade secrets and cyber security,” and also serves to protect “journalists, human rights defenders, and other vulnerable people” in repressive countries. Similarly, the United Nations Human Rights Council recently noted the significant role privacy plays in free societies.

E2EE proponents likewise see real problems with law-enforcement backdoors: hackers and adversarial foreign governments seeking to decrypt user data could conceivably exploit those very backdoors to steal the data E2EE was designed to protect.

On the other hand, the signatories to the joint statement and other opponents of E2EE see the technology as a threat to countries’ ability to investigate crime and maintain law and order.

Without a doubt, one function of government is to protect its citizens by prohibiting them from possessing and using certain tools and technologies. Most people agree that governments around the world should prohibit individual citizens from possessing or using things like atomic weapons and anthrax. Why? Because the safety of everyone in the vicinity of dangerous weapons and hazardous materials far outweighs the government intrusion.

The question, then, is whether end-to-end encryption poses so significant a threat to the safety and welfare of individuals that those costs outweigh the benefits of the technology’s use. The signatories to the joint statement and other opponents of end-to-end encryption see it as imposing a substantial public-safety cost, because such encryption allegedly poses a significant obstacle to law enforcement officers doing their jobs.

In particular, opponents of E2EE note how criminals involved in terrorism and child exploitation misuse the technology to avoid detection of their crimes by law enforcement. And, though not noted in the joint statement, it is certainly relevant to the DOJ and other governments that members of large-scale narcotics-trafficking and money-laundering organizations also use these encryption tools to keep their conspiratorial conversations private and hidden from the eyes of law enforcement.

With these concerns in mind, the joint statement calls on app developers to take the following steps:

  • Embed the safety of the public in system designs, thereby enabling companies to act against illegal content and activity effectively with no reduction to safety, and facilitating the investigation and prosecution of offences and safeguarding the vulnerable;
  • Enable law enforcement access to content in a readable and usable format where an authorization is lawfully issued, is necessary and proportionate, and is subject to strong safeguards and oversight; and
  • Engage in consultation with governments and other stakeholders to facilitate legal access in a way that is substantive and genuinely influences design decisions.

The second bullet is where the action is: the signatories—including Attorney General Barr—want app developers to create law enforcement backdoors in the software that would allow officers to read and intercept otherwise encrypted communications when authorized to do so.
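The joint statement does not prescribe any particular mechanism for such access. One approach frequently discussed in this debate is key escrow: in addition to encrypting each message for the recipient, the app would also encrypt the message key to a key held by, or recoverable by, the provider or a government authority. The sketch below (again using PyNaCl purely for illustration, with hypothetical names) shows why E2EE proponents object: the escrow key becomes a second decryption path that, if leaked or compromised, exposes every message it covers.

```python
# Hypothetical sketch of a key-escrow style "lawful access" design.
# Illustration only: the joint statement does not mandate this or any specific mechanism.
from nacl.public import PrivateKey, SealedBox
from nacl.secret import SecretBox
from nacl.utils import random

recipient_private = PrivateKey.generate()
escrow_private = PrivateKey.generate()   # held by the provider or a government authority

# A fresh symmetric key encrypts the actual message content.
message_key = random(SecretBox.KEY_SIZE)
ciphertext = SecretBox(message_key).encrypt(b"Meet at noon")

# The message key is wrapped twice: once for the recipient, once for the escrow key.
wrapped_for_recipient = SealedBox(recipient_private.public_key).encrypt(message_key)
wrapped_for_escrow = SealedBox(escrow_private.public_key).encrypt(message_key)

# The recipient decrypts normally...
recipient_key = SealedBox(recipient_private).decrypt(wrapped_for_recipient)
assert SecretBox(recipient_key).decrypt(ciphertext) == b"Meet at noon"

# ...but whoever holds the escrow private key can read the same message,
# which is the additional attack surface E2EE proponents warn about.
escrow_key = SealedBox(escrow_private).decrypt(wrapped_for_escrow)
assert SecretBox(escrow_key).decrypt(ciphertext) == b"Meet at noon"
```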

The Attorney General does not stand alone in the United States in seeking a law-enforcement backdoor. In June 2020, Senators Lindsey Graham, Tom Cotton, and Marsha Blackburn introduced the Lawful Access to Encrypted Data Act. The bill would require app developers and other tech companies to assist law enforcement in executing warrants that seek encrypted data. Importantly, the bill would authorize judges to order tech companies to assist in accessing information sought via warrant, including by decrypting or decoding data. The bill also would require certain companies to ensure they have the technical capabilities to provide this assistance. Notably, the Attorney General issued a statement “applaud[ing]” this bill.

Whether to require app developers offering E2EE to users to create law-enforcement backdoors is a hot policy debate that likely will rage for the foreseeable future. Encryption technology will only improve—along with hackers’ abilities to defeat encryption technology and steal data. Meanwhile, more and more criminals will continue to exploit the privacy these sophisticated encryption tools provide.

Considerations for App Developers with E2EE and E2EE Ambitions

E2EE creates risks for app developers. Consumers—most of whom may not understand the nuances among the various methods and tools for encrypting data—clearly want strong encryption and take comfort in knowing hackers and spies will not be able to steal their personal information. That consumer demand places business pressure on app developers to provide E2EE and other data-security tools.

But app developers face litigation risks from all sides. Today, many consumers might assume all communications apps include some level of sophisticated data protection and encryption, whether E2EE or something similar. When user data is hacked via weak security in these apps, the app developers themselves become likely targets for suits alleging negligence or other causes of action arising from the data breaches. On the flip side, app developers whose apps include impenetrable security features like E2EE face suits when criminals (like sexual predators) exploit those security features to their advantage.

Without a doubt, then, app developers cannot sit idle. The following are some best practices app developers can consider:

  • Be Clear About Security Features. Many app developers advertise some of their security features prominently. But each app provider should be clear and concise about the encryption and security features it uses. App developers might even want to explain to consumers why they do or do not provide security features, like E2EE, to help consumers choose which services to use and avoid. Parents, for example, might choose for their minor children only to use communications apps that do not use E2EE, as child exploitation criminals might be less likely to approach children on those apps.
  • Include Waivers in Terms of Service. App usage is not a consumer’s right. App developers can craft their terms of service to limit liability for their choices of security features. Doing so can significantly limit app developers’ scope of potential exposure, which is especially important as the E2EE debate remains unresolved. 
  • Include Acceptable Use Policy in Terms of Service. The joint statement suggests that app developers and other tech companies have a duty to put in place terms of service that provide the authority to protect the public. For these reasons, app developers should ensure they include acceptable use policies (“AUPs”) in their terms of service. An AUP should describe all prohibited uses of an app and make clear that, in the event of any failure to observe the guidelines set forth in the AUP, the company reserves the right to take immediate corrective action, up to and including disabling the user’s account or terminating the user’s ability to use the app.
  • Know Your Customers. Currently, no regulations impose any Know-Your-Customer (“KYC”) obligations on app developers. Indeed, users often can activate accounts with pseudonyms and effectively remain anonymous. But app developers may want to consider some level of customer verification and impose some basic restrictions on account creation and app use. For example, app developers could prohibit anyone required to register with a state as a sexual offender from opening, controlling, or using an account.
  • Lobby, Lobby, Lobby. The debate between data privacy and public safety is taking place right now in Congress and state legislatures across the country. To the extent app developers want to continue to provide E2EE and, more generally, sophisticated data privacy security features, lobbying senators and representatives throughout Congress and statehouses could prove effective. 

Conclusion

E2EE is, by itself, neither good nor bad; it all turns on how and why it is implemented. App developers are thus encouraged to carefully weigh the above considerations and to engage experienced counsel to help avoid the obvious—and not-so-obvious—pitfalls.
