In a global pandemic, we are embracing the online world as our “new normal.” If we didn’t already, we now conduct our work, shopping, and social lives on the web more than ever. So, it comes as no surprise that privacy and its close friend encryption have entered the scene, albeit in a context no one saw coming.

We’ve talked a bit about encryption recently on our blog, but this installment focuses on encryption developments from the year leading up to the pandemic, when legislators and the Department of Justice (DOJ) alike began exploring encryption as it pertains to the proliferation of child sexual abuse material (CSAM) online.

Encrypted information passes through platforms every day as part of a variety of important activities, from database management to payments. But unless that information is decrypted, platforms cannot interpret what it contains; the very nature of end-to-end encryption ensures that only the sender and the recipient can see what is sent. Over the last few months, Attorney General William Barr has repeatedly called for law enforcement to have easier access to this information. As outlined in his keynote address at the International Conference on Cyber Security last July, he seems to believe that mandating a platform-held access key for law enforcement would not materially weaken encryption, and he likens such access to that granted by the Communications Assistance for Law Enforcement Act (CALEA) of 1994. There are a couple of things to question about this interpretation. First, the internet is not a telephone. The intent of CALEA was to enable law enforcement to more easily wiretap telephone lines, and while early widespread adoption of the internet did happen over dial-up, it’s safe to say the internet of today is a little more complex than a landline. Additionally, while phone conversations may be private and personal, the data that traverses the internet can pose even more serious risks: (1) there are far greater volumes of it, and (2) bad actors can find more ways to harm consumers with copies of it, including in unforeseen ways, than they could with a recorded phone conversation. Second, common sense says that if you build an access point into a product, even one intended only for the “good guys,” it is only a matter of time until bad actors find it. Known vulnerabilities invite bad actors.
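To make concrete why a relaying platform cannot interpret end-to-end encrypted traffic, here is a minimal sketch using a one-time pad. This is purely illustrative: real messaging apps use far more sophisticated protocols (such as the Signal protocol), and the function names and message below are hypothetical, not drawn from any real product.

```python
import secrets

def encrypt(message: bytes, key: bytes) -> bytes:
    """Sender side: XOR each byte of the message with a shared random key."""
    assert len(key) >= len(message), "one-time pad key must cover the message"
    return bytes(m ^ k for m, k in zip(message, key))

def decrypt(ciphertext: bytes, key: bytes) -> bytes:
    """Recipient side: XOR is its own inverse, so the same key recovers the text."""
    return bytes(c ^ k for c, k in zip(ciphertext, key))

# The sender and recipient share a key; the platform never holds it.
key = secrets.token_bytes(32)
ciphertext = encrypt(b"meet at noon", key)

# All the platform relays is ciphertext, which is indistinguishable from
# random bytes without the key. It has nothing meaningful to hand over.
platform_sees = ciphertext

# Only a party holding the key can recover the message.
assert decrypt(ciphertext, key) == b"meet at noon"
```

A platform-held access key, as proposed, would change the last step: the platform (and anyone who compromises it) would also hold material sufficient to recover the plaintext, which is exactly the access point the paragraph above warns about.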

AG Barr found a cleverer hook on December 10, 2019, when he gave a speech as part of the National Association of Attorneys General (NAAG) 2019 Capital Forum. The new regulatory vehicle was Section 230 of the Communications Decency Act (CDA) of 1996. For those unfamiliar with Section 230, the law provides immunity from liability for providers that publish the information of others online. In other words, under Section 230, platforms are not legally responsible for the content posted by their users. In this speech, AG Barr argued that this immunity goes too far, implying that platforms have no incentive to respond as they should to CSAM and similar material. Moreover, he reiterated that “technological innovations that purport to protect privacy at all costs—while impeding sworn law enforcers’ ability to go after violent criminals, child predators, human traffickers, and terrorists, even once the enforcers satisfied the rigorous privacy protections built into the Fourth Amendment—may not be worth the trade-off.” While AG Barr has a point, it makes one wonder at what cost he intends companies to decrypt, or forgo encrypting, user data.

Similar thinking seemed to percolate on the Hill, also on December 10, 2019, when the Senate Judiciary Committee convened a hearing titled “Encryption and Lawful Access: Evaluating Benefits and Risks to Public Safety and Privacy,” featuring witnesses from the New York County District Attorney’s Office, the University of Texas at Austin, Apple, and Facebook. Unfortunately, the hearing itself was not nearly as evaluative as its title indicated: most Senators arrived having already made up their minds about encryption, and all but one Senator in attendance indicated they would like to see a backdoor for law enforcement to use, specifically to curb CSAM online. Sen. Mike Lee of Utah stood alone in urging caution on creating backdoors, saying in his closing remarks,

“I want to be clear about the fact that I don’t regard this as an easy set of problems to solve. I think we do ourselves as the Senate, as an institution, and those we serve, a grave disservice when we allow this conversation to descend into a contest over who loves children and who acts with reckless disregard toward them… I worry a lot about some of the discussion we’ve had today because much of it has focused on this sort of reductio ad Hitlerum strategy that it attempts to demonize certain individuals. When in fact, we’re not dealing with something with the circumstances where we can easily identify one way that it protects children and one way that disregards them.”

Chairman Lindsey Graham of the Senate Judiciary Committee released a bill to address the CSAM issue using Section 230 as a legislative hook. The bill, the Eliminating Abusive and Rampant Neglect of Interactive Technologies Act, would put the onus back on platforms to “self-regulate” CSAM. As written, the legislation would establish a National Commission on Online Child Exploitation Prevention to develop a set of “best practices” to prevent online child exploitation. In effect, the proposed best practices would defer nearly all authority to DOJ, which, as we have seen, is keen on jailbreaking encryption. In other words, if DOJ has the final say under this bill (and it does), the best practices will make it impossible for a platform to encrypt data without maintaining a key for law enforcement. Here’s the kicker: the bill would amend Section 230 so that unless platforms adopt these best practices, they won’t get Section 230 protection against civil suits involving CSAM (Section 230 provides no criminal liability shield for CSAM). Instead, platforms would be liable for the content users post. Fittingly, the bill’s acronym spells “EARN IT,” as it would require platforms to “earn” their Section 230 protections. In summary, the bill uses Section 230 as a backdoor to mandating backdoors.

The Senate Judiciary Committee held a hearing on the bill on March 11, 2020, pitting witnesses representing tech companies against child protection advocates on the very dark subject of preventing CSAM. To make matters worse, Senators Graham and Blumenthal repeatedly defended EARN IT as having nothing to do with encryption, even though it was clear that the greatest hurdle for law enforcement in tracking the spread of CSAM online is encrypted communications. This was a set-up from which no winners could emerge: defending the current privacy policies of companies was tantamount to being part of the proliferation of CSAM, while advocating for legislation like EARN IT would compromise the secure internet as we know it.

As Senator Mike Lee alluded to in the December 10 Senate Judiciary Committee hearing, we need to be deliberate regardless of the type of crime under investigation. It isn’t immediately clear how law enforcement agencies and tech companies can prevent the distribution of CSAM while balancing users’ expectations of privacy. But encryption is a function of mathematics, and math does not admit compromise the way a policy solution can: an encryption scheme either keeps ciphertext unreadable to everyone but the key holders, or it doesn’t.