A curious thing happened last week. A staggering number of the 7,800 encrypted devices the Federal Bureau of Investigation (FBI or Bureau) has deemed inaccessible are accessible after all. It turns out the number of inaccessible devices is actually somewhere between 1,000 and 2,000 – a significant number, but a fraction of what the FBI claimed. The inability of investigators to access encrypted data is commonly referred to as “going dark.” It is important to clarify that encryption actually prevents crime by rendering sensitive data unreadable, and therefore unattractive and unavailable to cyber criminals. The “going dark” problem to which the Bureau refers is its inability to access encrypted data after a crime has occurred.

The number of devices made inaccessible by encryption is an important data point for the FBI as it tries to convince policymakers to require tech companies to provide encryption keys to law enforcers. The inflated number was likely used to quantify the volume of information made unavailable to investigators as a result of encryption – it served as a proxy for the scope of the challenges posed by “going dark.” But now that recent reports have shrunk that number to a quarter or less of its projected size—due to a “programming error”—it is high time to take stock and reevaluate the debate.

The Washington Post broke the story on May 22, reporting that the FBI said it accidentally inflated the number of encrypted devices it could not access. This revelation comes on the heels of the FBI’s own Inspector General report which found that FBI staff ignored available methods to access encrypted content on an iPhone, presumably to press its legal case to compel Apple to decrypt the data.

Thus far, it has been a bad year for the FBI’s credibility. This type of puffery undermines the integrity of its own investigators’ work as well as the work of state and local law enforcers. We must be able to trust the FBI when it makes a claim. More importantly, these inflated claims do not help policymakers best evaluate the debate and available policy options. Committees in the House and Senate are responsible for making the policies that guide law enforcement’s authority to investigate crimes; when these committees receive false information from a trusted government agency, it undermines their ability to strike the right balance on behalf of the American people. And when one law enforcement branch is caught playing fast and loose with the facts, other law enforcers are painted with the brush of untrustworthiness. This is not fair to law enforcement or the criminal investigation system.

Investigators have a difficult and extremely important job. Their mission is to keep us safe and protect our property and basic freedoms. While encryption and other technical measures help prevent crimes from happening, law enforcers investigate crimes after they have occurred to ensure offenders are penalized. Put differently, while encryption may deny investigators access to some information, its use as a preventative tool also serves investigators’ goals. Law enforcement officials know better than anyone that we are all in an arms race with cyber criminals—they know the value of pushing the private and public sectors to build defenses and craft investigative strategies to stay one step ahead.

If solely considering investigative goals, there is a temptation to couch the encryption debate as “tech versus law enforcement” because encryption frustrates investigations. Those who see it in those terms might say law enforcement claimed a temporary moral high ground in the wake of the Cambridge Analytica and Facebook melee. Rumor had it that some, smelling blood in the water, were calling it the “Snowden of the private sector.” But whether you see it in such black and white terms or not, the debacle undoubtedly reinvigorated public concerns about how private firms handle data and underscored the importance of strong encryption. If the FBI is to be believed (and we know at least some of what it has been saying is untrue), it could convince policymakers to require the very tech companies accused of mishandling data to maintain a known vulnerability in their encryption for law enforcement to access or nefarious actors to exploit. It is telling that the Bureau is asking for the right to handle extremely sensitive encryption vulnerabilities while relying on data inflated several times over by a “programming error.” We can chuckle at the irony, but even an extremely technically proficient agency should not presume it can protect known vulnerabilities without issue.

The stakes are high: the security of our data hangs in the balance. When we make purchases online, communicate with friends and family, or store important work in the cloud, we want to trust that those communications, transactions, and data are secure. Requiring an encryption backdoor would erode trust in these services. The App Association has consistently emphasized the importance of encryption to consumer trust over the years, most recently in an amicus brief in the San Bernardino case and in filings to foreign governments that have proposed requirements for backdoor access to encrypted data. Encryption’s ability to prevent crime from reaching consumers is integral to the trust our member companies rely on every day. The notion of mandated backdoors is so contrary to cybersecurity that many app developers new to the debate simply scratch their heads.

An executive from one of our members offered an apt analogy: Imagine telling your five-year-old child that you’ve hidden candy somewhere in the house and then leaving them unattended for several hours. The child will find the candy. Another member likened an encryption backdoor or known vulnerability to Where’s Waldo? The page has four corners and four sides and everyone knows Waldo is somewhere on that page. Nobody questions if Waldo will be found—it’s a matter of when.

Mandating that companies both hold a known vulnerability and safeguard the key, when bad actors know where to find it, gives criminals a tremendous advantage. Cybercrime is a game of odds. In overly simple terms, the likelihood of finding a vulnerability multiplied by its market value roughly determines how much time and resources a cybercriminal is willing to pour into an attack. Their efforts become dramatically more worthwhile once they know the vulnerability exists.
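The odds described above can be sketched as a toy expected-value calculation. To be clear, the probabilities, dollar figure, and function name below are illustrative assumptions, not real attack data:

```python
# Toy model: an attacker's willingness to invest scales with
# P(find the vulnerability) * market value of exploiting it.
# All numbers here are hypothetical, for illustration only.

def expected_payoff(p_find: float, market_value: float) -> float:
    """Rough expected value of hunting for a given vulnerability."""
    return p_find * market_value

VALUE = 1_000_000.0  # assumed black-market value of a working exploit

# Searching for a flaw that may not even exist: long odds.
unknown_flaw = expected_payoff(p_find=0.01, market_value=VALUE)

# A mandated backdoor is a flaw the attacker KNOWS exists,
# so the odds of eventually finding it are far better.
mandated_backdoor = expected_payoff(p_find=0.5, market_value=VALUE)

print(f"unknown flaw:      ${unknown_flaw:,.0f}")
print(f"mandated backdoor: ${mandated_backdoor:,.0f}")
```

Under these made-up numbers, merely knowing the flaw exists multiplies the expected payoff fifty-fold, which is exactly the advantage the paragraph above describes.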

Not much has changed about this debate. However, the FBI’s emerging track record of exaggeration and evasiveness in its push to persuade policymakers to mandate encryption backdoors gives ample reason for skepticism. One thing is not exaggerated – if the Bureau gets its way, cybercriminals will feel like kids left home alone, just waiting to find their candy.