Earlier this month, a legal clock ran out in Europe. On 3 April 2026, interpersonal communication services operating in the European Union lost the legal clarity that had, until then, allowed them to voluntarily detect, report, and remove child sexual abuse material (CSAM). The temporary derogation from the ePrivacy Directive, the legislation that gave services a clear legal basis to act, has expired. The measure was always intended as a stopgap, set to lapse in April 2026 on the assumption that a permanent framework, namely the proposed Child Sexual Abuse Regulation, would be agreed in time. That has not happened, owing to a lack of agreement between the European Parliament and the Council on the regulation's scope, and what comes next is uncharted territory.

In 2025, the Internet Watch Foundation recorded its worst year on record for online CSAM, including a 26,395 per cent increase in AI-generated abuse videos. The EU is now at risk of becoming the only jurisdiction in the world where it is illegal for online services to detect and report CSAM. Offenders are still out there. They adapt, they migrate, and they exploit gaps. Policymakers cannot afford to leave one open.

Why the CSA Regulation Has to Get This Right

The answer to this gap is, in principle, straightforward: a durable, permanent Child Sexual Abuse (CSA) Regulation that provides a clear legal basis for voluntary CSAM detection. ACT supports that goal. What we cannot support, and what the negotiations over the CSA Regulation must not produce, is a regulation that seeks to protect children by dismantling the encryption infrastructure that underpins the entire digital economy.

Restricting encryption can have devastating consequences. The United States learned this the hard way. The Salt Typhoon breach, in which a China-backed hacking group likely exploited lawful interception infrastructure built into U.S. telecommunications networks under CALEA (a U.S. law that requires carriers to build surveillance capabilities into their networks to comply with lawful requests for information), gave attackers essentially unlimited access to unencrypted sensitive communications data. A backdoor built for law enforcement allowed adversaries to walk right in, leaving only fully encrypted communications protected.

Fortunately, the European Data Protection Board has reached the same conclusion from a rights perspective, condemning requirements to disclose encryption keys as disproportionate measures that would weaken the protection of all communications and threaten the confidentiality of all exchanges. Sweden's own Armed Forces stated explicitly that access requirements for end-to-end encrypted communications cannot be fulfilled without introducing vulnerabilities and backdoors that third parties can exploit.

Startups Have the Most to Lose

For the startups and independent developers that make up the backbone of Europe’s digital economy, encryption is foundational to their products and services.

A small health app handling patient data, a legal tech startup processing privileged communications, a fintech company moving money across borders: all of these businesses are only viable because their customers trust that the data they share is protected. That trust is built on strong encryption. Mandate a backdoor, and you do not merely weaken the encryption; you undermine the entire trust relationship between SME products and their users.

The Right Framework Is Possible but Requires Precision

ACT is calling for action, and for precision. There are approaches to CSAM detection that work without requiring access to encrypted content.

A well-designed CSA Regulation would do several things at once:

    1. Restore the clear legal basis for voluntary detection that expired with the ePrivacy derogation, ensuring services can act without legal jeopardy.
    2. Explicitly exclude any requirement that breaks or circumvents end-to-end encryption.
    3. Focus obligations on the spaces where intervention is technically feasible without compromising security architecture.
    4. Build in meaningful oversight, proportionality requirements, and review mechanisms to prevent scope creep.

None of this requires choosing between child safety and digital security. It requires the EU to be specific, technically grounded, and honest about what different approaches actually do.

The Clock Is Running

Every day that passes without legal clarity is a day that reporting volumes may be lower than they should be, that investigations are delayed, and that children are at greater risk. ACT calls on the European Parliament, the Council, and the Commission to act on two tracks simultaneously.

First, the EU should introduce an interim measure to restore legal clarity for voluntary CSAM detection while the permanent regulation is finalised. The 2021 derogation model worked. A bridge instrument that maintains the status quo while negotiations proceed is the most direct way to prevent the immediate harm caused by the legal gap.

Second, the CSA Regulation negotiations must explicitly and unambiguously exclude any detection mechanism that would require breaking end-to-end encryption.

The EU has the opportunity to lead the world in demonstrating that child protection and digital security are not competing values, but complementary ones. A legal framework that gets this right would be a model for every jurisdiction grappling with the same challenge. ACT and its members are committed to contributing to a process that achieves both goals. We urge EU policymakers to act with the urgency and clarity the moment demands.