At the Federal Trade Commission’s (FTC) recent workshop, The Attention Economy: How Big Tech Firms Exploit Children and Hurt Families, panelists raised important questions about children’s online safety but offered few practical, evidence-based solutions. Instead, they promoted policies, such as the App Store Accountability Act and similar age verification measures, that misapply fundamental legal principles, raise significant privacy, security, and implementation concerns, and ultimately fail to deliver meaningful protections for children online.

First, panelist Melissa McKay’s assertion that app stores must obtain parental consent because “we don’t allow kids to enter into contracts anywhere else” misunderstands existing contract law and the role app stores play within the app ecosystem. As we have pointed out, minors routinely enter into contracts in daily life, from purchasing packs of gum at the store to navigating general audience websites. State laws protect them by making most of these contracts voidable, rather than prohibiting them from entering contracts entirely. Further, in advocating for app stores to obtain parental consent, panelists conflate two distinct contractual relationships. Independent app developers are not subsidiaries of the app stores. The fact that a user has an app store on their smartphone does not mean they have agreed to the terms of service of an app distributed through that store. To access content within an app, the minor must separately agree to that app’s terms of service, making a parental consent requirement for app stores both redundant and ineffective. Thus, forcing parents into contracts with app stores on behalf of minors does nothing to address the implied contracts between developers and minors that arise from minors’ use of those apps.

Moreover, while panelists pitch age verification as a panacea, in practice it introduces significant privacy and security risks while failing to meaningfully protect children from harmful content online. For example, document-based verification requires users to upload government-issued identification, which necessitates the disclosure of extra information, such as addresses and physical characteristics, and creates barriers for those who may lack valid identification for legitimate reasons. Although Representative Schlegel stated that age verification can be done in a manner that protects people’s privacy, her observation that privacy protections built into some voluntary verification measures have incrementally improved both overstates the efficacy of those protections and ignores the legal reality that their privacy benefits vanish under verification mandates. As even the California Consumer Privacy Agency staff has acknowledged, “there is currently no privacy-protective way to determine whether a consumer is a child.” Similarly, verification mandates undermine privacy and security protections by requiring the collection, retention, and association of personal information, data that would be unnecessary absent a legal requirement, simply to prove compliance. Even more fundamentally, requiring app stores to verify users’ ages does little to shield children from online harms. Children can still access content through shared devices, web browsers, or apps downloaded outside of traditional app stores. These shortcomings underscore that age verification, as currently proposed, is not a practical or effective method for safeguarding children online.

Finally, though panelists rightly noted that policymakers need to take action to improve children’s online privacy and safety, any effective response must be grounded in a clear understanding of the problem, thoughtful enforcement, and a comprehensive approach to privacy, rather than exercises in liability shunting, such as the App Store Accountability Act. To that end, policymakers should take three steps. First, policymakers should fund outreach campaigns to ensure guardians understand how to use the robust parental control features available now. Currently, the use of parental control features on smart devices ranges from 35 percent on video game consoles to 51 percent on tablets, suggesting outreach and education must be a priority for public officials. Second, policymakers should enforce existing laws targeting online harms and make targeted updates allowing independent organizations to facilitate verifiable parental consent (VPC) without imposing responsibility on the VPC provider for how it is used downstream. Finally, Congress should pass a comprehensive federal privacy law that protects all consumers. These policies would effectively advance children’s privacy and online safety without compromising digital rights or placing undue burdens on developers.

To protect children’s privacy and online safety, policymakers should focus on legislation that delivers achievable, practical protections. While the FTC and panelists at the workshop raised serious concerns, lasting, effective solutions will require moving beyond rhetoric to develop not just sound bites, but sound policy that protects children, supports innovation, and safeguards user privacy.