Amid proliferating efforts at both the federal and state levels to regulate children’s online safety, the U.S. Supreme Court (SCOTUS) recently issued its opinion in Free Speech Coalition v. Paxton. Addressing First Amendment concerns, the Court ruled in favor of upholding a Texas law that mandates age verification for online content deemed harmful to minors. To unpack the implications of the decision, the Cato Institute hosted a webinar titled What Would Online Age Verification Mean for Speech, Privacy, and Youth Online Safety? featuring:
- Graham Dufault, General Counsel at ACT | The App Association
- Ariel Fox Johnson, Senior Advisor for Data Privacy at Common Sense Media
- Thomas A. Berry, Director of the Robert A. Levy Center for Constitutional Studies at the Cato Institute
- Jennifer Huddleston, Senior Fellow in Technology Policy at the Cato Institute
Panelists explored the ruling’s significance, reviewed the current debate over children’s online safety bills, and discussed the analytical framework policymakers should use when evaluating kids’ online safety proposals.
The panel started with Thomas Berry offering a brief overview of Free Speech Coalition v. Paxton. Under the law in question, websites must verify users’ ages if at least one-third of their content is considered sexual material harmful to minors. The case focused on whether this mandate might violate the First Amendment by chilling adults’ access to constitutionally protected speech, particularly because verifying one’s age online requires surrendering anonymity. While prior precedent suggested that such laws should face strict scrutiny, the Court instead applied intermediate scrutiny, finding that although the law did burden adult access, it was justified by the state’s interest in protecting children from harmful content online. The Court arrived at this result by holding that content deemed “harmful to minors” is not protected speech as it relates to minors, even though it remains protected speech with respect to adults’ access to it. Using the intermediate scrutiny framework, the Court upheld the law, opening the door to enforcement of similar recently enacted laws in around two dozen other states.
Graham Dufault pointed out that some advocates for age verification mandates had already jumped on the Free Speech Coalition decision, arguing that it paves the way for blanket age verification mandates for access to any kind of speech on the internet. He highlighted the inaccuracy of that position, pointing out that SCOTUS’s decision was limited to facts involving content harmful to minors. Thus, Free Speech Coalition potentially creates a constitutional path only for age verification mandates as a prerequisite for access to unprotected, harmful-to-minors content—not for mandates impeding access to any kind of content. Graham urged policymakers to pay close attention to this nuance, as an age verification mandate imposed on all types of content—for example, on any app accessed through a major app store—is unlikely to draw the same favorable analysis from SCOTUS as the law at issue in the Free Speech Coalition case.
Graham also noted that First Amendment protections are not the only problem with mandating compliance with age verification processes on any content regardless of its effect on minors. He explained how the Children’s Online Privacy Protection Act (COPPA), which requires companies to obtain verifiable parental consent before collecting data from children under 13, already sets clear rules for kid-facing content and may run headlong into any mandate requiring app developers to maintain age verification systems. Many App Association members have built products designed to serve kids and families, and for these developers, COPPA is a well-known framework. Complying with the law is costly in terms of time, money, and liability risk, so developers generally enter into the children’s content space with eyes wide open as to these costs.
Most developers, however, set out to solve practical problems in domains entirely unrelated to children’s digital experiences. For example, SwineTech, an AgTech enterprise software company based in Cedar Rapids, Iowa, began by developing a wearable device to help farmers prevent piglet crushing. It evolved into PigFlow, a software program that manages farm operations and enables farmers to spend more time with their pigs.
For a company like SwineTech, COPPA is likely a foreign concept. And yet, proposed age verification mandates would unintentionally push companies like it into the complexities and costs of COPPA compliance, even though PigFlow neither directs content to nor collects data from minors. Under proposed measures like the App Store Accountability Act, app stores would have to verify users’ ages and pass that information to developers, giving companies like SwineTech “actual knowledge” that a user is under 13 and effectively subjecting them to COPPA’s provisions. But common sense suggests that minors are highly unlikely to use SwineTech’s app given its specialized agricultural content, and even if they did, its content is not dangerous or inappropriate. Moreover, developers would have to build the infrastructure necessary to receive age flags and restrict content accordingly. Graham warned that such mandates effectively turn developers into “COPPA compliance factories,” forcing them to divert time and resources to compliance efforts that do nothing to improve protections for kids while increasing developers’ legal exposure at the expense of their ability to innovate.
Graham also explained how requiring age verification at different levels of the internet stack, including websites, app stores, and devices, can impose broad, untailored mandates, leave significant compliance gaps, and undermine parental control. He noted that parents already have access to tools that support safer online experiences, such as features that allow developers to embed content filters into apps with messaging functions. However, when policymakers impose additional compliance requirements, they risk shifting control from parents to the government and forcing developers to prioritize legal compliance over product development. Moreover, mandating age checks at a single layer of the stack can leave persistent gaps and offer only an illusion of safety. As Ariel Fox Johnson pointed out, age verification at the app store level fails to address mobile websites or preloaded apps, while device-level checks can create problems for families who share devices.
The panelists then discussed how the Free Speech Coalition decision may influence future online safety legislation. Thomas explained that pornographic content is currently the only category of speech for which the Supreme Court has recognized a distinction between minors and adults. As a result, laws targeting other types of content, such as algorithmic design, face a tougher constitutional path. Although proponents of age verification laws often frame them as regulating product design rather than speech, Thomas noted that the Court has held that social media moderation constitutes protected speech under the First Amendment, making it difficult to separate platform design choices from expressive conduct. Because the Court has not recognized a lower standard for restricting access to general online content based on age, such laws, including California’s Age-Appropriate Design Code, remain vulnerable to First Amendment challenges.
Further, both the unresolved questions from Free Speech Coalition and the availability of alternative methods for protecting children call into question whether mandatory age verification is the most effective or privacy-protective approach. For example, Ariel pointed to signals like transaction data or social media patterns that could help companies infer a user’s age without requiring direct identity checks. Meanwhile, Graham emphasized that many App Association members already use age assurance tools based on those transaction patterns, following best practices such as data minimization and privacy-by-design principles, to safeguard their customers’ privacy and security. He suggested that empowering developers to adopt flexible, privacy-protective tools, rather than imposing rigid mandates, may be more effective at protecting children online without compromising innovation or privacy. As Graham also noted, Free Speech Coalition may not provide a reliable blueprint for future legislation, given that the Court made clear that its application of intermediate scrutiny does not extend to protected speech raising no age-specific risks to minors.
Protecting children’s online safety is a laudable goal that warrants thoughtful legislation to balance privacy, safety, and free speech. However, many of the current legislative proposals risk imposing overly burdensome restrictions on broad swaths of the internet. As Graham pointed out, sweeping mandates will effectively put the government—rather than parents—in control of children’s online experiences, causing developers to build tools for lawyers and compliance experts, rather than for parents. Moving forward, policymakers should carefully craft legislation to target legitimate harms and enable meaningful choices for families, instead of crushing the piglets to save the herd.