In May 2025, Senator Mike Lee (R-UT) and Representative John James (R-MI) introduced the App Store Accountability Act (ASAA), a bill that would require app stores to verify users’ ages and obtain parental consent for users under 18. Meta has bankrolled a wildly expensive lobbying campaign to enact ASAA and its state-level analogs, and instead of recoiling in horror at taking kid privacy advice from Meta, some lawmakers are credulously going along with it. And what state legislators and members of Congress may not fully appreciate is exactly how the measures would get Meta off the hook. Why is the social media platform so willing to shoulder the extra burden ASAA would impose on it (not to mention the $70 billion in compliance costs it would impose on the rest of the ecosystem)? The answer is simple: Meta faces a one-time potential fine under federal child privacy laws of tens of billions of dollars the moment it “actually knows” users under 13 are on its platforms—that is, unless this actual knowledge rests instead with, oh, say, the app stores. Enter the ASAA.

You might be wondering, doesn’t everyone know there are millions of kids under 13 using Meta platforms? Yes, but Meta doesn’t have “actual” knowledge of it from a legal standpoint. The ASAA measures would effectively neutralize this “actual” knowledge problem. To understand how this would work, look to the Children’s Online Privacy Protection Act (COPPA) and Meta’s history with the law. COPPA only applies if an online service operator has “actual knowledge” of a user’s under-13 status or if its service is directed to children. Since Meta intends for Instagram and Facebook to be unavailable to children under 13, Meta’s “actual knowledge” of a given child’s under-13 status is the trigger at issue. It is not dispositive that parents, kids, and Meta all constructively know that millions of under-13 kids are likely using Meta platforms restricted to ages 13 and up. Constructive knowledge is not the same as “actual” knowledge. And yet, it is getting harder for Meta to avoid that level of awareness as age assurance tools and techniques improve and policymakers ratchet up pressure on the platform to know children’s ages.

Meta faces a sticky situation as it looks for ways to purge the under-13 kids it technically doesn’t know are using its platforms. Most nettlesome is the fact that such a purge would require, or at least strongly evidence, “actual knowledge” of those users’ under-13 status. While Meta would be free of its underage problem going forward, the FTC would likely have ample evidence for the largest fine in COPPA history by orders of magnitude. Meanwhile, if ASAA were enacted, the app stores would be charged with doing the age verification homework underlying “actual knowledge.” In that scenario, Meta would simply be responding to a flag from the stores, removing users that the app store-provided flag identifies as under 13. The various ASAA bills even include a provision clarifying that developers are “not liable” under the bill’s age category verification requirement if they “relied in good faith on age category” information received from an app store. This provision underscores that Meta would simply be relying on a flag, enabling it to claim no actual knowledge of its own. With Meta’s “actual knowledge” wiped clean, it would be harder for the FTC to show the platform operator’s longstanding and growing liability under COPPA.

Under COPPA, companies can face civil penalties of up to $53,088 per violation for collecting personal information from kids under 13 in ways the law prohibits. The most common COPPA infraction is a failure to obtain verifiable parental consent (VPC) before collecting personal information about children. For example, in 2022, the FTC fined U.S.-based game development company Epic Games a record $275 million for completely ignoring its VPC and other obligations under COPPA. That year, Epic’s reported Daily Active Users peaked at 34.3 million. By comparison, Meta’s 2022 Daily Active Users peaked at 2.96 billion. While the two user bases are not directly comparable, a similar fine for children’s privacy violations, scaled to Meta’s user base, could exceed $23 billion. More concretely, the 2023 legal complaint brought by 33 Attorneys General stated that Meta had received more than 1.1 million reports of Instagram users under the age of 13 since 2019 but closed only a small percentage of those accounts. Other data sources corroborate this: an estimated 8 percent of 8- to 12-year-olds say they have used Facebook, 10 percent have used Instagram, and nearly one in five say they use social media every day, despite the fact that social media platforms’ policies generally disallow use by children under 13. If each of those 1.1 million reported under-13 accounts were treated as an individual violation of COPPA, the theoretical penalty could exceed $50 billion.
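For readers who want to check the arithmetic behind those two figures, here is a minimal back-of-envelope sketch using only the numbers cited above. It is illustrative, not a damages model: the DAU scaling is a rough order-of-magnitude comparison, and the variable names are ours.

```python
# Back-of-envelope arithmetic for the figures discussed above (illustrative only).

EPIC_FINE = 275_000_000              # 2022 FTC penalty against Epic Games (USD)
EPIC_DAU_2022 = 34_300_000           # Epic's reported peak Daily Active Users, 2022
META_DAU_2022 = 2_960_000_000        # Meta's reported peak Daily Active Users, 2022

# Scale the Epic fine to Meta's user base. The two bases are not directly
# comparable, so treat this as an order-of-magnitude comparison.
scaled_fine = EPIC_FINE * (META_DAU_2022 / EPIC_DAU_2022)
print(f"Epic fine scaled to Meta's 2022 DAU: ${scaled_fine / 1e9:.1f} billion")   # ~$23.7 billion

MAX_PENALTY_PER_VIOLATION = 53_088      # current COPPA civil penalty cap per violation (USD)
REPORTED_UNDER_13_ACCOUNTS = 1_100_000  # reports cited in the 2023 Attorneys General complaint

# Treat each reported under-13 account as a single violation at the statutory cap.
theoretical_penalty = MAX_PENALTY_PER_VIOLATION * REPORTED_UNDER_13_ACCOUNTS
print(f"Theoretical COPPA exposure: ${theoretical_penalty / 1e9:.1f} billion")    # ~$58.4 billion
```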

The idea of ASAA is to shift “actual knowledge” to the broader ecosystem so that Meta can avoid such a fine for its years-long, ongoing decision not to comply with COPPA. For small business app companies like the App Association’s members, a conservative estimate of the total cost of building the systems needed to receive app store age flags and demonstrate compliance with ASAA is $70 billion. The ongoing cost of maintaining those systems would push the figure higher still, but let’s focus on the initial cost for now. That $70 billion in compliance costs would also buy Meta a get-out-of-jail-free card on a potential FTC fine of $50 billion or more. To put it in stark terms, Meta is asking legislatures to transfer somewhere around $50 to $70 billion from all other developers over to Meta.

Moreover, if Meta were concerned about the welfare of children online, it would redirect the lobbying resources dedicated to offloading its COPPA liability and instead spend them on making its own platforms safer and preventing underage users from encountering inappropriate or illegal content. Instead, the company has a long track record of prioritizing engagement and growth over child safety, including promoting AI-powered chatbots that engage in explicit conversations with minors, exploiting youth psychology to drive use, and recruiting kids and tweens to its platforms.

Protecting children’s online privacy and safety is a laudable objective. Unfortunately, kids’ online safety debates have largely devolved into games of liability hot potato punctuated by outlandish demands from consumer groups. The ASAA measures are a prime example; they are a cleverly crafted legislative laundering of liability for kids’ data. Missing from the scene are honest efforts to make privacy and safety more accessible to parents and kids, and Congress should not miss the opportunity to notch one of these wins by updating COPPA’s VPC requirements. Therein lies the potential for the more accessible parental controls that some believe the ASAA bills would deliver. As policymakers consider kids’ online safety options, they should understand the motivations of those with pre-packaged solutions. Armed with this background, legislators should recognize that with ASAA, they are being asked to get Meta off the hook for $50 billion worth of fines for years-long kids’ privacy abuses while handing all other developers a bill for $70 billion. This is a raw deal for parents and kids everywhere, and policymakers must reject it.