Back in March, when we wrote about the State of Children’s Privacy, we flagged California’s AB-2273, the Age-Appropriate Design Code Act (AADC), as a key piece of legislation for developers to monitor. The AADC framework marks a significant departure from the sort of “collection standard” law (such as the Children’s Online Privacy Protection Act [COPPA]) typically associated with the kids’ privacy arena. True to recent form, the California legislature moved incredibly quickly to pass the bill, which Governor Gavin Newsom recently signed into law. Attention now turns to Congress and several states (including New York) that each have versions of legislation that directly or indirectly draw from the California AADC (which itself finds inspiration from the standards-based Age Appropriate Design Code in the United Kingdom).
Here is what you need to know about the new law (which goes into effect July 1, 2024) and its growing list of legislative companions in Congress and the states.
Coverage
The first place to look when assessing a business’s potential coverage under the California AADC is, somewhat counterintuitively, the California Privacy Rights Act (CPRA). While the AADC is not an amendment to the CPRA, it borrows several key terms from that law, including its coverage thresholds. So, if your business is “covered” under the CPRA and it provides online products, services, or features that are likely to be accessed by a child (defined as under 18 years old, a notably higher threshold than COPPA’s under-13 standard), then this new law covers it. What does it mean for a product, service, or feature to be likely to be accessed by a child? The law includes a six-factor checklist covering, among other things, whether the service is directed to children as defined by COPPA, whether a “significant” number of children access the service, and the child-directedness of its marketing materials and design elements.
Best Interests of the Child Standard
Earlier versions of the legislation required businesses to consider the best interests of children when designing, developing, and providing any online service, product, or feature likely to be accessed by children. That requirement was removed from the operative text before the bill became law; a recommendation that covered businesses consider the best interests of children survives only in the non-binding legislative findings, while the operative portions of the law instead set out specific required default data protection standards.
For example, covered businesses must configure all default privacy settings to offer a high level of protection unless they can demonstrate another setting would be in the best interest of the child, and they may not profile children by processing their data to predict behavior. Covered businesses must also not collect, sell, or share precise geolocation data unless it is strictly necessary for the operation of the service, only occurs during the operation of the service, and the collection is obviously communicated to the child. Covered businesses are also forbidden from collecting, selling, sharing, or retaining any personal information that is not necessary to provide an online service, product, or feature with which a child is actively and knowingly engaged, unless they can demonstrate that such activities are in the best interests of children.
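To make the defaults requirement concrete, here is a minimal sketch of how a developer might represent high-protection defaults for accounts that may belong to children. The field names and values are hypothetical illustrations of the obligations described above, not settings prescribed by the statute.

```typescript
// Hypothetical representation of high-protection defaults for child users.
// Nothing here is mandated wording from the AADC; it only mirrors the
// obligations summarized above (high-privacy defaults, no behavioral
// profiling, no precise geolocation unless strictly necessary and signaled).
interface ChildDefaultSettings {
  profileVisibility: "private" | "friends" | "public";
  behavioralProfiling: boolean;        // predicting behavior from the child's data
  preciseGeolocation: boolean;         // off unless strictly necessary for the service
  geolocationSignalShown: boolean;     // show an obvious signal whenever location is collected
  dataRetention: "session-only" | "minimal";
}

const highProtectionDefaults: ChildDefaultSettings = {
  profileVisibility: "private",
  behavioralProfiling: false,
  preciseGeolocation: false,
  geolocationSignalShown: true,        // only relevant if geolocation is ever enabled
  dataRetention: "minimal",
};
```

A business could depart from defaults like these only where it can demonstrate that a different setting is in the best interests of the child, per the standard described above.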
Data Impact Assessments
On top of applying the overarching best-interest standards to their various processing activities, covered businesses will also have to complete a data protection impact assessment (DPIA) for each online service, product, or feature likely to be accessed by children. Among other more specific requirements, the DPIA “shall identify the purpose of the online service, product, or feature, how it uses children’s personal information, and the risks of material detriment to children that arise from the data management practices of the business.” If any risk of “material detriment to children” arises from the data management practices identified in the DPIA, the business must “create a timed plan to mitigate or eliminate the risk.” Any DPIA must be made available to the Attorney General, within five business days, pursuant to a written request.
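For teams that want to track these assessments internally, the sketch below shows one hypothetical way to structure a DPIA record around the statutory elements quoted above. The field names are illustrative only; the law does not prescribe a format.

```typescript
// Hypothetical in-house record of a DPIA, keyed to the statutory elements:
// purpose, uses of children's personal information, risks of material
// detriment, and a timed mitigation plan where a risk is identified.
interface DataProtectionImpactAssessment {
  feature: string;                     // the online service, product, or feature assessed
  purpose: string;                     // why the feature exists
  childDataUses: string[];             // how it uses children's personal information
  risksOfMaterialDetriment: string[];  // risks arising from data management practices
  mitigationPlan?: {                   // required whenever a risk is identified
    steps: string[];
    deadline: string;                  // the "timed plan" to mitigate or eliminate the risk
  };
  lastReviewed: string;                // keep current; the DPIA must be producible to the
                                       // Attorney General within five business days of a
                                       // written request
}
```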
Age Gating
Perhaps most impactful to the experience of everyday internet users is the requirement that covered businesses estimate the ages of child users with a “reasonable level of certainty appropriate to the risks that arise from the data management practices” or else “apply the privacy and data protections afforded to children to all consumers.” So how will businesses accomplish this difficult task? For starters, developers wishing to determine the ages of their child users with a “reasonable level of certainty” may have to collect age information from all of their users, which would then allow them to separate those who are 18 and older from the rest. Moreover, the law’s legislative findings state that businesses should sort their users into buckets (0 to 5 years, 6 to 9 years, 10 to 12 years, 13 to 15 years, and 16 to 17 years), each of which carries separate treatment depending on the risk of a given product, service, or feature. Unless app developers decide to apply children’s data privacy protections to all users, they will need to use an age gate or age verification technology before allowing users to access content, which will entail a significant increase in data collection and security responsibilities for small developers.
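As a minimal sketch, sorting a known or estimated age into the buckets named in the legislative findings might look like the following. How a business estimates age in the first place (self-declaration, a verification service, or something else) is left open by the law, and the function name and fallback behavior here are assumptions.

```typescript
// Age bands taken from the AADC's legislative findings; "adult" falls outside
// the children's protections entirely.
type AgeBand = "0-5" | "6-9" | "10-12" | "13-15" | "16-17" | "adult";

function ageBand(age: number): AgeBand {
  if (age >= 18) return "adult";
  if (age >= 16) return "16-17";
  if (age >= 13) return "13-15";
  if (age >= 10) return "10-12";
  if (age >= 6) return "6-9";
  return "0-5";
}

// If no reliable estimate is available, the safer course under the law is to
// treat the user as a child and apply the children's protections by default.
```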
Enforcement
The California Age-Appropriate Design Code vests sole enforcement authority in the California Attorney General’s Office, unlike the CPRA, under which the California Privacy Protection Agency and the Attorney General share enforcement authority. The law holds violators liable for a civil penalty of not more than $2,500 per affected child for each negligent violation and not more than $7,500 per affected child for each intentional violation. Before initiating an action, the Attorney General must give businesses in “substantial compliance” with the law a 90-day period to cure the alleged violation; businesses not deemed substantially compliant (a term the law does not define) will not be able to cure alleged violations. The law also creates the California Children’s Data Protection Working Group, which must deliver a report identifying compliance best practices and standards by January 1, 2024, and every two years thereafter. Once again, the law takes effect July 1, 2024.
Next Steps
App developers should remain attentive to developments surrounding children’s privacy legislation. Many developers, even those that might not view themselves as catering to an audience primarily consisting of children, will have to reexamine their practices and features in light of the “likely to be accessed by a child” standard and the influence and reach of the California market.
More broadly, amid increased attention from both Democratic and Republican lawmakers to the practices of (mostly) large social media companies, the temptation to move beyond data collection standards and into the regulation of tech companies’ design practices is likely to remain for the foreseeable future. In fact, some early drafts of state-level Age-Appropriate Design Codes (such as New York’s) go even further than California, requiring that a designated government agency vet individual DPIAs before the company offers the given product, service, or feature to the public. And although much of the consternation over surveillance advertising-based business models that motivates these proposals stems from the practices of larger firms, many of the proposed bills do not scope the requirements on covered businesses by size, revenue, or market penetration. For example, the Kids Online Safety Act in Congress (unanimously approved by the Senate Commerce Committee earlier this year) introduces a statutory duty of care that would apply to any “commercial software application or electronic service that connects to the internet and that is used, or is reasonably likely to be used, by a minor” (defined as under 16 in this bill).
California often serves as a bellwether for consumer protection legislation, especially relating to the digital economy (see the CCPA/CPRA), so while the enactment of further sweeping bills at the state and federal level is unlikely for the remainder of this year, it is safe to assume that by early 2023 several more copycat efforts will emerge. As with state privacy efforts, it won’t take much for any one of them to gain traction.