California Age Appropriate Design Code Act (AB 2273): What You Need to Know

By Morgan Sullivan

Senior Content Marketing Manager II

September 23, 2022 · 9 min read


At a glance

  • Assembly Bill 2273, or the California Age Appropriate Design Code Act (CAADC), was signed into law on September 15, 2022.
  • This new privacy law focuses specifically on protecting children’s privacy, data, and wellbeing online—setting a new legal standard for businesses in California.
  • Though the bill doesn't go into effect until July 1, 2024, the CAADC is quite broad, so organizations that target their products or services to minors would be wise to start preparing for compliance now.
  • Keep reading to explore what’s required, what’s prohibited, and what businesses need to do to prepare for the California Age Appropriate Design Code Act.

The basics

Though social media platforms are a clear focus of this bill, CAADC applies to a wide range of products and services including online games, voice assistants, connected toys, and even digital learning tools for schools. The bill’s language is quite broad when it comes to scope, stating that it applies to any “online service, product, or feature likely to be accessed by children.”

The text defines children as consumers under 18 years of age.

At its core, the bill requires companies to prioritize children’s safety and privacy online above their own commercial interests. The bill also compels companies to consider and address how their product or service might impact a child’s development or physical and mental health. 

Though the language of the bill is broad (and, according to some critics, vague), the overall takeaway is clear: companies that target their products to children and teenagers need to be aware of what the CAADC requires and start taking steps toward compliance.

Requirements of the CAADC

The CAADC requires that businesses take certain proactive steps to ensure that children’s privacy and safety are protected when using their products. We explore the seven main requirements below.

Complete a data protection impact assessment (DPIA)

Companies that fall under the purview of AB 2273 must complete a data protection impact assessment (DPIA) before offering a new online service or product to minors.

AB 2273 defines a DPIA as:

“a systematic survey to assess and mitigate risks that arise from the data management practices of the business to children who are reasonably likely to access the online service, product, or feature at issue”

AB 2273, 1798.99.30(b)(2)

When conducting this DPIA, your organization must consider and document:

  • If your product or service will expose children to harmful content or allow them to be targeted by dangerous individuals
  • What risks your algorithms and/or targeted advertising present to children
  • Use of coercive design elements within your product or service, especially those intended to increase time spent on the platform, compel sharing of personal information, or encourage lower privacy controls
  • The extent to which sensitive personal information is being collected and processed

DPIAs must be maintained for as long as the product or service is accessible by children and must be reviewed at least biennially (every two years).
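The bill doesn’t prescribe a format for these assessments, but keeping each DPIA as a structured, reviewable record makes the recurring review (and any Attorney General request) much easier to handle. Below is a minimal sketch of what an internal DPIA record might look like in TypeScript; the field names, risk categories, and two-year review helper are our own illustrative assumptions, not language from the statute.

```typescript
// A hypothetical structure for tracking CAADC DPIAs internally.
// Field names and categories are illustrative, not drawn from the statute.
interface DpiaRiskFinding {
  description: string;      // e.g. "recommendation feed may surface harmful content"
  category: "harmful-content" | "targeted-ads" | "coercive-design" | "sensitive-data";
  mitigationPlan: string;   // how the risk will be mitigated or eliminated
  mitigationDeadline: Date; // the CAADC expects a timed plan
  resolved: boolean;
}

interface DpiaRecord {
  productOrFeature: string;
  completedOn: Date;
  lastReviewedOn: Date;
  findings: DpiaRiskFinding[];
}

// Flag DPIAs that are due for another review, assuming a two-year cycle.
function needsReview(dpia: DpiaRecord, today: Date = new Date()): boolean {
  const nextReview = new Date(dpia.lastReviewedOn);
  nextReview.setFullYear(nextReview.getFullYear() + 2);
  return today >= nextReview;
}
```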

Mitigate or eliminate risks identified in your DPIA

Once the DPIA is complete, your organization must create a clear, timed plan for mitigating or eliminating the risks identified in your data protection impact assessment, and this plan must be in place before your product or service is accessed by children.

Make your DPIAs available upon request

While companies are not required to proactively submit DPIAs, they must make them available within five business days of a written request from the California Attorney General (AG). 

The AG may also request a list of all your completed DPIAs, which must be made available within three days.

Any DPIA submitted to the AG will be protected as confidential and will not be made public.

Estimate the age of child users

The language in this statute is quite broad, stating that companies must:

"Estimate the age of child users with a reasonable level of certainty appropriate to the risks that arise from the data management practices of the business or apply the privacy and data protections afforded to children to all consumers.”

AB 2273, 1798.99.31(a)(5)

No further clarification is offered on what defines a “reasonable level of certainty” or how businesses should evaluate what might be “appropriate to the risks.”

The second half of the text, “or apply the privacy and data protections afforded to children to all consumers,” offers a bit more guidance for organizations looking to take a practical step towards compliance. 

If estimating ages to a reasonable level of certainty feels too sticky or amorphous, one potential path is to offer the highest level of privacy and data protection to all users of your product or service.
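To make that concrete, here’s a rough sketch of what a “treat everyone as a child” fallback might look like in TypeScript. The `AgeSignal` shape and the 0.9 confidence threshold are illustrative assumptions on our part, since the bill doesn’t define what a reasonable level of certainty looks like.

```typescript
// Hypothetical age signal produced by whatever estimation method a business
// uses (self-declaration, account data, an age-assurance vendor, etc.).
interface AgeSignal {
  estimatedAge: number;
  confidence: number; // 0 to 1; the meaning depends on your estimation method
}

type ProtectionLevel = "child" | "standard";

// If age can't be estimated with reasonable certainty, fall back to the
// stricter child-level protections; in other words, apply them to everyone.
function protectionLevelFor(signal: AgeSignal | null, minConfidence = 0.9): ProtectionLevel {
  if (signal === null || signal.confidence < minConfidence) {
    return "child";
  }
  return signal.estimatedAge < 18 ? "child" : "standard";
}
```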

Set high default privacy settings for minors

Unless your business can make a “compelling” case that low-privacy defaults serve children’s best interests, your product or service must be configured to the highest privacy settings by default.
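In practice, the highest privacy settings by default usually means the initial state of a new account, before the user touches anything. The sketch below shows one way to encode that default; the individual settings are common examples chosen for illustration, not a list taken from the bill.

```typescript
// Illustrative privacy settings for a new minor account.
// The individual flags are common examples, not CAADC text.
interface PrivacySettings {
  profileVisibility: "private" | "friends" | "public";
  personalizedAds: boolean;
  preciseLocationSharing: boolean;
  directMessagesFrom: "no-one" | "friends" | "everyone";
}

// Highest-privacy defaults, applied unless there is a documented,
// compelling reason a different default is in the child's best interest.
const CHILD_DEFAULTS: PrivacySettings = {
  profileVisibility: "private",
  personalizedAds: false,
  preciseLocationSharing: false,
  directMessagesFrom: "friends",
};

function createChildAccountSettings(): PrivacySettings {
  // Return a copy so later edits don't mutate the shared default object.
  return { ...CHILD_DEFAULTS };
}
```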

Prominently disclose privacy information

AB 2273’s language on privacy disclosures is pretty straightforward: 

“Provide any privacy information, terms of service, policies, and community standards concisely, prominently, and using clear language suited to the age of children likely to access that online service, product, or feature.”

AB 2273, 1798.99.31(a)(7)

Essentially, don’t use legalese, unnecessarily complex language, or confusing interface design to hide or otherwise muddle your privacy policies.

Clearly indicate when a child’s location or activity is being monitored 

Some products and services allow monitoring by a parent, guardian, or teacher (think educational services), or track location in order to fulfill a task (ridesharing, maps, etc.). If your product monitors a child’s activity or tracks their location, you must clearly indicate this to the child while it’s occurring.
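One practical way to meet this requirement is to tie an always-visible indicator to the same code path that starts and stops the tracking, so the notice can never drift out of sync with the actual monitoring. Here’s a hedged sketch, using made-up banner helpers as stand-ins for your app’s real UI:

```typescript
// Illustrative stubs standing in for your app's real UI for showing and
// hiding a persistent, child-visible indicator.
function showPersistentBanner(message: string): void {
  console.log(`[banner shown] ${message}`);
}

function hidePersistentBanner(): void {
  console.log("[banner hidden]");
}

let tracking = false;

// Start location tracking and surface the indicator in the same step,
// so monitoring is never active without the child being told.
function startLocationTracking(): void {
  if (tracking) return;
  tracking = true;
  showPersistentBanner("Your location is being shared while this trip is in progress.");
  // ...begin collecting location updates here...
}

function stopLocationTracking(): void {
  if (!tracking) return;
  tracking = false;
  hidePersistentBanner();
  // ...stop collecting location updates here...
}
```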

Outside of these requirements, there are also several things that the CAADC expressly prohibits. 

Prohibited activities under the CAADC

The CAADC provides clear boundaries in terms of what businesses aren’t allowed to do.

Though these mandates are fairly broad, one good rule of thumb when considering compliance for these requirements is to practice data minimization whenever possible. This means limiting the data you collect, as well as how long you store/retain it.

But data minimization isn’t the only thing businesses need to consider. Under the CAADC, businesses may not:

Use personal information in a way that harms a child’s health or well-being

Outside the obvious, the text here does have one interesting nuance, stating that businesses may not: 

“Use the personal information of any child in a way that the business knows, or has reason to know, is materially detrimental to the physical health, mental health, or well-being of a child.”

AB 2273, 1798.99.31(b)(1)

The line that stands out here is “or has reason to know.” This language could potentially be used to undercut an ignorance defense. A company may claim it didn’t know of the harm its product or service was causing, but if the AG can make the case that it ought to have known, the allegations may hold more weight.

However, this is purely speculative and enforcement precedent for this law is a long way off.

Profile minors by default

AB 2273 defines profiling as:

“any form of automated processing of personal information that uses personal information to evaluate certain aspects relating to a natural person”

AB 2273, 1798.99.30(b)(6)

This includes using personal information to profile or predict someone’s preferences, behavior, location, movements, interests, economic situation, or health. 

The CAADC explicitly prohibits companies from using automated profiling on children by default. Profiling is only permitted if the business can demonstrate it has appropriate safeguards in place to protect children, and one of the following is also true:

  • The profiling is necessary to provide the requested product, service, or feature
  • The business can demonstrate a “compelling” reason that the profiling is in children’s best interests

Collect, use, and share data without a legitimate reason

At its core, AB 2273 compels companies to take a data minimization approach, prohibiting:

  • Using personal information for any reason other than the stated reason
  • Collecting, sharing, selling, or retaining information unnecessarily
  • Collecting precise location by default, or without a prominent notification while location tracking is occurring
  • Using information collected to estimate age for any other purpose, or retaining that information longer than necessary

Use dark patterns

Similar to the latest round of CPRA rulemaking, CAADC explicitly prohibits the use of dark patterns, stating that businesses may not:

“use dark patterns to lead or encourage children to provide personal information beyond what is reasonably expected to provide that online service, product, or feature to forego privacy protections, or to take any action that [...] is materially detrimental to the child’s physical health, mental health, or well-being.”

AB 2273, 1798.99.31(b)(7)

For reference, the California Privacy Rights Act (CPRA) defines dark patterns as:

“A user interface designed or manipulated with the substantial effect of subverting or impairing user autonomy, decision making, or choice.”

California Privacy Rights Act, 1798.140(l)

With these requirements and restrictions in mind, there are a few key steps companies can take now to start working towards CAADC compliance.

5 Steps to Prepare for CAADC Compliance

Create a data map

As with any privacy law, completing a comprehensive data map is your first step towards compliance. Though the CAADC doesn’t explicitly require data mapping, it’s key to understanding where, when, and how you’re collecting data—you can’t manage what you can’t see. 

With your data map complete, you can start to effectively audit your data collection and processing activities against the requirements laid out by the CAADC.
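The format of your data map matters less than its completeness, but a structured inventory makes that audit far easier. Here’s a minimal sketch of what a single data map entry might capture; the fields are our own suggestions rather than anything the CAADC prescribes.

```typescript
// One row in a hypothetical data inventory. Field names are illustrative.
interface DataMapEntry {
  dataCategory: string;             // e.g. "precise geolocation", "email address"
  collectedFrom: string;            // e.g. "iOS app signup flow"
  purpose: string;                  // the stated reason for collection
  storedIn: string[];               // systems or vendors holding the data
  retentionDays: number;            // how long it is kept
  likelyAccessedByChildren: boolean;
}

// Surface the entries that deserve the closest look under the CAADC.
function childFacingEntries(map: DataMapEntry[]): DataMapEntry[] {
  return map.filter((entry) => entry.likelyAccessedByChildren);
}
```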

Complete a DPIA by July 1, 2024

Though you aren't required to submit a DPIA unless expressly requested by the California AG, you still must complete one by the enforcement deadline. The bill’s language is quite clear: 

“A business shall complete a Data Protection Impact Assessment on or before July 1, 2024, for any online service, product, or feature likely to be accessed by children offered to the public before July 1, 2024.”

For any product or service that will be discontinued before July 1, 2024, a DPIA is not required.

Audit your interfaces for dark patterns

One reason to focus on this piece of compliance early is that, aside from the moral implications of using coercive interface design on children, dark patterns are consumer-facing, so they’re very easy to audit.

In the Attorney General’s initial enforcement sweep for the CCPA, many of the notices of violation revolved around issues that were immediately visible on a business’s website. Dark patterns fall into that same category.

In fact, European privacy watchdog noyb has sent hundreds of dark pattern complaints to businesses—filing full complaints with various EU regulators if their concerns were not addressed within 30 days. 

Dark patterns are low-hanging fruit when it comes to enforcement, so auditing your website early and often will go a long way toward demonstrating compliance.

Ensure your product meets the necessary technical specifications 

Some of the CAADC’s requirements do require a technical lift, i.e., making changes within your product or service itself. The top three items that stand out in terms of technical changes are:

  • Defaulting to the highest level of privacy
  • Notifying users of location data collection
  • Estimating age to reasonable certainty

All of these will involve some level of engagement from your engineering team, so make sure to work these into your 2023 product roadmap.

Implement processes for data minimization and deletion

Data minimization is quickly becoming table stakes for privacy compliance—and the requirements within CAADC are no different. Many of the items on the ‘No’ list revolve around when, why, and for how long data is collected, processed, and stored. 

Data minimization can be a complex topic, as many companies have only a limited idea of how they collect data and where it’s stored. But one way to think about it is—if you don’t need to be collecting certain data, don’t do it. 

This is another area where a data map comes in handy, helping you identify whether the data you collect is necessary to provide your product or service or whether it’s superfluous.
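Once each data category in your map has a stated purpose and a retention window, deletion can be enforced mechanically by a scheduled job. Here’s a rough sketch of what that check might look like, with illustrative field names:

```typescript
// Hypothetical stored record carrying the metadata needed to enforce retention.
interface StoredRecord {
  id: string;
  dataCategory: string;  // e.g. "age-estimation signal"
  collectedAt: Date;
  retentionDays: number; // how long this category is needed for its stated purpose
}

// Return the IDs of records that have outlived their stated purpose and
// should be deleted (or anonymized) by a scheduled cleanup job.
function expiredRecordIds(records: StoredRecord[], now: Date = new Date()): string[] {
  const msPerDay = 24 * 60 * 60 * 1000;
  return records
    .filter((r) => now.getTime() - r.collectedAt.getTime() > r.retentionDays * msPerDay)
    .map((r) => r.id);
}
```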


About Transcend

Transcend is the company that makes it easy to encode privacy across your entire tech stack. Our mission is to make it simple for companies to give users control of their data.

Automate data subject request workflows with Privacy Requests, ensure nothing is tracked without user consent using Transcend Consent, or discover data silos and auto-generate reports with Data Mapping.

