Understanding the California Age-Appropriate Design Code Act (AB-2273)

With the California Age-Appropriate Design Code Act (the CA Kids Code), California continues to lead the U.S. in laws designed to protect children’s privacy and safety online.

The California Age-Appropriate Design Code Act (Assembly Bill 2273) passed the Senate unanimously on August 30, 2022, and was approved by the Governor of California, Gavin Newsom, on September 15, 2022. It will take effect on July 1, 2024.

California’s new online privacy and safety law for children under the age of 18 is modeled on the UK Age-Appropriate Design Code, which became enforceable on September 2, 2021.

As California is home to some of the world’s biggest technology and social media companies, which have already made changes under the UK code, the CA Kids Code is expected to have global influence.

Like the UK code, it is designed to ensure technology companies take a proactive, privacy-by-design and privacy-by-default approach to protecting children when creating or updating online services, products, or features that children are likely to access.

Unlike the U.S. Children’s Online Privacy Protection Act (COPPA), which protects children under the age of 13, the CA Kids Code is designed to protect all children under 18 in California.

The Need to Protect Children from Harm Online – Especially on Social Media

Technology companies use sophisticated methods to collect and analyze personal data, then use these insights to keep people engaged longer and influence their behavior.

These data-driven activities certainly help big tech companies generate greater profit, but privacy advocates campaigning against big tech’s over-reach have also found that some activities can cause harm.

For example, AI-driven recommendations can expose children to harmful content and advertising, nudge them toward risky behaviors, and leave them open to being contacted or located by predators.

Groundswell against social media companies’ data management practices

A damning report by international children’s digital rights advocacy organization 5Rights Foundation noted several privacy and safety risks for children using social media platforms, including:

  • 75% of the most popular types of social media have been shown to recommend children’s profiles to strangers via AI suggestions
  • One in three teenage girls’ body image issues were made worse by exposure to content on Instagram – and the company knew about it but did not act, according to leaked documents
  • 6% of US teen users who reported suicidal thoughts traced those thoughts directly to Instagram.

Similarly, an April 2022 survey of nearly 1,000 likely California voters by Accountable Tech and Data for Progress found widespread concern about children’s safety online:

  • 71% of likely California voters believe social media platforms are unsafe for children
  • 84% believe the internet is generally unsafe for children
  • 82% believe big technology companies must do more to protect children online.

Act Targets Online Services, Products, or Features Likely to be Accessed by Children

In its definitions of covered businesses, the California Age-Appropriate Design Code Act (AB-2273) extends beyond the reach of COPPA.

COPPA focuses on operators of online services that are directed to children under 13 or that have actual knowledge they are collecting personal information from children under 13.

California’s new Act requires businesses to “prioritize the privacy, safety, and wellbeing of children over commercial interests” when designing, developing, and providing online services, products, or features “likely to be accessed by children” under 18.

Under the Act, covered businesses include any provider of an online service, product, or feature that children can reasonably be expected to access because:

  • It is defined as directed to children by COPPA
  • Competent and reliable evidence of its audience age demographics determines it is routinely accessed by a significant number of children
  • Internal company research of its audience age demographics determines children represent a significant part of the audience
  • It is substantially similar to or the same as an existing online service, product, or feature that children routinely access
  • It displays ads marketed to children
  • It has design elements that appeal to children, such as images of cartoon characters or celebrities, games, and music.
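
The “likely to be accessed by children” test is effectively an any-of check: meeting a single criterion is enough to bring a service within scope. As a purely illustrative sketch (not legal advice), the TypeScript below models that logic; all type and field names are hypothetical.

    // Hypothetical indicators mirroring the Act's "likely to be accessed
    // by children" criteria; names are illustrative, not statutory.
    interface AudienceIndicators {
      directedToChildrenUnderCoppa: boolean;
      evidenceShowsChildrenRoutinelyAccess: boolean;
      internalResearchShowsSignificantChildAudience: boolean;
      similarToServiceChildrenRoutinelyAccess: boolean;
      displaysAdsMarketedToChildren: boolean;
      hasChildAppealingDesignElements: boolean;
    }

    // Any single indicator is enough to bring the service within scope.
    function likelyAccessedByChildren(i: AudienceIndicators): boolean {
      return Object.values(i).some(Boolean);
    }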

Exemptions

The Act’s definition of an online service, product, or feature excludes broadband internet access services, telecommunications services, and the delivery or use of a physical product.

Compliance Obligations Include Data Protection Impact Assessments

The Act ultimately aims to ensure businesses mitigate or eliminate privacy and safety risks for children at the design stage of online services, products, or features – before children can access them.

Data protection impact assessments are a key protective requirement that businesses must conduct and document for any new online service, product, or feature likely to be accessed by children before it is offered to the public.

These assessments must also be maintained as long as the online service, product, or feature is available.

In each data protection impact assessment, businesses must identify:

  • The purpose of the service, product, or feature
  • Whether it collects children’s personal information and how it uses this information
  • The risks of harm to children that the data management practices of the business could cause.

Under the Act, risks of harm include:

  • Contact by predators
  • Exposure or subjection to exploitation or other harmful conduct
  • Exposure to ads or content that could cause harm, such as promoting activities that are risky or prohibited for children to participate in (such as gambling or consuming alcohol)
  • Any design feature that aims to increase, sustain or extend the use of the online product, service, or feature by children, such as media autoplay features, notifications, or rewards for time spent
  • Any content that could negatively impact children’s wellbeing.
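
To make these requirements concrete, the following minimal sketch shows one way a team might record an assessment; the structure and names are assumptions for illustration, not a format the Act prescribes.

    // Hypothetical DPIA record; type names and enum values are illustrative.
    type RiskOfHarm =
      | "contact_by_predators"
      | "exploitation_or_harmful_conduct"
      | "harmful_ads_or_content"
      | "engagement_extending_design_feature"
      | "negative_wellbeing_impact";

    interface DataProtectionImpactAssessment {
      offeringName: string;                   // the online service, product, or feature
      purpose: string;                        // why it exists
      collectsChildrensPersonalInfo: boolean;
      personalInfoUses: string[];             // how collected information is used
      identifiedRisks: RiskOfHarm[];          // harms the business's data practices could cause
      completedBeforePublicRelease: boolean;  // required before offering it to the public
      lastReviewed: Date;                     // maintained while the offering is available
    }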

California Age-Appropriate Design Code Compliance Requirements

Along with obligations such as data protection impact assessments, all covered businesses that allow an online service, product, or feature to be used by children must meet the following requirements:

  • Estimate the age of child users with a reasonable level of certainty appropriate to the risks that arise from the business’s data management practices, or apply the privacy and data protections afforded to children to all users
  • Automatically configure all default privacy settings for children to the highest level of privacy available, unless the business can show a compelling reason that a different setting is in children’s best interests (a configuration sketch follows this list)
  • Prominently display privacy information, terms of service, policies, and community standards in clear, concise text suited to the identified age group(s) of children in the audience – and enforce those terms, policies, and community standards
  • Give children an obvious signal they are being monitored or tracked if the online service, product, or feature allows parents, guardians, or any other consumers to monitor children’s online activity or track their location
  • Help children exercise their privacy rights and report concerns with tools that are easy for them to find, access, and use; if applicable, make these tools available to children’s parents or guardians.
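
As a hedged illustration of the age-estimation and highest-privacy-default requirements above, the sketch below initializes a (suspected) child account to its most protective values. The individual settings are hypothetical examples, not settings the Act enumerates.

    // Hypothetical privacy settings for an online service.
    interface PrivacySettings {
      profileVisibility: "private" | "friends" | "public";
      allowMessagesFromStrangers: boolean;
      shareGeolocation: boolean;
      personalizedAds: boolean;
      autoplayMedia: boolean;
    }

    // Most protective values, applied by default for child users.
    function highestPrivacyDefaults(): PrivacySettings {
      return {
        profileVisibility: "private",
        allowMessagesFromStrangers: false,
        shareGeolocation: false,
        personalizedAds: false,
        autoplayMedia: false,
      };
    }

    // If age cannot be estimated with appropriate certainty, the Act's
    // alternative is to apply child-level protections to all users.
    function settingsForUser(estimatedAge: number | null): PrivacySettings {
      if (estimatedAge === null || estimatedAge < 18) {
        return highestPrivacyDefaults();
      }
      // Adults may start from less restrictive defaults (illustrative only).
      return { ...highestPrivacyDefaults(), profileVisibility: "friends", autoplayMedia: true };
    }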

Prohibited Activities Under the California Age-Appropriate Design Code Act

Covered businesses are prohibited from taking any of the following actions:

  1. Profiling a child by default – there are some exceptions to this, but only when the profiling can be proven to be in the best interests of children and/or is necessary to provide requested online services, products, or features with which the child is actively and knowingly engaged.
  2. Using the personal information of any child in a harmful way – this includes any way that the business knows, or has reason to know, is materially detrimental to a child’s physical health, mental health, or wellbeing.
  3. Unnecessarily collecting, selling, sharing, or storing any personal information from or about a child – ‘necessary information’ must be proven to be needed to provide the online service, product, or feature. (Note: there are several rules with further restrictions on the collection and use of types of personal information, such as geolocation data.)
  4. Using dark patterns to lead or encourage children to forgo privacy protections or take harmful action – this includes any action that the business knows, or has reason to know, is materially detrimental to the child’s physical health, mental health, or wellbeing.

New California Privacy Protection Agency enforcement powers

The California Privacy Protection Agency, established under the California Privacy Rights Act (CPRA), will gain extended enforcement powers to ensure compliance with the California Age-Appropriate Design Code Act.

Enforcement will be directed by the California Attorney General (AG), with the power to pursue injunctions and/or civil penalties against violating businesses.

Civil Penalties

When pursuing civil penalties for violations, the AG will consider whether the violation is negligent (failure to properly meet requirements of the Act) or intentional (conducting prohibited activities and/or deliberate non-compliance with requirements).

The penalties for each violation are:

  • Negligent violation – penalties up to $2,500 per affected child
  • Intentional violation – penalties up to $7,500 per affected child.
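
To illustrate the scale, a negligent violation affecting 10,000 children could draw penalties of up to $25 million ($2,500 × 10,000), and an intentional violation affecting the same audience up to $75 million ($7,500 × 10,000).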

Potential 90-day cure period to meet compliance

The current text of the California Age-Appropriate Design Code Act also allows the AG to offer a 90-day cure period for some businesses before pursuing civil penalties.

This cure period is only available if the AG determines a business is already in substantial compliance with the requirements of the Act (paragraphs 1-4 inclusive of subdivision (a) of Section 1798.99.31).

The California Children’s Data Protection Working Group

The Act also created a working group to advise government and businesses on best practices for prioritizing children’s best interests (privacy and safety) online.

This working group will consist of Californians with related expertise in two or more areas (for example, children’s data privacy and mental health) appointed by several government leaders and bodies, including the California Privacy Protection Agency.
