California Governor Signs Online Child Protection Bill into Law
September 20, 2022
Governor Gavin Newsom signed the California Age-Appropriate Design Code Act into law last week, making California the first state in the nation to adopt such sweeping online safety protections for children. The bipartisan legislation requires online platforms to default to privacy and safety settings that protect children’s mental and physical health. The new law, coauthored by Assemblymembers Buffy Wicks (D-15th District) and Jordan Cunningham (R-35th District), prohibits companies that provide online services and products in California from using a child’s personal information and forbids collecting, selling, or retaining a child’s geolocation, among other things.
“We’re taking aggressive action in California to protect the health and wellbeing of our kids,” Newsom said in an announcement on Thursday. “As a father of four, I’m familiar with the real issues our children are experiencing online, and I’m thankful to Assemblymembers Wicks and Cunningham and the tech industry for pushing these protections and putting the wellbeing of our kids first.” The bill, AB 2273, reached his desk last month, and the new law is set to take effect in 2024.
The California Age-Appropriate Design Code Act “will require makers of social media apps like Facebook, Instagram and TikTok to study products and features that are likely to be accessed by minors and mitigate any potential harm before releasing them to the public,” reports The Wall Street Journal.
It will “require companies to craft their privacy policies in language that kids can understand and prohibit the profiling of children and the use of tools that encourage kids to share personal information.”
It also requires businesses with an online presence to complete a Data Protection Impact Assessment before offering new online services, products, or features likely to be accessed by children.
Social media platforms opposed the bill, claiming a patchwork of state-enacted laws regulating their global apps would make U.S. compliance difficult. Proponents counter that companies will likely apply the California standard nationwide rather than maintain separate versions for different states.
A similar measure went into effect in the UK in 2021. As British regulators were developing that comprehensive effort, “Google, YouTube, Instagram, TikTok, Snap and other major platforms announced new safeguards for younger users worldwide,” reports The New York Times.
The measure “could apply to a wide range of popular digital products that people under 18 are likely to use: social networks, game platforms, connected toys, voice assistants and digital learning tools for schools,” writes NYT, noting “industry groups had opposed the measure, saying its scope was too broad and its provisions too vague to carry out,” while the industry lobbying group TechNet unsuccessfully urged California legislators to define “child” as under 16.
Although a provision that would have allowed parents to sue online companies was removed, the new law authorizes the California attorney general to seek an injunction or civil penalties of up to $7,500 per affected child from companies that violate the new rules.
The U.S. Congress “is also working to boost online safety for youngsters,” says NYT, noting that “in July, the Senate Committee on Commerce, Science & Transportation advanced a privacy bill that would prohibit online services from targeting teens and children with ads or collecting their personal data without permission.”
Separately, the Commerce Committee advanced a bill that “would require online services to protect minors by, among other things, not recommending they view potentially harmful content on suicide or eating disorders.”