California lawmakers push ahead with sweeping children's online privacy bill
A new children's online privacy bill is quickly making its way through the California state legislature and could lead to an overhaul of national safety standards for websites likely to be accessed by kids.
The California Age-Appropriate Design Code would require social media platforms to switch off geolocation for children, discontinue "nudging" techniques that trick kids into giving up their information, reduce exposure to harmful content and limit the potential for risky connections with adults.
Co-authored by Assembly members Buffy Wicks, a Democrat from Oakland, and Jordan Cunningham, a Republican from Templeton, the bill is modeled after a recently passed U.K. law. At a time when teenagers spend an average of 8.5 hours online each day, the bill would force social media companies to implement the strongest available safety settings by default for users under 18.
"When you don't have government regulation forcing this as a priority, it becomes an afterthought," Wicks told CBS News. "That's what the regulation can and should do and will do, is to force that conversation at a much higher level within the companies."
More than a dozen bills in Congress
The push to pass children's online safety legislation in California comes as attempts at the federal level have failed to gain momentum. At least 15 bills, several with bipartisan support, are currently circulating in Congress, with aims such as modernizing internet safety standards, making it easier for people to sue big tech companies and creating a data privacy agency. The American Innovation and Choice Online Act, which focuses on antitrust enforcement of the industry, is the only one that has advanced through a committee vote.
Despite multiple congressional hearings featuring executives from Meta, Twitter, Snapchat, TikTok and YouTube as well as explosive testimony from whistleblower Frances Haugen, progress on these bills has been stunted by other legislative priorities and now seems unlikely as midterm elections loom.
Safety over profit
The California bill says that, in the case of a conflict, social media platforms and all websites "likely" to be accessed by kids must prioritize the best interests of children over their own "commercial interests."
That phrasing recalls Facebook's contentious Senate hearings last year, when lawmakers accused the company of putting profits over safety, a charge the company denied.
"Their own data makes the strongest argument as to why this type of legislation, these types of safeguards, are important," Wicks said.
Meta, which paused its Instagram for Kids project last year after backlash from advocates and lawmakers, told CBS News the company wants to create age-appropriate features, empower teens to take control of their privacy and online experience, and include parents in the process. A recently launched Family Center gives parents more access to supervision tools.
Instagram accounts belonging to teenagers default to private mode. In addition to reminders from Instagram's "Take a Break" feature, Meta said it will soon start nudging teens toward different topics if they dwell on one for a long period.
Teen activists speak out
For Emily Kim, the bill is a welcome change. Kim immediately downloaded Instagram when she got her first phone at 13 "so that I could fit in," she said.
"As I scrolled down looking at my peers' profiles, I found myself staring at my own image, reading countless captions calling me fat and ugly," Kim said at an Assembly hearing last month. Her "online torment" continued after an autoimmune disease led to significant hair loss, she told lawmakers.
"Female classmates would post photos of themselves participating in countless trends that I could not participate in," Kim said, adding that she felt "horrible" even though she didn't post photos of herself.
Kim, now 18, works with LOG OFF, a teen-led digital wellness advocacy group, to inform peers "on the harms of social media and how to use it safely." She advocated in favor of the California bill, saying that legislation is needed "to protect young people from the growing mental and physical dangers."
Privacy by default
Wicks and Cunningham's bill passed unanimously through the Committee on Privacy and Consumer Protection in April. It could reach the Assembly floor this month.
Wicks said Britain's new Children's Code is working, and that if California can successfully follow the same model, "it could have pretty significant repercussions."
According to 5Rights Foundation, a London-based nonprofit that advocated for the U.K. law and supports the California measure, "a wide range of services have made hundreds of changes to their privacy settings" to comply with the U.K. law.
In August, Google made SafeSearch the default option and turned off Location History for users under 18 globally. YouTube turned off the autoplay feature and turned on bedtime reminders by default for those under 18. TikTok also announced enhanced safety features, including disabling direct messages between kids and adults and turning off push notifications after 10 p.m. by default for underage users.
"There is a history of companies taking whatever the strongest state law is on a particular topic and just making that their default nationwide," Eric Null, director of the Privacy and Data Project at the Center for Democracy and Technology, told CBS News.
Currently, websites or services directed at children under the age of 13 have to abide by the Children's Online Privacy Protection Act of 1998 (COPPA).
Null explained that COPPA focuses on "parents taking action to allow the child to use the website or have the company collect data," whereas the California bill "focuses a lot more on what the companies are, and are not, allowed to do."
Unintended consequences?
While there is "a good amount" of positive progress in the California bill, Null cautioned that it could have unintended consequences.
"One of the biggest privacy impacts that this kind of bill will have is essentially every website is going to have to age-gate and collect information about the age of every user they have, so those websites can differentiate between the people they have to treat," Null said. "That requires a lot of data collection on every single user of pretty much every single website," he added.
While Meta and Google did not weigh in on the measure, some industry trade groups are raising concerns.
TechNet and the California Chamber of Commerce, two groups that oppose the bill, said it overreaches by including all sites "likely accessed by children," not just those aimed at kids.
The groups also claimed that the bill's "new standards for age verification" would force companies to collect more information on users, such as "birthdays, addresses, and government IDs."
The Electronic Frontier Foundation (EFF) told Wicks it cannot support the bill unless it is amended to include only users under 13, in line with federal law. The EFF also said many of the terms in the bill are "vague" and that the enforcement mechanisms remain unclear.
Wicks said "we're working on the enforcement component right now and trying to figure out the best way to do that." She added that the legislation isn't meant to "screw big tech," and said she hoped social media executives would come to support it.
"They're parents, too," Wicks said.