Victoria Hinks watched her daughter get sucked into the dark side of social media, and she couldn’t pull her out.
“We tried to take her phone away at night, but it was like taking a drug away from an addict,” Hinks said at a news conference at the Boys & Girls Clubs of San Francisco’s Don Fisher Clubhouse on Monday.
Hinks, whose 16-year-old daughter died by suicide in August, joined California Atty. Gen. Rob Bonta and Assemblymember Rebecca Bauer-Kahan (D-Orinda) in announcing proposed legislation that would require social media companies to warn California users their platforms could pose risks to the mental health and well-being of young people.
The effort to add warning labels is the latest in a series of moves by state lawmakers to bolster online protections for children. Bonta and Bauer-Kahan, who introduced the new legislation, Assembly Bill 56, expect they will face pushback from tech industry groups that have sued to stop new child safety laws from being enforced.
Although supporters acknowledge warning labels wouldn’t be a cure-all, lawmakers and child advocates say the labels would help parents decide whether they should allow their kids to use these popular services. Bonta, Bauer-Kahan and Common Sense Media Chief Executive and founder Jim Steyer compared the proposed labels to putting warnings on cigarette cartons.
“It will raise public awareness and turn the tide in this public health crisis,” Bauer-Kahan said.
The move comes after U.S. Surgeon General Vivek Murthy also called for warning labels on social media this year. In an op-ed published in the New York Times, Murthy said that putting a label on these online services would remind parents and young people about social media’s potential dangers.
Last year, the surgeon general published a report stating that while social media can have some benefits such as connecting young people to family and friends, the platforms also pose potential risks such as depression, anxiety, social comparison and body image issues.
Social media companies have been adding features to give parents more control over their children’s use of social media. Meta Platforms-owned Instagram, a social media app popular among young people, introduced teen accounts this year so parents can limit the content their teens see and who contacts them online.
Google, TikTok, Snap and NetChoice, a trade group backed by major tech companies, didn’t respond to requests for comment. Meta didn’t immediately have a statement about the proposal.
The California attorney general’s office has also sued TikTok and Meta over alleged harms to young people.
Efforts to protect kids online have faced several legal roadblocks as tech industry groups sue to block new laws from being enforced, alleging the new laws violate free speech protections under the 1st Amendment.
This year, a federal appeals court partly upheld a lower court’s decision to block a California online child safety law passed in 2022. Known as the California Age-Appropriate Design Code Act, the law requires online platforms to assess whether the design of their products, services or features could harm children before releasing them to the public.
Bonta said there’s no 1st Amendment right to harm children and that his office will battle it out in court.
“The fact that we might get sued down the road after an important bill that protects our children is passed will not slow us down,” Bonta said.
Hinks echoed Bonta’s comments, noting that adding warning labels is a step in the right direction. Despite using parental controls to limit the amount of time her daughter spent on social media apps, Hinks said her daughter was still served content about eating disorders and self-harm. Convinced she wasn’t pretty enough, the teen used beauty filters offered on various apps to change her appearance, her mom said.
“There is not a bone in my body that doubts social media played a role in leading her to that final, irreversible decision,” Hinks said.