Citing a mental health crisis among young people, California lawmakers are targeting social media

Carla Garcia said her son’s social media addiction began in fourth grade, when he got his computer and logged onto YouTube for virtual learning. Now, two years later, the video-sharing site has displaced both schoolwork and the activities he once loved, like composing music or serenading friends on the piano, she said.

“He just has to have his YouTube,” said Garcia, 56, of West Los Angeles.

Alessandro Greco, now 11 and a soon-to-be sixth grader, watches videos even as he tells his mother he’s starting homework, making his bed or practicing his instrument. When she confronts him, she said, he gets frustrated and says he hates himself because he can’t stop watching YouTube.

Alessandro tells her he can’t pull himself away, that he’s addicted.

“It’s wicked — they’ve taken away my parental authority,” Garcia said. “I can’t beat it.”

Some California lawmakers want to help Garcia and other parents protect their children’s mental health by targeting website features they say are designed to hook kids — such as personalized posts that capture and keep visitors on a particular page, frequent push notifications that pull users back to their devices, and autoplay functions that provide a continuous stream of video content.

Alessandro Greco became addicted to YouTube when he was 9 and would watch videos even as he said he was doing homework, making his bed or practicing piano, said his mother, Carla Garcia. The West Los Angeles boy, now 11, told her he couldn’t pull himself away. “He can’t stop,” she said. “No matter how hard I try, I can’t stop him.” (Carla Garcia)

Two complementary bills in the state Legislature would require websites, social media platforms and online products that children use — or could use — to eliminate features that can addict them, harvest their personal information and promote harmful content. Companies that do not comply could face lawsuits and steep fines. One of the measures would impose penalties of up to $7,500 per affected child in California, which could amount to millions of dollars.

Federal lawmakers are making a similar push with bills that would tighten privacy protections for children and target features that foster addiction. They would also require online platforms to provide tools to help parents track and control their children’s internet use. The measures were approved by a US Senate committee on July 27.

“We need to protect children and their developing brains,” said California Assemblyman Jordan Cunningham (R-San Luis Obispo), the lead author of both bills and a father of four, at a committee hearing in June. “We must end the era of Big Tech’s relentless social experiment on children.”

But Big Tech remains a formidable foe, and privacy advocates worry that one of the California measures could increase data collection for everyone. Both bills have cleared the state Assembly, but it’s unclear whether they will survive the state Senate.

The tech industry, which wields enormous power in Sacramento, says it already prioritizes users’ mental health and is working to strengthen age verification. Companies are also rolling out parental controls and banning messages between minors and adults they don’t know.

But the bills could infringe on companies’ free speech rights and require changes to websites that can’t realistically be engineered, said Dylan Hoffman, TechNet’s executive director for California and the Southwest. TechNet, a trade association for tech companies whose members include Meta (parent company of Facebook and Instagram) and Snap Inc. (which owns Snapchat), opposes the measures.

“This is an oversimplified solution to a complex problem, and there’s nothing we can propose that will address our concerns,” Hoffman said of a bill that specifically targets social media.

Last year, the US Surgeon General, Dr. Vivek Murthy, highlighted the mental health crisis among the nation’s youth and pointed to social media use as a possible contributor. Murthy said social media use among teenagers was linked to anxiety and depression — even before the stress of Covid-19. Then during the pandemic, he said, the average amount of non-academic screen time among teenagers rose from about four hours a day to about eight.

“What we’re trying to do, really, is keep our kids safe,” Assemblywoman Buffy Wicks (D-Oakland), another lead author of the California bills and a mother of two, said at a June committee hearing.

One of Cunningham and Wicks’ bills, AB 2273, would require all online services “accessible by a child” — which could include most websites — to limit the collection and use of personal data from users younger than 18. That includes setting default privacy settings to the highest level unless users prove they are 18 or older, and providing terms of service in language a child can understand.

Modeled after a law passed in the United Kingdom, the measure also says companies should “consider the best interests of children when designing, developing and providing services, products or features.” That broad phrase could allow prosecutors to target companies for features that harm children, such as persistent notifications that demand kids’ attention or recommendation pages, based on a child’s activity history, that may steer them toward harmful content. If the state attorney general determines that a company has violated the law, the company could face fines of up to $7,500 per affected child.

The other California bill, AB 2408, would allow prosecutors to sue social media companies that knowingly addict minors, which could lead to fines of up to $250,000 per violation. The original version would have allowed parents to sue social media companies, but lawmakers removed that provision in June in the face of opposition from Big Tech.

Together, the two California proposals attempt to impose some order on the largely unregulated landscape of the Internet. If successful, they could improve children’s health and safety, said Dr. Jenny Radesky, an assistant professor of pediatrics at the University of Michigan Medical School and a member of the American Academy of Pediatrics, a group that supports the data protection bill.

“If we’re going to have a playground, you want a place that’s designed to let a child explore safely,” Radesky said. “Yet in the digital playground, there is much less focus on how a child can play.”

Radesky said she has seen the effects of these addictive features firsthand. One night, as her then-11-year-old son was getting ready for bed, he asked her what a serial killer was, she said. He told her he had learned the term online, when videos about unsolved murder mysteries were automatically recommended to him after he watched Pokemon videos on YouTube.

Adam Leventhal, director of the University of Southern California’s Institute for Addiction Science, said YouTube’s recommendations and other tools that use people’s online history to personalize their experience contribute to social media addiction by trying to keep users online as long as possible. Because developing brains favor exploration and pleasurable experiences over emotion regulation, kids are especially sensitive to many of social media’s tactics, he said.

“What social media offers is a very stimulating, very quick response,” Leventhal said. “Anytime there’s an activity where you can get a pleasurable effect and get it fast and get it when you want it, that increases the likelihood that the activity will become addictive.”

Rachel Holland, a spokeswoman for Meta, said in a statement that the company has worked alongside parents and teenagers to prioritize children’s well-being and minimize the potential negative effects of its platforms. She pointed to the company’s various initiatives: For example, in December 2021, it added supervision tools to Instagram that allow parents to view and limit children’s screen time. And in June, it began testing new age verification techniques on Instagram, including asking some users to upload a video selfie.

Snap spokesman Pete Boogaard said in a statement that the company is protecting teens with measures that include banning public accounts for minors and turning off location-sharing by default.

Meta and Snap declined to say whether they support or oppose the California bills. YouTube and TikTok did not respond to multiple requests for comment.

Privacy groups are raising red flags about the measures.

Eric Null, director of the privacy and data project at the Center for Democracy & Technology, said the data protection bill’s requirement that privacy agreements be written in age-appropriate language would be nearly impossible to implement. “How do you write a privacy policy for a 7-year-old? That seems especially difficult when the child can barely read,” Null said.

And because the bill would limit the collection of children’s personal information — but still require platforms to collect enough details to verify a user’s age — it could increase data collection for all users, he said. “It’s going to further encourage all online companies to verify the age of all their users, which is somewhat counterproductive,” Null said. “You’re trying to protect privacy, but now you actually need to collect a lot more data about each user.”

But Carla Garcia is desperate for action.

Fortunately, she said, her son does not watch violent videos. Alessandro likes clips and one-hit wonders from “America’s Got Talent” and “Britain’s Got Talent.” But the addiction is real, she said.

Garcia hopes lawmakers will curtail tech companies’ ability to keep sending her son content he can’t turn away from.

“If they can help, help,” Garcia said. “Put in some sort of regulation and stop the algorithm, stop preying on my child.”

This story was produced by KHN, which publishes California Healthline, the editorially independent service of the California Health Care Foundation.
