Citing a mental health crisis among young adults, California lawmakers target social media

Carla Garcia said her son’s addiction to social media began in the fourth grade, when he got his computer for virtual learning and logged into YouTube. Now, two years later, the video-sharing site has replaced the schoolwork and activities he used to love — like composing music or having his friends sing at the piano, she said.

“He just has to have his own YouTube,” said Garcia, 56, of West Los Angeles.

Alessandro Greco, now 11 and a soon-to-be sixth grader, watches the videos even as he tells his mom he’s starting homework, making his bed, or playing an instrument. She said that when she confronts him, he gets frustrated and says he hates himself because he feels that watching YouTube is not a choice.

Alessandro tells her that he can’t help himself, that he is addicted.

“It’s a sinister thing – they took away my ability to raise children,” Garcia said. “I can’t get over this.”

Some California legislators want to help Garcia and other parents protect their children’s mental health by targeting website features they say are designed to hook kids – such as personalized posts that grab and hold viewers on a particular page, frequent push notifications that pull users back to their devices, and autoplay functions that deliver a continuous stream of video content.

Two complementary bills in the state legislature would require websites, social media platforms, and other online products that children use – or could use – to eliminate features that can addict them, harvest their personal information, and promote harmful content. Companies that fail to comply could face lawsuits and hefty fines. One measure sets penalties of up to $7,500 per affected child in California, which could add up to millions of dollars.

Federal lawmakers are making a similar push with bills that would tighten protections for children’s privacy and target features that encourage addiction. One would require online platforms to offer tools that help parents track and control their children’s internet use. The measures were approved by a U.S. Senate committee on July 27.

“We have to protect children and their developing brains,” California Assembly member Jordan Cunningham (R-San Luis Obispo), lead author of both bills and a father of four, said at a committee hearing in June. “We need to end the Big Tech era of unrestricted social experimentation on children.”

But Big Tech companies remain formidable opponents, and privacy advocates worry that one of California’s measures could increase data intrusions for everyone. Both bills passed the state Assembly, but whether they will survive the state Senate is unclear.

Technology companies, which wield tremendous power in Sacramento, say they already prioritize users’ mental health and are working to strengthen their age verification mechanisms. They are also adding parental controls and blocking messages between minors and adults they don’t know.

The bills could violate companies’ free speech rights and require changes to websites that can’t realistically be engineered, said Dylan Hoffman, executive director for California and the Southwest for TechNet. TechNet – a trade association for technology companies, including Meta (the parent company of Facebook and Instagram) and Snap Inc. (which owns Snapchat) – opposes the measures.

“It’s an oversimplified solution to a complex problem, and there’s nothing we can suggest that will alleviate our concerns,” Hoffman said of one bill specifically targeting social media.

Last year, U.S. Surgeon General Dr. Vivek Murthy highlighted the nation’s youth mental health crisis and cited social media use as a potential contributor. Murthy said teens’ use of social media has been linked to anxiety and depression – even before the stresses of COVID-19. Then, during the pandemic, he said, teens’ average non-academic screen time jumped from about four hours a day to about eight.

“What we’re trying to do, really, is just keep our kids safe,” Assemblymember Buffy Wicks (D-Oakland), the other lead author of the California bills and a mother of two, said at the June committee hearing.

One of Cunningham and Wicks’ bills, AB 2273, would require all online services “likely to be accessed by a child” – which could include most websites – to minimize the collection and use of personal data for users younger than 18. That includes setting default privacy settings to the maximum level unless users prove they are 18 or older, and providing terms and service agreements in language a child can understand.

Modeled on a law enacted in the United Kingdom, the measure also says companies must “consider the best interests of children when designing, developing and providing such service, product or feature.” That broad wording could allow prosecutors to target companies for features that harm children, such as relentless notifications that demand children’s attention or suggestion pages that draw on a child’s activity history and could lead to harmful content. If the state attorney general determines that a company has broken the law, it could face a fine of up to $7,500 per affected child in California.

The other California bill, AB 2408, would allow prosecutors to sue social media companies that intentionally addict minors, which could result in fines of up to $250,000 per violation. The original version would have also allowed parents to sue social media companies, but lawmakers removed that provision in June amid opposition from Big Tech.

Together, the California proposals attempt to bring some order to the largely unregulated landscape of the internet. If they succeed, they could improve children’s health and safety, said Jenny Radesky, an assistant professor of pediatrics at the University of Michigan Medical School and a member of the American Academy of Pediatrics, which supports the data protection bill.

“If we go to a playground, you want a place that has been designed so a child can explore safely,” Radesky said. “But in the digital playground, there is much less attention to how a child can play there.”

Radesky said she has seen the effects of these addictive elements firsthand. One night, as her 11-year-old son was getting ready for bed, he asked her about a serial killer, she said. He told her he learned the term online when videos about unsolved murder mysteries were automatically recommended after he watched Pokemon videos on YouTube.

Adam Leventhal, director of the University of Southern California’s Institute of Addiction Sciences, said YouTube’s recommendations, and other tools that mine users’ online history to personalize their experiences, are contributing to social media addiction by trying to keep people online for as long as possible. Developing brains, he said, favor exploration and pleasurable experiences over impulse control, so children are especially vulnerable to many social media tricks.

“What social media provides is a very motivating reaction, very fast,” Leventhal said. “Anytime there is an activity where you can have a pleasurable effect and get it quickly and get it when you want it, that increases the likelihood that the activity will be addictive.”

Rachel Holland, a spokeswoman for Meta, said in a statement that the company has worked alongside parents and teens to prioritize children’s well-being and mitigate the potential negative effects of its platforms. She pointed to a variety of the company’s initiatives: In December 2021, for example, it added supervision tools on Instagram that allow parents to view and limit their children’s screen time. And in June, it began testing new age verification methods on Instagram, including asking some users to upload a video selfie.

A Snap spokesperson said in a statement that the company protects teens with steps that include prohibiting public accounts for minors and turning off location sharing by default.

Meta and Snap declined to say whether they support or oppose the California bills. YouTube and TikTok did not respond to multiple requests for comment.

Privacy groups have raised red flags about the measures.

Eric Null, director of the Privacy and Data Project at the Center for Democracy and Technology, said a provision in the data protection bill requiring privacy agreements to be written in age-appropriate language would be nearly impossible to implement. “How do you write a privacy policy for a 7-year-old? It seems especially difficult when the child can barely read,” Null said.

And because the bill would limit the collection of children’s personal information – but still require platforms accessible to children to collect enough details to verify a user’s age – it could increase data intrusion for all users, he said. “This will incentivize all online companies to verify the age of all their users, which is somewhat illogical,” Knoll said. “You’re trying to protect privacy, but in reality you now need a lot of data collection about every single user you have.”

But Carla Garcia is desperate for action.

Fortunately, she said, her son does not watch violent videos. Alessandro prefers clips from “America’s Got Talent” and “Britain’s Got Talent,” and videos of one-hit wonders. But she said the addiction is real.

Garcia hopes lawmakers will limit tech companies’ ability to constantly send content her son can’t get away from.

“If they can help, then help,” Garcia said. “Make some kind of regulation and stop the algorithm, stop chasing my child.”

This article was reprinted from khn.org courtesy of the Henry J. Kaiser Family Foundation. Kaiser Health News, an editorially independent news service, is a program of the Kaiser Family Foundation, a nonpartisan health care policy research organization not affiliated with Kaiser Permanente.