Last fall, California lawmakers passed a sweeping online children’s privacy law aimed at regulating how some of the most popular social media and video game platforms treat minors.
Many children’s groups heralded the measure, the first of its kind in the United States. So did Gov. Gavin Newsom. “We’re taking aggressive action in California to protect the health and well-being of our kids,” he said in a statement at the time.
But last month, after a lawsuit filed by a tech industry group whose members include Meta and TikTok, a federal judge in California preliminarily blocked the law, saying it “likely violates” the First Amendment.
The judge’s decision was a blow to lawmakers, governors, children’s groups and parents across the United States hoping to curb the lure that platforms like TikTok, Instagram and YouTube hold for many children and teenagers. It was the latest courtroom setback for backers of new state laws designed to limit how online services are allowed to interact with young people.
In August, a federal judge in Arkansas temporarily blocked a new law in that state that would require certain social media platforms to verify the ages of their users and obtain parental consent before allowing minors to create accounts.
That same month, a federal judge in Texas temporarily blocked a new anti-pornography law that would restrict access to content deemed harmful to minors. It would require sexually explicit sites to verify that their users were 18 or older and display health warnings before allowing users to see content.
Some legislators and state officials behind the new social media and pornography age-verification laws said they expected to hit temporary roadblocks. In the late 1990s and early 2000s, the Supreme Court overturned similar laws intended to shield children online, saying they could hinder adults and young people from having access to large parts of the internet.
But the legislators behind the California law, the California Age-Appropriate Design Code Act, designed their measure differently. It does not require age verification. It requires online services to design their sites and apps to minimize potential risks for younger users — like exposing them to explicit content or using powerful techniques that prod them to spend hours on end online.
“It’s concerning that as children’s advocates we’re so outgunned just to get legislation passed, and then to have the judge side with more well-financed industry arguments,” said Josh Golin, the executive director of Fairplay, a nonprofit children’s group that backed the California law.
The effort to shield children online is a microcosm of a much larger battle to control the future of the internet. It pits tech giants, tech trade groups and free speech activists against activist governors, lawmakers, progressive children’s groups and conservative parents’ rights groups.
Even the U.S. surgeon general has weighed in. In a recent report, he urged policymakers to strengthen age minimums and “further limit access” to social media “for all children.”
While some members of Congress are still pushing for federal bills to insulate children online, state legislators have passed measures at an astonishing pace. This year, Republican-led states, including Utah and Arkansas, have passed at least 10 laws restricting minors’ access to social media and online porn sites. Democratic-led states, including California, have also passed new social media laws.
The lawsuit over the California children’s privacy law could have wide repercussions for many other states that have enacted or are pursuing tech regulations.
The case was brought last year by NetChoice, a tech industry group whose members include Amazon, Google and Snap. In its complaint, the group argued that the measure would hinder the free speech rights of companies to publish information online. (The New York Times and the Student Press Law Center jointly filed a friend-of-the-court brief in the case.)
Chris Marchese, the director of NetChoice’s litigation center, said the law — which requires online services to minimize harm to minors — essentially meant that tech companies would need to sanitize the internet on behalf of young people. He added that restrictive age-verification laws passed in other states could have similar effects.
NetChoice has also sued to stop the law in Arkansas requiring age verification and parental consent for minors to create social media accounts.
The California law, “if it were to take effect, would allow the attorney general to go after platforms that don’t remove harmful speech,” Mr. Marchese said. “Where in Texas, they might go after platforms for allowing minors to access content related to trans rights or to transitioning or to L.G.B.T.Q. issues or to abortion issues.”
Backers of the California children’s privacy law strongly disagreed. They noted that the measure required online services to turn on the highest privacy settings by default for children. And they argued that the measure regulated product features, not free speech.
“The ruling is profoundly disappointing,” said Jordan Cunningham, a former California state representative who co-sponsored the bill. The law “doesn’t restrict access to content,” he added. It tells online platforms “what privacy settings they’ve got to have.”
For now, at least, judges have sided with industry groups on free speech grounds and blocked the California law, along with the new anti-pornography law in Texas and the new social media law in Arkansas, through preliminary injunctions.