Opinion

Social media is addictive, but that’s not a crime

by Stephen L. Carter

“Big Tech loses bid to toss lawsuits alleging social media platforms harmed children,” blared the New York Post. “Social media companies must face youth addiction lawsuits, US judge rules,” said Reuters. Such news coverage of the order from Judge Yvonne Gonzalez Rogers suggests that the tech companies mostly lost. But from where I sit, Silicon Valley mostly won.

Yes, Judge Rogers let stand a significant fraction of the claims in the hundreds of lawsuits that have been consolidated in her courtroom. But the most spectacular allegations — that social media addicts teens and damages their health — were mostly dismissed.

The lawsuits at issue were filed by parents, organizations and state attorneys general against the companies behind Facebook, Instagram, YouTube, TikTok and Snapchat. The defendants moved to dismiss under both the First Amendment and Section 230 of the Communications Decency Act of 1996. Judge Rogers dismissed a large chunk of the lawsuit but allowed a number of claims to go forward.

Along the way, however, addiction practically vanished from the case. Just go through the allegations. The biggies — “Use of algorithms to promote addictive engagement” and “Timing and clustering of notifications of third-party content in a way that promotes addiction” — are barred by Section 230, writes Judge Rogers. Why? Because they’re the sort of things publishers do. They want your attention; they work to keep it. That’s quintessential publishing behavior.

The same fate befalls “Not providing a beginning and end to a user’s ‘Feed’” and “Limiting content to short-form and ephemeral content” and all the other tools that social media companies deploy to keep users hungry for more. Those claims, too, simply exemplify the “common definition of what a publisher does.”

In this sense, her opinion picks up on precisely the weakest part of the social-media-addicts-us argument. The difficulty isn’t whether the premise is true; it’s whether the premise is punishable. Nobody thinks a traditional publisher is at fault for doing all it can to keep fans sufficiently hooked that they’ll buy millions of copies of the next volume of “A Song of Ice and Fire” when it comes out; or that the producers of “Succession” acted wrongfully when they structured the tale so that viewers could scarcely wait to find out what would happen next.

What Judge Rogers left in place were a number of theories about product liability — that the companies knew their services carried some risk, but failed either to mitigate it or to warn users. There’s some meat there, to be sure. The plaintiffs will be allowed to argue, for instance, that the defendants make it too hard to close accounts, and that they should offer more robust parental controls. If you’re struggling to raise children in the face of the online onslaught, these are far from trivial issues, and the tech companies should be ashamed of themselves. But the product liability counts bear only a tenuous connection to the central claim of the lawsuits, that social media addicts the young.

Don’t get me wrong. I’m not arguing against the substantial body of evidence suggesting that social media can be addictive, especially in youngsters. It’s a worldwide problem, and addiction rates in North America are a long way from the worst. A 2022 paper in the American Economic Review estimated that a whopping 31% of social media use stems from lack of self-control. Lots of people want to cut back but struggle.

The problem is genuine. But not every problem can be fixed by filing the right lawsuit. Judge Rogers is correct in her view that Section 230 protects social media platforms when they act like publishers. And despite the steady drumbeat that the provision needs repeal or extensive revision, I’m among those who tremble at the thought of what government officials would do once granted broad authority over online content.

What, then, is to be done? Call me an optimistic pessimist. Because Judge Rogers preserved the product liability counts, the tech companies will likely seek a settlement under which they’ll implement better parental controls and ease the task of users who wish to delete their accounts. Such changes regulate not a scintilla of content, yet should be helpful to concerned parents. The challenge of addiction, alas, is far too large to be fixed by a court.

Stephen L. Carter is a Bloomberg Opinion columnist and a professor of law at Yale University.