
Social media giants hope to pare down massive suit over harm done to minors


LOS ANGELES (CN) — Some of the top social media platforms in the world — including Facebook, Instagram, Snapchat, TikTok and YouTube — returned to a downtown Los Angeles courthouse on Thursday to ask the judge to further trim more than a thousand lawsuits claiming the apps have fomented an epidemic of teen addiction and depression.

Plaintiffs in more than 1,300 personal injury and school district lawsuits in state courts all over the country say in their consolidated master complaint, filed in LA Superior Court, that the tech companies have fueled “an unprecedented mental health crisis” with “addictive and dangerous social media products.”

On Thursday, the social media giants asked the judge to dismiss a claim that the companies knew about various risks to young users and failed to warn them.

TikTok’s attorney, David Mattern, a partner at King & Spalding, argued that the plaintiffs’ “failure to warn” claim was simply a repackaging of a product liability claim that Superior Court Judge Carolyn Kuhl had already agreed to strike in an October 2023 ruling.

The services offered on the defendants’ apps, Mattern argued, “offer a unique and tailored experience to every user. There is not one experience that the user anticipates to warn others about.”

Mattern said that meant some users may be directed, via an algorithm, to content of skinny women under the hashtag “thinspiration.” Others may be directed to content of people doing drugs. Having to issue warning labels would, Mattern said, “lead to not one warning, but a litany of warnings.”

He also argued that the claim should be stricken on First Amendment grounds. Under the plaintiffs’ logic, Mattern said, newspapers could be sued because they “didn’t come with a warning label that reading the news could give you anxiety.”

“The First Amendment does not go on leave when it comes to social media companies,” he said.

Adam Davis, a partner at Davis, Bengtson & Young representing the plaintiffs, argued that the harms flowing from the tech companies’ algorithms were not only foreseeable, but intended.

“They risked adverse health effects to minor users,” Davis said, including addiction and depression. “We are alleging they created this risk … that’s why they have a duty to warn.”

Josh Autry of Morgan & Morgan, another plaintiffs’ lawyer, added: “The defendants simply do not have the right to remain silent about, say, the risk of sexual exploitation, if the defendants created that risk.”

“They want a right that literally no company has. They want to be unique under the law,” Autry said.

The lawsuits, first filed in 2022, have already been the subject of several demurrers and motions to strike, which have successfully whittled down the mountain of claims.

Kuhl has previously ruled that some of the claims are barred under Section 230 of the Communications Decency Act, which grants platforms immunity for content created by third parties, effectively shielding social media companies from lawsuits centering on content created and posted by users. But Kuhl also ruled that some of the claims, such as negligence, were not shielded by Section 230.

In July, Kuhl agreed to strike parts of the master complaint that sought damages from TikTok for its recommendation and promotion of certain “challenges” — for example, the blackout challenge and the Benadryl challenge, which, according to the plaintiffs, encourage users to choke themselves to the point of unconsciousness or to take dangerous amounts of Benadryl.

The challenges, she found, were a classic case of third-party content: users posting videos with certain hashtags. She also ruled that Section 230 barred claims of negligence based on the defendants’ failure to remove material allegedly depicting child sexual abuse.

But she declined to strike a number of other claims, including those about geolocation features revealing the location of minors, recommending certain adult “friends” to minors, allowing private messages between adults and minors who had met on the platforms, and a failure to implement proper age verification tools.

Those features, or lack thereof, Kuhl reasoned, represented decisions by the app developers, and not third-party content.

The judge took Thursday’s arguments under submission. The court plans to pick a number of plaintiffs to serve as bellwether cases, which will go to trial; the outcomes will help determine the value of any global settlement. The sides are currently locked in various disputes over discovery.

The cases, filed in courts across multiple districts, were consolidated before Kuhl, and a similar multidistrict litigation is underway in federal court.

And on Tuesday, more than a dozen states sued TikTok for hooking children with what one complaint called “digital nicotine.”

The U.S. surgeon general issued an advisory in 2023 warning of “ample indicators that social media can also pose a risk of harm to the mental health and well-being of children and adolescents.”

