OAKLAND, Calif. (CN) — A federal judge on Thursday dismissed claims that Meta CEO Mark Zuckerberg, as a corporate-officer participant, concealed and misrepresented the harmful effects his social media giant has on young users.
The decision by U.S. District Judge Yvonne Gonzalez Rogers in the Northern District of California is part of a massive suit against Meta that includes several states, including California.
It’s another win for Zuckerberg in the case this year. In April, the judge granted Zuckerberg’s motion to dismiss the claims of direct liability against him. In October, Rogers found that several of the states’ liability claims were barred by the Communications Decency Act.
However, on the question of Zuckerberg’s liability, the judge allowed the plaintiffs to file a consolidated complaint focused on his involvement as a corporate-officer participant in Meta’s fraudulent and negligent concealment and misrepresentation of the negative health effects the social media platform has on youth.
“Plaintiffs’ allegations are not sufficient to plausibly allege that Zuckerberg directed the suppression of material information,” Rogers wrote. “Control of corporate activity alone is insufficient.”
The plaintiffs — a group of 33 states — made four main arguments: that Zuckerberg controlled platform development, that he knew about the safety risks, that he rejected devoting resources to child safety and that he made public statements about the platform’s safety.
The states said Zuckerberg kept a tight grip on “key design decisions” and holds a controlling stake in Facebook, now called Meta. He also knew about 2018 reports linking Facebook use among 10- to 12-year-old girls to body image concerns and increased dieting, they said.
The plaintiffs argued that Zuckerberg intervened in a 2020 decision over the use of camera filters that simulate plastic surgery, stopping a ban on them and allowing their use. That occurred despite almost two dozen experts and academic research favoring the ban.
Rogers ruled, however, that a corporate officer generally can’t be held personally liable for corporate acts merely because he holds that position. An officer can be found liable for a company’s acts if he takes part in them, directs someone to perform them or cooperates in them.
That means the plaintiffs had to show that Zuckerberg directed, allowed or participated in Meta’s concealment of the information about how using the social media platform harmed young users, the judge wrote.
The plaintiffs argued that Zuckerberg knew Instagram and Facebook could harm people, especially the young. He had a duty to reveal those risks, but instead hid them in misleading statements.
“Generalized allegations, even if compelling, are insufficient,” Rogers wrote. “Plaintiffs fail to allege any instance where Zuckerberg directed the suppression of material information.”
The judge said arguments made by the plaintiffs fail to show “direct participation” by Zuckerberg. Instead, the accusations hinge on the overall performance and impact of the social media platforms in general, not the aspects Rogers determined were actionable.
“While possible that discovery may reveal a more active participation and direction by Zuckerberg in Meta’s alleged fraudulent concealment, the allegations before the court are insufficient to meet the standard for corporate-officer liability in the thirteen at-issue jurisdictions,” the judge wrote.
The ruling is part of the multi-district proceeding consolidating hundreds of personal injury lawsuits on behalf of children and adolescents, by school districts and local governments, and by state attorneys general before the judge. The plaintiffs claim that Facebook and Instagram, as well as Google’s YouTube, ByteDance’s TikTok and Snapchat, are designed to foster compulsive use by minors.
In their joint lawsuit filed last year, the 33 states claim that Meta built a business model focused on maximizing young users’ time on its platforms and employed psychologically manipulative platform features. They accused the social media giant of publishing reports purporting to show misleadingly low rates of user harms and said it refused to address existing harms to users to conceal and downplay its platforms’ adverse effects.