Channel: Courthouse News Service

TikTok, YouTube duck parents’ suit over harmful or dangerous content


SAN JOSE, Calif. (CN) — A group calling itself “modern-day champions and vigilantes” that sued TikTok and YouTube after suffering family tragedies received a legal blow this week when a federal judge dismissed its complaint against the online giants.

The plaintiffs, who include parents who say their children died after watching videos like the “choking challenge,” claim they contacted the social media companies to have harmful videos removed. The companies left most of those videos up, and TikTok sent automated responses saying the videos didn’t violate its community guidelines.

One parent claims that when one person reported a video, it was found to violate the guidelines, yet when another person reported the same video, it wasn’t.

The plaintiffs say TikTok and YouTube’s failure to remove the videos caused “emotional distress, anxiety, helplessness, invalidation, grief, re-traumatization, and frustration.” The group sued in 2023, claiming negligence, fraudulent misrepresentation, negligent misrepresentation and strict products liability.

U.S. Magistrate Judge Virginia K. DeMarchi on Monday dismissed those four claims, along with four additional state claims. She also dismissed Alphabet Inc. and XXVI Holdings Inc. as defendants, as they are parent companies and face no other accusations. Additionally, the judge dismissed one plaintiff — the Becca Schmill Foundation — for lack of standing.

The plaintiffs have a month to file an amended complaint.

The parents claim the social media companies had a defective reporting feature on their platforms — the product liability claim. They say the platforms should have known that faulty feature could lead to emotional and physical harm.

The online companies argued the group was equating a “defect” with its disagreement over content moderation decisions.

The judge agreed, finding the group objected to those decisions and not to the reporting tool itself.

“Here, plaintiffs do not clearly identify the ‘product’ at issue or the ‘design defect’ it allegedly contains,” DeMarchi wrote.

The group also pointed to negligence, saying the companies had a duty to protect their users from harm. The companies offered a reporting system that, when it failed, left group members feeling frustrated and helpless.

However, the companies argued they have no legal duty of care. The videos at issue were uploaded by third parties and the online platforms shouldered no responsibility merely because they offered a reporting system.

“The court is not persuaded that plaintiffs have plausibly alleged any defendant assumed the obligation of a ‘first response hotline,’ such as 911 dispatcher or suicide prevention hotline, and thereby assumed a corresponding duty of care,” DeMarchi wrote.

The fraudulent and negligent misrepresentation claims hinged on statements issued by the platforms about their policies and guidelines, as well as a comment made during a U.S. Senate hearing.

In that hearing, senators heard that TikTok had found no evidence of a “blackout challenge” on its platform, and that such content, if found, would violate its guidelines.

According to the plaintiffs, statements like these show the companies claimed to police content and respond in a meaningful way, claims that, in the group’s experience, proved false.

“The court agrees with defendants that plaintiffs largely fail to plead what is false about each challenged statement or why each is false,” the judge wrote. “Many of the statements simply describe what content is allowed on the platforms.”

DeMarchi found the plaintiffs didn’t identify a specific video that depicted prohibited conduct yet was not removed after being found to violate the guidelines. Without such specifics, the misrepresentation claims failed.

As for the state claims, the judge found that because they rest on the same flawed misrepresentation arguments, they too should be dismissed.

Initially filed in Indiana, the suit was transferred to the Northern District of California after the companies argued lack of jurisdiction. TikTok and YouTube’s parent companies have a principal place of business in Mountain View.
