Heartbroken Mother Urges US Court to Revive TikTok Lawsuit Over Challenge Fatality

The 3rd US Circuit Court of Appeals weighed whether TikTok bears responsibility for the death of a 10-year-old girl who attempted a lethal “blackout challenge” promoted to her on the platform.

The legal battle hinges on whether Section 230 of the Communications Decency Act, which shields internet companies from lawsuits over user-generated content, applies to platforms like TikTok that use algorithms to recommend content.

During oral arguments, members of the three-judge panel acknowledged how much technology has evolved since Section 230 was enacted in 1996. US Circuit Judge Paul Matey said, “I think we can all probably agree that this technology didn’t exist in the mid-1990s, or didn’t exist as widely deployed as it is now.”

Tawainna Anderson filed a lawsuit against TikTok and its Chinese parent company ByteDance after her daughter attempted the blackout challenge in 2021 and died of the severe injuries she suffered.

Anderson’s lawyer, Jeffrey Goodman, argued that Section 230 should not protect TikTok, maintaining that the platform ought to be held responsible for what he described as negligent content recommendations its algorithm served to the young child.

While acknowledging Section 230’s protections, Goodman contended that TikTok repeatedly pushed dangerous challenges to an impressionable 10-year-old, leading her to believe they were cool and fun.


In defense, TikTok’s lawyer, Andrew Pincus, urged the panel to uphold a lower court judge’s ruling that Section 230 barred Anderson’s case. Pincus cautioned that ruling against TikTok could render Section 230’s protections “meaningless” and expose various platforms to product defect claims related to their algorithms.

However, US Circuit Judge Patty Shwartz questioned whether Section 230 could entirely shield TikTok from a duty to warn users about dangerous content on its platform.

Wider Investigation into TikTok and Social Media Giants

This legal showdown comes amid increased global scrutiny of social media platforms, including TikTok, over their responsibility to protect children from harmful content.

State attorneys general are investigating TikTok over potential harm to young users’ physical and mental health, while social media giants, including Meta Platforms, face numerous lawsuits alleging that platform-induced addiction has damaged children’s mental health.

As the court deliberates, the outcome of this case could set a precedent for the liability of platforms in recommending and disseminating potentially dangerous content.

 
