Taiwanna Anderson’s life changed forever in December 2021, when she found her 10-year-old daughter Nylah unconscious, hanging from a purse strap in a bedroom closet.

Barely an adolescent, Nylah wasn’t suicidal. She had merely come across the “Blackout Challenge” in a feed of videos curated for her by TikTok’s algorithm. The challenge, circulating on the video-sharing app, encouraged users to choke themselves with household items until they blacked out; once they regained consciousness, they were supposed to upload the video for others to replicate. After several days in a hospital’s intensive care unit, Nylah succumbed to her strangulation injuries. Anderson sued TikTok, alleging product liability and negligence that led to Nylah’s death.

For years, when claimants tried to sue internet platforms for harms experienced online, the platforms benefited from what amounted to a get-out-of-jail-free card: Section 230 of the Communications Decency Act, a 1996 statute that offers apps and websites broad immunity from liability for content posted to their sites by third-party users. In 2022, a federal district judge accepted TikTok’s Section 230 defense and dismissed Anderson’s lawsuit, reasoning that TikTok didn’t create the Blackout Challenge video Nylah saw; a third-party user of the platform did.

But on Tuesday, the federal Third Circuit Court of Appeals released an opinion reviving the mother’s lawsuit, allowing her case against TikTok to proceed to trial. TikTok may not have filmed the video that encouraged Nylah to hang herself, but the platform “makes choices about the content recommended and promoted to specific users,” Judge Patty Shwartz wrote in the appellate court’s opinion, “and by doing so, is engaged in its own first-party speech.”

  • @WhatsHerBucket@lemmy.world

    Where were the parents? This isn’t Gen X, kids these days can’t be left alone without doing something stupid. TikTok is not a replacement for a babysitter.

    I’m not saying TikTok isn’t at fault for their shitty algorithms, but why is a 10yo on TikTok by herself in the first place? I can’t believe TikTok’s TOS would even allow that age to have an account.

    • @some_guy

      Right? Part of modern parenting needs to include a talk about how to recognize when following a fad is dangerous. The parents failed their child.

      I tried joking around with a Ziploc bag when I was a kid and got a very good talking-to about the dangers of a plastic bag and the potential for suffocation, and that shit couldn’t even fit over my head when I was around 7.