Gig Harbor Today
By the People, for the People
Parents of 764 victim file wrongful death lawsuit against Discord
Lawsuit claims social media platform 'caused' teen's suicide and 'abetted' dangerous online extremist network
Published on Feb. 21, 2026
The parents of a Seattle-area teenager who died by suicide after allegedly being targeted by the online extremist network '764' have filed a wrongful death lawsuit against Discord, claiming the social media platform 'caused' their son's death and 'abetted one of the most depraved and dangerous child abuse cults in modern history.'
Why it matters
The lawsuit highlights the growing concerns around online predators exploiting vulnerable youth on social media platforms, as well as the legal challenges platforms may face in addressing such issues and protecting their users.
The details
According to the lawsuit, Discord 'supplied 764 with unlimited victims,' including 13-year-old Jay Taylor, who died by suicide in January 2022 after being targeted by the group. The lawsuit alleges that Discord 'provided 764 access to its platform, failed to take reasonable steps to prevent or disrupt such exploitation, and affirmatively maintained the same product design and defaults that enabled the abuse.'
- In January 2022, Jay Taylor posted a message to Discord saying he was 'looking for friends, preferably LGBTQ for crochet buddies.'
- Within about an hour, others in the group chat began telling Jay he should kill himself, according to his parents.
The players
Leslie and Colby Taylor
The parents of Jay Taylor, the 13-year-old victim who died by suicide.
Discord
The social media platform that the lawsuit claims 'caused' the teen's death and 'abetted' the online extremist network 764.
Jay Taylor
The 13-year-old Seattle-area teenager who died by suicide after being targeted by the 764 online extremist network.
White Tiger
The Discord user who allegedly led the effort to pressure and manipulate Jay Taylor into taking his own life.
Bradley Cadenhead
The Texas teen who founded the 764 online extremist network and is now serving an 80-year sentence on child pornography-related charges.
What’s next
The judge in the case will decide whether to allow the lawsuit to proceed.
The takeaway
This case highlights the urgent need for social media platforms to prioritize user safety, especially for vulnerable youth, and to take more proactive measures to prevent online predators from exploiting their platforms.

