Gig Harbor Today
By the People, for the People
Parents of 764 victim file wrongful death lawsuit against Discord
Lawsuit claims social media platform 'caused' teen's suicide and 'abetted one of the most depraved and dangerous child abuse cults'
Published on Feb. 20, 2026
The parents of a Seattle-area teenager who was allegedly pushed to take his own life by a member of the online extremist network '764' have filed a wrongful death lawsuit against Discord, claiming the social media giant 'caused' their son's suicide and 'abetted one of the most depraved and dangerous child abuse cults in modern history.'
Why it matters
This lawsuit highlights growing concerns about the role social media platforms play in enabling, and failing to prevent, the exploitation of vulnerable youth by online extremist groups like 764. It raises questions about platform accountability and the need for stronger safety measures to protect young users.
The details
According to the lawsuit, Discord 'supplied 764 with unlimited victims,' including 13-year-old Jay Taylor, who in January 2022 died by suicide outside of a local grocery store in Gig Harbor, Washington. The lawsuit alleges that Discord 'provided 764 access to its platform, failed to take reasonable steps to prevent or disrupt such exploitation, and affirmatively maintained the same product design and defaults that enabled the abuse.'
- In January 2022, Jay Taylor posted a message to Discord saying he was 'looking for friends, preferably LGBTQ for crochet buddies.'
- Within an hour or so, others in the group chat began telling Jay he should kill himself, according to Jay's parents.
The players
Colby Taylor
Jay Taylor's father.
Leslie Taylor
Jay Taylor's mother.
Jay Taylor
A 13-year-old Seattle-area boy who died by suicide in January 2022 after allegedly being pushed to do so by a member of the online extremist network '764'.
Discord
A social media platform that the lawsuit claims 'supplied 764 with unlimited victims' and 'failed to take reasonable steps to prevent or disrupt such exploitation'.
White Tiger
A Discord user who allegedly led the effort to direct others to pressure and manipulate Jay Taylor.
What’s next
The judge in the case will decide whether to allow the lawsuit to proceed.
The takeaway
This case highlights the urgent need for social media platforms to prioritize user safety, especially for vulnerable youth, and implement robust measures to prevent online predators from exploiting their services. It also underscores the devastating human toll of such failures.