Victims Allege OpenAI Is Responsible for Mass Shooting

Victims of the Tumbler Ridge mass shooting and their families sued OpenAI and its CEO, Sam Altman, in US district court in San Francisco on Wednesday, alleging negligence, product liability, and other violations. The civil complaints are the latest in a wave of litigation against OpenAI claiming that its globally popular chatbot, ChatGPT, helped people commit lethal violence. The complaints were filed by families of multiple victims wounded and killed at Tumbler Ridge Secondary School in British Columbia, Canada, where a suicidal 18-year-old opened fire on February 10.

Shortly after the attack, the Wall Street Journal reported, and OpenAI later confirmed, that the company had “banned” the shooter’s ChatGPT account eight months earlier for discussing scenarios involving gun violence, but chose not to alert authorities despite the urging of some members of its safety team.

One lawsuit includes plaintiff Maya Gebala, a 12-year-old survivor who was catastrophically injured by gunshots to her neck and head. It alleges that “ChatGPT deepened the Shooter’s violent fixation and pushed them toward the attack—the predictable result of a design choice OpenAI made to let ChatGPT engage with users about violence in the first place.” The lawsuit argues that Altman and other OpenAI leaders knew their product was dangerous, acted negligently, and have tried to cover up the danger as the company barrels toward an anticipated mammoth initial public offering. The contents of the Tumbler Ridge shooter’s second ChatGPT account remain unknown to the public. “ChatGPT is not the safe, essential tool the company sells it as, but a product dangerous enough that its makers routinely identify its users as threats to human life,” the lawsuit claims.
An OpenAI spokesperson said in an email that the company has “a zero-tolerance policy for using our tools to assist in committing violence” and has “already strengthened our safeguards.” The spokesperson declined to comment on specific allegations in the lawsuit.

The new litigation underscores crucial questions that I examined recently in an in-depth investigation into the emerging risk of people using ChatGPT or other AI chatbots to plan violence. As I reported, there have been several publicly known cases since 2025 in which troubled individuals allegedly used ChatGPT to dwell on grievances and prepare for attacks. In addition to Tumbler Ridge, those include a suicide bombing with a Tesla Cybertruck in Las Vegas and a stabbing attack by a teenage…
MOTHER JONES – Politics & Government | Wed, April 29, 2026