


Warning: This article contains discussion of gun violence, which some readers may find distressing.
The Tumbler Ridge school shooting shocked a small and tight-knit community in this remote part of British Columbia, with 18-year-old Jesse Van Rootselaar named as the shooter who took the lives of six people at the school before turning the gun on herself.
There was further tragedy: Jesse Van Rootselaar is believed to have also taken the lives of her mother and 11-year-old half-brother, whose bodies were found at a nearby residence. The attack is now remembered as the deadliest mass shooting in Canada since the 2020 Nova Scotia attacks, and the deadliest school shooting in the country since the 1989 École Polytechnique massacre. Questions remain about what led to the harrowing incident and whether AI could have played a part in preventing it.

An investigation into the Tumbler Ridge shooting unearthed a YouTube account linked to Van Rootselaar, while there were also questions about an apparent ChatGPT account that had been flagged with concerning messages and subsequently banned eight months before the shooting.
The Guardian reveals that the families of seven victims are suing OpenAI and CEO Sam Altman amid allegations of negligence for failing to alert authorities when Van Rootselaar's ChatGPT logs included numerous mentions of gun violence.
As per the Wall Street Journal, people familiar with the matter were alarmed by her chats when they were flagged by an automated review system. The outlet reports that around a dozen staffers discussed what to do about Van Rootselaar's chats, with some employees saying they indicated a potential for real-world violence. Although leaders were apparently urged to contact Canadian law enforcement, this never happened.
An OpenAI spokesperson reiterated that while the activity was enough to warrant banning Van Rootselaar's account, it didn't meet the threshold for contacting law enforcement because it ultimately wasn't considered a credible and imminent risk of serious physical harm to others.
In the aftermath of the February 10 shooting, OpenAI reached out to the Royal Canadian Mounted Police and said: "Our thoughts are with everyone affected by the Tumbler Ridge tragedy."
As Van Rootselaar's digital footprint emerged, investigators found a simulated mass shooting game she'd created on Roblox, as well as social media posts detailing her struggles with transitioning from male to female, her love of anime, and mentions of illicit drugs.
OpenAI maintains that models are trained to discourage users from considering real-world harm, with conversations like this being routed to human reviewers who have the ability to contact law enforcement.
The Guardian adds that the family lawsuit accuses OpenAI and Altman of negligence, aiding and abetting a mass shooting, wrongful death, and product liability. Lawyers say this is just the first wave of suits against OpenAI over Tumbler Ridge, with around two dozen more reportedly on the way.
Calling OpenAI to task, lead lawyer Jay Edelson said: "The fact that Sam and the leadership overruled the safety team, and then children died, adults died, the whole town was ruined, is pretty close to the definition of evil to me."
The lawsuit alleges that Van Rootselaar was able to set up a second ChatGPT account despite having the first one banned, while Edelson claims OpenAI has refused to share logs between the shooter and the chatbot.
This comes after Altman himself sent a letter to the Tumbler Ridge community and apologized for not alerting Canadian law enforcement. Here, the OpenAI boss mused: "While I know words can never be enough, I believe an apology is necessary to recognize the harm and irreversible loss your community has suffered.
“I reaffirm the commitment I made to the mayor and the premier to find ways to prevent tragedies like this in the future."
Altman's words were shared by David Eby, the British Columbia premier, who added that "the apology is necessary, and yet grossly insufficient for the devastation done to the families of Tumbler Ridge."
As both Altman and OpenAI continue to face harsh questioning, the latter issued the following statement to The Guardian: "The events in Tumbler Ridge are a tragedy. We have a zero-tolerance policy for using our tools to assist in committing violence. As we shared with Canadian officials, we have already strengthened our safeguards, including improving how ChatGPT responds to signs of distress, connecting people with local support and mental health resources, strengthening how we assess and escalate potential threats of violence, and improving detection of repeat policy violators."
Still, the Van Rootselaar case feeds into a larger debate about the responsibilities that companies like OpenAI have.
If you or someone you know has been affected by gun violence, please find more information and support via Survivors Empowered on their website.