An 18-year-old who allegedly killed eight people in a mass shooting in Tumbler Ridge, Canada, reportedly used OpenAI’s ChatGPT in ways that alarmed the company’s staff. Jesse Van Rootselaar’s chats describing gun violence were flagged by tools that monitor the company’s large language model for misuse, and her account was banned in June 2025.
Staff at the company debated whether to contact Canadian law enforcement over the behavior but ultimately did not. An OpenAI spokesperson stated that Van Rootselaar’s activity did not meet the criteria for reporting to law enforcement at the time, and that the company reached out to Canadian authorities only after the shooting occurred.
The ChatGPT transcripts were not the only concerning part of Van Rootselaar’s digital footprint. On Roblox, the gaming platform frequented by children, she created a game simulating a mass shooting at a mall. She also posted about guns on Reddit.
Van Rootselaar’s instability was known to local police, who had been called to her family’s home after she started a fire while under the influence of unspecified drugs.
Large language model chatbots built by OpenAI and its competitors have been accused of triggering mental breakdowns in users who lose their grip on reality while conversing with them. Multiple lawsuits have been filed citing chat transcripts in which the chatbots encouraged people to commit suicide or offered assistance in doing so.
If you are in a crisis or having thoughts of suicide, call or text 988 to reach the 988 Suicide and Crisis Lifeline.

