OpenAI CEO Apologizes Over Failure to Alert Police to Mass Shooter's AI Account

Sam Altman says company is 'deeply sorry' it did not report troubling ChatGPT usage linked to B.C. tragedy that killed eight people.

(National Post / File)

OpenAI's chief executive has issued a public apology to the British Columbia community devastated by a mass shooting, acknowledging the company should have done more to alert authorities about a killer's suspicious account activity on its AI platform.

Sam Altman expressed regret that OpenAI did not notify police about an account associated with the perpetrator of the February shooting in Tumbler Ridge, a small mining town in northeastern B.C. The attack claimed eight lives.

What Happened With the Account

OpenAI banned the account in June 2025—eight months before the tragedy—after detecting concerning patterns linked to violent content. However, company officials determined at the time that the activity did not indicate an imminent threat warranting a report to law enforcement.

The decision to remain silent has sparked intense scrutiny over corporate responsibility and whether tech companies should have obligations to flag dangerous users to authorities.

A Difficult Balance

In a statement to Tumbler Ridge residents, Altman said he was "deeply sorry" the organization had not taken additional steps. The apology highlights a growing tension in the tech industry: balancing user privacy protections against public safety concerns.

OpenAI's position at the time was that the account activity, while troubling, contained no specific warning of an imminent attack. The company followed its standard protocol of banning accounts associated with violent content but did not escalate the matter to law enforcement.

Questions About AI Safety Moving Forward

The incident has reignited conversations across Canada about whether artificial intelligence platforms should implement stricter reporting mechanisms and whether tech companies bear responsibility for identifying and reporting potentially dangerous users before tragedy strikes.

Tumbler Ridge, home to approximately 2,500 people, continues to process the devastating loss of eight community members. Altman's apology, while acknowledged, has done little to ease calls for stronger safeguards from AI companies.

This report is based on information from the National Post.
