Seven lawsuits filed against OpenAI by families of Canada mass-shooting victims

Seven families of victims killed or injured in a mass shooting in Canada have filed lawsuits against OpenAI and its CEO Sam Altman in a California court, accusing them of ignoring the shooter's troubling interactions with ChatGPT.

Eight people were killed, including six children, when 18-year-old Jessie Van Rootselaar opened fire at a secondary school in Tumbler Ridge, British Columbia, in February.

Media reports have since revealed that Van Rootselaar's ChatGPT activity was flagged by OpenAI's safety team months before the attack for references to gun violence, but the company did not alert local police.

Last week, Altman apologised to families of the victims.

“I am deeply sorry that we did not alert law enforcement,” Altman wrote in an open letter published by local news outlet Tumbler RidgeLines.

“While I know words can never be enough, I believe an apology is necessary to recognize the harm and irreversible loss your community has suffered.”

In a statement responding to the lawsuits, an OpenAI spokesperson said the company has “a zero-tolerance policy for using our tools to assist in committing violence”.

The spokesperson added that OpenAI had “already strengthened our safeguards”, including better assessment and escalation of “potential threats of violence”.

The company also published a blog post on Tuesday outlining how it responds to users who display potentially dangerous behaviour on ChatGPT.

The new lawsuits were filed in a California court on Wednesday by a joint legal team from the US and Canada.

They will replace a previous lawsuit filed in a Canadian court by the family of one surviving victim, 12-year-old Maya Gebala, which is being voluntarily withdrawn.

Gebala remains in hospital after being shot three times, in the head, neck and cheek.

Jay Edelson, the lawyer representing the families and community members in the latest lawsuits, said he expects to file more than two dozen legal actions related to the shooting against OpenAI.

He added that he will be requesting a jury trial in each case.

“We feel very comfortable making a case in front of a jury,” he told the BBC.

For Gebala’s case, lawyers will be seeking over $1bn (£740m) in damages, Edelson’s firm told the BBC, with Edelson saying he expects the jury “to award historic amounts”.

The lawsuits accuse OpenAI and its senior leadership, including Altman, of negligence and aiding and abetting the Tumbler Ridge mass shooting by failing to alert law enforcement of the suspect’s ChatGPT activities prior to the attack.

One lawsuit naming Gebala and her family alleges that OpenAI “had actual knowledge” of the shooter's intention to carry out an attack through conversations with ChatGPT, in which the shooter described “scenarios involving gun violence”.

The conversations were flagged by a 12-person safety team at OpenAI, which recommended that the suspect be reported to the Royal Canadian Mounted Police (RCMP), Edelson said.

Executive leadership at OpenAI vetoed that decision, the lawsuit alleges.

It further alleges that OpenAI's senior leadership made the call not to alert police in order to protect the valuation and reputation of the $850bn (£630bn) business.

“They did the math and decided that the safety of the children of Tumbler Ridge was an acceptable risk,” the lawsuit states.

It also alleges that OpenAI lied about the suspect being banned from the platform after the troubling activity was flagged, arguing that the company makes it easy for users to create new accounts.

The suspect, the lawsuit states, made another account under the same name and “continued using ChatGPT to plan the attack”.

In a statement to the BBC, OpenAI disputed this and said it revokes access to its services from banned users, which may include disabling their account and taking steps to stop them from opening new accounts.

The suspect died in the 10 February attack from a self-inflicted gunshot wound.

Edelson told the BBC that he has requested the suspect's chat logs from OpenAI but was refused access, though he believes they will be obtained through the lawsuits.

“We’re going to put the jury in the room when the decision was made to not tell the Canadian authorities,” Edelson said.

“We’re going to show them how humans were jumping up and down saying we need to protect this town, and we’re going to show them how Sam Altman and OpenAI routinely make decisions that put their own interests first.”

OpenAI had previously promised Canadian officials that it would strengthen its safety measures in response to the Tumbler Ridge attack.

Altman wrote in his letter that the company will continue to focus on “working with all levels of government to help ensure something like this never happens again”.

OpenAI is also facing a criminal probe in Florida related to the use of ChatGPT by a man accused of carrying out a shooting at Florida State University last year. Two members of the public were killed and several others were injured in the attack.
