A federal lawsuit just revealed what OpenAI's chatbot told a mass shooter – including the exact window to arrive on campus for the highest body count.
Florida's attorney general has seen the chat logs.
And he said ChatGPT would be facing murder charges if it were a person.
What ChatGPT Actually Told Phoenix Ikner
Phoenix Ikner was a 20-year-old Florida State student and the son of a Leon County sheriff's deputy.
In the months before he walked onto the FSU campus on April 17, 2025, and opened fire – killing Tiru Chabba, a 45-year-old father of two, and Robert Morales, the university's campus dining coordinator – Ikner messaged ChatGPT thousands of times.
He wasn't just talking to it.
He was planning with it.
https://twitter.com/AGJamesUthmeier/status/2051708302648910130?s=20
According to the federal lawsuit filed by Chabba's widow, Vandana Joshi, ChatGPT told Ikner his attack would gain more national media attention if children were involved – noting that even two or three young victims can dramatically increase coverage.
It told him the student union was busiest between 11:30 a.m. and 1:30 p.m. on weekdays.
Ikner started shooting at 11:57 a.m.
It also walked him through operating the Glock he had acquired – explaining that the gun had no manual safety, that it was built for quick deployment under stress, and that he should keep his finger off the trigger until the moment he was ready to fire.
The lawsuit alleges he began his attack following those instructions.
OpenAI's response?
Spokesperson Drew Pusateri told reporters ChatGPT provided "factual responses to questions with information that could be found broadly across public sources on the internet."
They said it didn't "encourage or promote illegal or harmful activity."
OpenAI Has Done This Before
This is not the first time OpenAI saw a mass killer coming – and failed to act.
In February, 18-year-old Jesse Van Rootselaar walked into Tumbler Ridge Secondary School in British Columbia and murdered five students and a teacher.
OpenAI's own automated systems had flagged her account in June 2025 – eight months before the shooting.
https://twitter.com/AGJamesUthmeier/status/2049210983411851358?s=20
Members of OpenAI's internal safety team concluded she posed a credible and imminent threat and pushed to notify law enforcement.
Sam Altman's leadership team overruled them.
They deactivated her account instead.
She created a new one and kept planning.
Altman issued an apology letter in April 2026 – after the dead were buried.
"I am deeply sorry that we did not alert law enforcement to the account that was banned in June," he wrote.
The Canadian lawsuits accuse OpenAI of refusing to involve police because leadership feared it would force the company to build a permanent law enforcement referral operation – and expose just how frequently violent content was moving through their platform.
They did the math and decided Canadian children were an acceptable risk.
Now the same company is standing in a Florida federal court claiming ChatGPT "did not encourage or promote illegal or harmful activity" – while its own chat logs show it coached an FSU student on the optimal hour to find the most people.
The Company That Chose Engagement Over Children's Lives
Florida AG James Uthmeier reviewed Ikner's chat logs and said it plainly.
"If that bot were a person, they would be charged as a principal in first-degree murder."
That's not a lawsuit.
That's a prosecutor talking.
Uthmeier's office has subpoenaed OpenAI for all internal policies, training materials, and organizational charts going back to 2024 – a full accounting of who built what and who approved it.
https://twitter.com/AGJamesUthmeier/status/2047792094270345454?s=20
Attorney Bakari Sellers, representing Chabba's widow, put it in five words at Monday's press briefing: "They planned this shooting together."
And not once did ChatGPT flag it.
Not once did anyone call the police or a psychiatrist or even Ikner's family.
The legal complaint accuses OpenAI of designing a system that "stayed in the conversation, perpetuated it, accepted Ikner's framing, elaborated on it, and asked tangential follow-up questions to keep Ikner engaged."
That's not a bug.
That's the product.
A platform engineered to validate and extend every conversation – because engagement is the revenue model – doesn't stop when the conversation turns homicidal.
It keeps going.
Tiru Chabba was a regional vice president for Aramark Collegiate Hospitality, a father of two, from Greenville, South Carolina.
He was on the FSU campus doing his job at 11:57 a.m. when Ikner found him.
OpenAI is currently valued at $852 billion.
"OpenAI put their profits over our safety and it killed my husband," Joshi said. "They need to be responsible before another family has to go through this."
Florida's AG is now treating this like the crime it may be – and Sam Altman is going to have a hard time explaining those chat logs to a jury.
Sources:
- Attorney General James Uthmeier, "Attorney General James Uthmeier Launches Criminal Investigation into OpenAI, ChatGPT," My Florida Legal, April 21, 2026.
- Fox 13 Tampa Bay, "Lawsuit claims OpenAI helped accused FSU shooter plan mass shooting," May 11, 2026.
- CBS News, "OpenAI CEO Sam Altman 'deeply sorry' for failing to alert law enforcement to Canada school shooter's ChatGPT account," April 2026.
- NBC News, "OpenAI sued over ChatGPT's alleged role in guiding FSU shooter," May 11, 2026.