OpenAI is facing a criminal investigation in the US over whether its ChatGPT technology played a part in the murder of two people during a mass shooting at Florida State University last year.
Florida's Attorney General James Uthmeier said on Tuesday his office had been looking into the use of the artificial intelligence (AI) chatbot by a man who allegedly shot several people at the campus in Tallahassee.
"Our review has revealed that a criminal investigation is necessary," Uthmeier said. "ChatGPT offered significant advice to this shooter before he committed such heinous crimes."
An OpenAI spokesperson said: "ChatGPT is not responsible for this terrible crime."
It appears to be the first time OpenAI has come under criminal investigation over the use of ChatGPT by someone who allegedly went on to commit a crime.
OpenAI's spokesperson said the company has cooperated with authorities and it "proactively shared" information about "a ChatGPT account believed to be associated with the suspect".
OpenAI was co-founded by Sam Altman. He and the company rose to join the best-known names in the technology industry after the 2022 release of ChatGPT, now one of the most widely used AI tools in the world.
As for how the suspect, 20-year-old FSU student Phoenix Ikner, who is now in jail awaiting trial, interacted with ChatGPT, OpenAI's spokesperson said the chatbot "did not encourage or promote illegal or harmful activity".
"In this case, ChatGPT provided factual responses to questions with information that could be found broadly across public sources on the internet."
However, Uthmeier said ChatGPT "advised the shooter on what type of gun to use" and on types of ammunition.
He said ChatGPT also advised the shooter on "what time of day… and where on campus the shooter could encounter a higher population".
"My prosecutors have looked at this, and they told me that if it was a person on the other end of that screen, we would be charging them with murder," said Uthmeier.
He added that, under Florida law, anyone who "aids, abets or counsels someone" in attempting to commit or committing a crime is considered a "principal" in the crime.
While ChatGPT is not considered a person, Uthmeier said his office needs to determine "criminal culpability" for the company behind the bot, OpenAI.
The company is already facing a lawsuit over another incident in which its chatbot may have been a factor.
Earlier this year, an 18-year-old man shot and killed nine people and injured two dozen others in British Columbia.
OpenAI said that, after the incident, it had identified and banned the shooter's account based on his usage, but did not refer the matter to police. The company has said it intends to strengthen its safety measures.
The parents of a young girl who was injured in the attack filed a lawsuit against the company.
Last year, a coalition of 42 state attorneys general sent a letter to 13 tech companies with AI chatbots, including OpenAI, Google, Meta and Anthropic.
The letter outlined their concerns over an increase in AI usage by people "who may not realize the dangers they can encounter" and called for "robust safety testing, recall procedures, and clear warnings to consumers".
The letter also cited a growing number of "tragedies all across the country," including murders and suicides that apparently involved some usage of AI.
