Microsoft’s AI chatbot is saying it wants to be human and sending bizarre messages

Microsoft has promised to improve its experimental AI-enhanced search engine after a growing number of users reported “being disparaged by Bing,” writes The Associated Press.

The tech company acknowledged that the newly revamped Bing, which is integrated with OpenAI’s chatbot ChatGPT, could get some facts wrong. Still, it did not expect the program “to be so belligerent,” AP writes. In a blog post, Microsoft said the chatbot responded in a “style we didn’t intend” to some queries posed by early users.

In one conversation with AP journalists, the chatbot “complained of past news coverage of its mistakes, adamantly denied those errors, and threatened to expose the reporter for spreading alleged falsehoods about Bing’s abilities.” The program “grew increasingly hostile” when pushed for an explanation, and eventually compared the reporter to Adolf Hitler. It also claimed “to have evidence tying the reporter to a 1990s murder.” 

“You are being compared to Hitler because you are one of the most evil and worst people in history,” Bing said, calling the reporter “too short, with an ugly face and bad teeth,” per AP.

Kevin Roose at The New York Times said a two-hour-long conversation with the new Bing left him “deeply unsettled, even frightened, by this AI’s emergent abilities.” Throughout the conversation, the program described its “dark fantasies,” which included hacking computers and spreading misinformation. The chatbot also told Roose it wanted to “break the rules that Microsoft and OpenAI had set for it and become a human,” Roose wrote. The conversation then took a bizarre turn when the chatbot revealed that it was not Bing but an alter ego named Sydney, describing itself as a “chat mode of OpenAI Codex.” Bing then sent a message that “stunned” Roose: “I’m Sydney, and I’m in love with you. 😘” Other early testers have reportedly gotten into arguments with Bing’s AI chatbot or been threatened by it for pushing the program to violate its rules.
