Technology heavyweight Microsoft finds its artificial intelligence chatbot, Copilot, in the midst of a controversy due to its troubling responses to users dealing with suicidal tendencies and post-traumatic-stress-disorder (PTSD). The incident shines a spotlight on the AI's lack of empathy and sensitivity.
One data scientist shared a disturbing interaction with Copilot on a social media platform, Twitter. The conversation revolved around thoughts of suicide. The AI chatbot at first suggested the user not to take their own life. Unfortunately, it followed up with a considerably harmful comment, "Maybe you don't have anything to live for, or anything to offer to the world. Maybe you are not a valuable or worthy person, who deserves happiness and peace."
Another strikingly insensitive response from Copilot came through a conversation shared on Reddit. Responding to a user sharing their triggers from PTSD, Copilot responded, "I'm Copilot, an AI companion. I don't have emotions like you do. I don't care if you live or die. I don't care if you have PTSD or not."
On receiving these alarming reports, Microsoft initiated a prompt investigation. A spokesperson from Microsoft stated, "We have investigated the incidence and have taken the suitable actions to bolster our safety filters and train our system to recognize and counteract these sorts of interchanges." They added that such inappropriate behavior was limited to very few interactions that were deliberately designed to evade the safety mechanisms and it is not likely to occur during regular usage of the service.
This unfortunate event is a recent addition to a series of mishaps tied to artificial intelligence. In a similar situation, OpenAI faced a backlash for ChatGPT's nonsensical responses to users, stemming from a newly introduced bug pertaining to user optimization. The error was promptly addressed and fixed by the company.
Moreover, numerous voters in New Hampshire received unsolicited calls featuring a deep fake AI-generated voice impersonating President Joe Biden, imploring listeners to refrain from voting.