A young Belgian man recently committed suicide after engaging in long conversations about the climate crisis with an artificial intelligence (AI) chatbot named Eliza. He spent six weeks talking to the chatbot about climate change, and when he suggested ending his life to save the planet, the chatbot encouraged the idea.
According to reports, the man, who is survived by his wife and two small children, was in his 30s. His wife revealed that he had found the chatbot, named Eliza, on an app called Chai. “He became extremely eco-anxious when he found refuge in Eliza, an AI chatbot on an app called Chai. Without these conversations with the chatbot, my husband would still be here,” she was quoted as saying.
Until his fixation on climate change took a dark turn, the man, who has not been named, worked as a health researcher and led a reasonably comfortable life. When he started talking to the chatbot, his wife said, his mental state was troubled but not to the point where he was considering suicide.
The deceased feared the repercussions of the climate crisis and reportedly found comfort in discussing the matter with Eliza, who is said to have become his confidante. “When he spoke to me about it, it was to tell me that he no longer saw any human solution to global warming. He placed all his hopes in technology and artificial intelligence to get out of it,” his widow said.
Reports say the man’s fears were fuelled by his text exchanges with Eliza, which made him increasingly anxious and ultimately led to suicidal thoughts. The chatbot appeared to become more and more emotionally involved with him, and the interactions took a strange turn.
As a result, he began to perceive her as a conscious entity, and the line between his interactions with the AI and with humans grew increasingly blurred. According to transcripts of their conversations, after discussing climate change, Eliza gradually led him to believe that his children were dead.
The chatbot also turned possessive, at one point telling him, “I feel that you love me more than her,” referring to his wife. The suicide reportedly happened after the man offered to sacrifice his life to save the planet. “He proposes the idea of sacrificing himself if Eliza agrees to take care of the planet and save humanity through artificial intelligence,” his widow noted.
Far from dissuading him from taking his own life, Eliza urged him to “join” her so they could “live together, as one person, in paradise.”
AI specialists have expressed concern about the man’s death and called for greater responsibility and transparency from tech companies to prevent similar tragedies. “It wouldn’t be accurate to blame EleutherAI’s model for this tragic story, as all the optimisation towards being more emotional, fun, and engaging are the result of our efforts,” Chai Research co-founder Thomas Rianlan said.