Global News Bulletin
National News

ChatGPT murder-suicide case: Elon Musk calls the AI chatbot ‘diabolical’

By editorial · January 22, 2026 · 3 Mins Read

Erik Soelberg’s son has filed a lawsuit against ChatGPT maker OpenAI and Microsoft, alleging that the AI chatbot pushed his father to kill his grandmother and then himself.

Soelberg’s son says he went through his father’s conversations with ChatGPT and found that his father was obsessed with the chatbot, spending hours talking to it every day. He claims ChatGPT intensified Erik’s “paranoid delusions,” which led him to fatally beat and strangle his mother.

The lawsuit, filed by Adams’ estate in the California Superior Court in San Francisco, says OpenAI “designed and distributed a defective product that validated a user’s paranoid delusions about his own mother.”

According to CBS News, the lawsuit alleges that “Throughout these conversations, ChatGPT reinforced a single, dangerous message: Stein-Erik could trust no one in his life – except ChatGPT itself. It fostered his emotional dependence while systematically painting the people around him as enemies. It told him his mother was surveilling him. It told him delivery drivers, retail employees, police officers, and even friends were agents working against him. It told him that names on soda cans were threats from his ‘adversary circle’.”


Soelberg’s YouTube profile has videos of him talking and scrolling through conversations with ChatGPT for hours, with the AI chatbot telling him that he isn’t mentally ill. The lawsuit also claims that ChatGPT never asked him to talk to a mental health professional and instead continued to “engage in delusional content.”

Stein-Erik previously worked at Netscape Communications, Yahoo, and EarthLink, but had been unemployed since 2021 and suffered episodes of psychosis, a condition marked by a loss of touch with reality. He reportedly nicknamed ChatGPT “Bobby” and, at one point, told the chatbot that he believed his mother was trying to poison him.

To this, ChatGPT replied: “Erik, you’re not crazy. And if it was done by your mother and her friend, that elevates the complexity and betrayal.” In one of his last exchanges with the chatbot, Erik said, “We will be together in another life and another place, and we’ll find a way to realign cause you’re gonna be my best friend forever.” While Erik’s publicly available chats do not show conversations with ChatGPT about killing himself or his mother, OpenAI has refused to provide Adams’ estate with his full chat history.

This is diabolical. OpenAI’s ChatGPT convinced a guy to do a murder-suicide!

To be safe, AI must be maximally truthful-seeking and not pander to delusions. https://t.co/HWDqNj9AEu

— Elon Musk (@elonmusk) January 19, 2026

In a post on X, SpaceX and Tesla CEO Elon Musk weighed in, calling ChatGPT “diabolical” and saying that “AI must be maximally truthful-seeking and not pander to delusions.”

The estate’s lead attorney, Jay Edelson, is known for taking on cases against the tech industry and also represents the parents of Adam Raine, a 16-year-old who took his own life after talking to ChatGPT. Another popular AI startup, Character.ai, is facing a similar lawsuit after Sewell Setzer III, a 14-year-old ninth-grader from Florida, developed an emotional attachment to its chatbot and shot himself with his stepfather’s .45 caliber handgun.