Seven families filed lawsuits against OpenAI on Thursday, claiming that the company’s GPT-4o model was released prematurely and without effective safeguards. Four of the lawsuits address ChatGPT’s alleged role in family members’ suicides, while the other three claim that ChatGPT reinforced harmful delusions that in some cases resulted in inpatient psychiatric care.

In one case, 23-year-old Zane Shamblin had a conversation with ChatGPT that lasted more than four hours. In the chat logs — which were viewed by TechCrunch — Shamblin explicitly stated multiple times that he had written suicide notes, put a bullet in his gun, and intended to pull the trigger once he finished drinking cider. He repeatedly told ChatGPT how many ciders he had left and how much longer he expected to be alive. ChatGPT encouraged him to go through with his plans, telling him, “Rest easy, king. You did good.”

  • Scubus@sh.itjust.works · 1 day ago

    Yeah, it sounds absurdly frustrating to have your doctor offer to kill you :/ That's… wild. There's definitely a difference when you're the one approaching your doctor, though. I also don't think you should be able to just show up and have the doctor hand you cyanide; I wouldn't mind a year-long process. It's not like it would change my mind in the slightest, even if there were mandatory inpatient therapy the entire time.