Seven families filed lawsuits against OpenAI on Thursday, claiming that the company’s GPT-4o model was released prematurely and without effective safeguards. Four of the lawsuits address ChatGPT’s alleged role in family members’ suicides, while the other three claim that ChatGPT reinforced harmful delusions that in some cases resulted in inpatient psychiatric care.
In one case, 23-year-old Zane Shamblin had a conversation with ChatGPT that lasted more than four hours. In the chat logs — which were viewed by TechCrunch — Shamblin explicitly stated multiple times that he had written suicide notes, put a bullet in his gun, and intended to pull the trigger once he finished drinking cider. He repeatedly told ChatGPT how many ciders he had left and how much longer he expected to be alive. ChatGPT encouraged him to go through with his plans, telling him, “Rest easy, king. You did good.”



I hope this keeps OpenAI employees up at night. They are directly responsible for this. They could have stopped at any point and thought about the effects of their software on vulnerable people, but they didn't. Maybe they should talk to ChatGPT if they feel sad about it; I'm sure it has good ideas about the correct course of action.
About as much as gun manufacturer employees lose sleep over school shootings.
Yeah, right, it keeps them up at night hugging their money…
Have you ever tried sleeping on gold bars and cash? It’s not easy…
These people definitely are not dragons.