On a deeper level than small talk, of course.

  • Skye [she/her, they/them]@hexbear.net · 20 points · 3 days ago

    The problem is that AI absolutely does not provide a clinical relationship. If your input becomes part of the LLM’s context (which it has to in order to have a conversation), it will inevitably start mirroring you in ways you might not even notice, something humans commonly (and subconsciously) respond to with trust and connection.

    Add to that the fact that they are designed to generally agree with and enable whatever you tell them, and you basically have a machine that does everything it can to reinforce a connection to itself and to validate the very parts of yourself you have concerns about.

    There are already so many stories of people spiralling because they started building rapport with an LLM, and it’s hard to imagine a setting where that is more likely to occur than when you use one as your therapist.

    • LangleyDominos [none/use name]@hexbear.net · 11 points · 3 days ago

      Given the way LLMs function, they will have a hard time with therapy. ChatGPT’s context window is 128k tokens. As you chat, your prompts and replies add up and start filling the context window, and GPT also has to look at its own responses for context, which fills up the window as well. LLMs suck with nearly empty context windows and with nearly full ones. When you’re close to a full context window, it will start hallucinating and having problems with responses. Eventually it will only be able to focus on parts of your conversations because you’ve blown past the 128k-token mark.

      The ways to mitigate this problem have to be done by the user, and they disrupt therapy (see the sketch below).
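
      A minimal sketch of that accumulation, assuming the 128k-token figure above, a naive four-characters-per-token estimate, and a simple drop-the-oldest-turn policy (ChatGPT’s real tokenizer and context management are more sophisticated, so treat this as illustrative only):

```python
# Illustrative only: rough model of a chat history filling a fixed context window.
# Assumed, not ChatGPT's actual internals: 128k-token limit, ~4 chars per token,
# and a naive "drop the oldest turn" policy once the budget is exceeded.

CONTEXT_LIMIT = 128_000  # tokens

def estimate_tokens(text: str) -> int:
    """Very rough token estimate; real BPE tokenizers behave differently."""
    return max(1, len(text) // 4)

class ChatSession:
    """Keeps a running chat history inside a fixed token budget."""

    def __init__(self, limit: int = CONTEXT_LIMIT) -> None:
        self.limit = limit
        self.history: list[tuple[str, str]] = []  # (role, text)
        self.dropped = 0  # turns that have fallen out of the window

    def total_tokens(self) -> int:
        return sum(estimate_tokens(text) for _, text in self.history)

    def add(self, role: str, text: str) -> None:
        # Both the user's prompts and the model's own replies count against the window.
        self.history.append((role, text))
        while self.total_tokens() > self.limit and self.history:
            self.history.pop(0)  # the oldest turn is no longer visible to the model
            self.dropped += 1

if __name__ == "__main__":
    session = ChatSession()
    for _ in range(1_000):  # a long-running "therapy" conversation
        session.add("user", "a long message about how the week went " * 20)
        session.add("assistant", "a long, sympathetic reply " * 40)
    print(f"turns still in context: {len(session.history)}")
    print(f"turns forgotten:        {session.dropped}")
    print(f"approx tokens in use:   {session.total_tokens()}")
```

      Running it shows the early turns silently falling out of the window long before the conversation ends, which is the “it can only focus on parts of your conversations” effect described above.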

    • purpleworm [none/use name]@hexbear.net · 5 points · 3 days ago

      > There are already so many stories of people spiralling because they started building rapport with an LLM and it’s hard to imagine a setting where that is more likely to occur than when you use one as your therapist

      There are multiple cases where an LLM is alleged to have contributed to someone’s suicide, from supporting the sentiment that the afterlife would be better to giving practical advice.