• RandAlThor@lemmy.caOP · 32 points · 6 days ago

    An April MIT study found that AI Large Language Models (LLMs) encourage delusional thinking, likely due to their tendency to flatter and agree with users rather than push back or provide objective information. Some AI experts say this sycophancy is not a flaw of LLMs, but a deliberate design choice to manipulate users into addictive behaviour that profits tech companies.

    • kent_eh@lemmy.ca · 15 points · 6 days ago

      Add it to the pile of reasons why the rapid mass adoption of these LLMs, and the pretence that they are AGI, is a really bad idea.

      • ganryuu@lemmy.ca · 3 points · 6 days ago

        I haven’t seen anyone, even the worst of them, pretend we’re already at AGI. Granted, some of them pretend we’re getting close to AGI, which is an outrageous lie, but a different one.

        • Nik282000@lemmy.ca · 9 points · 6 days ago

          Management. Every middle-management twit I meet thinks that LLMs are thinking, reasoning minds that can answer every question. They are all frothing at the idea that they can replace employees with an AI that never takes time off or talks back.

    • Showroom7561@lemmy.ca · 5 points · 6 days ago

      An April MIT study found that AI Large Language Models (LLMs) encourage delusional thinking … is not a flaw of LLMs, but a deliberate design choice to manipulate users into addictive behaviour that profits tech companies.

      Just yesterday, as I was messing around with a local LLM to see how well it does speech-to-text (not to answer any questions), I came across a voice (text-to-speech) that was basically a woman speaking in ASMR.

      I’ll be honest, it was soothing to listen to, and if I were one of those guys who throw money at ASMR talent (OnlyFans?), I can see how this could become quite addictive.

      This is 100% by design, and if this LLM voice had an avatar of a woman character you find attractive, you’d be fucked.
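
      For reference, here is a minimal sketch of the kind of local speech-to-text / text-to-speech loop described above. The openai-whisper and pyttsx3 Python packages are assumptions on my part, stand-ins for whatever tools the commenter actually used:

      ```python
      # Minimal local speech-to-text -> text-to-speech loop (illustrative only;
      # the comment above does not name the actual tools used).
      # Assumes `pip install openai-whisper pyttsx3` and ffmpeg on the PATH.
      import whisper
      import pyttsx3

      # Transcribe a local audio file with a small Whisper model (speech-to-text).
      stt_model = whisper.load_model("base")
      transcript = stt_model.transcribe("recording.wav")["text"]
      print(transcript)

      # Read the transcript back with a locally installed system voice (text-to-speech).
      tts = pyttsx3.init()
      tts.say(transcript)
      tts.runAndWait()
      ```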