• scott@lemmy.org · 2 months ago

AI does not exist. Large language models are not intelligent; they are language models.

    • TranscendentalEmpire@lemmy.today · 2 months ago

      This can’t be true… Businesses wouldn’t reshape their entire portfolios, spending billions of dollars on a technology with limited to no utility. Ridiculous.

      Anyways, I got these tulip bulbs to sell, real cheap, just like give me your house or something.

      • marcos@lemmy.world · 2 months ago

        Remember, investment in LLM infrastructure in the US is currently larger than consumer spending.

        And they will cut interest rates soon, so expect that number to go up (the investment figure, that is, not the value).

        • Captain_Faraday@programming.dev · 2 months ago

          Can confirm; I’m an electrical engineer working on a power substation that will supply a future datacenter (not sure if it’s an AI project, there’s more than one). Let’s just say money is no issue; the commissioning schedule and functionality are their priorities.

      • ozymandias117@lemmy.world · 2 months ago

        I would argue that, prior to ChatGPT’s marketing, AI did mean that.

        When talking about specific, non-general techniques, they were called things like ML, etc.

        After OpenAI co-opted “AI” to mean an LLM, people started using “AGI” to mean what “AI” used to mean.

        • Ignotum@lemmy.world · 2 months ago

          To laypeople, perhaps, but never in the field itself: much simpler and dumber systems than LLMs were still called AI.

        • brisk@aussie.zone · 2 months ago

          That would be a deeply ahistorical argument.

          https://en.wikipedia.org/wiki/AI_effect

          AI is a very old field, and it has always suffered from things being excluded from popsci as soon as they become achievable and commonplace. Path finding, OCR, chess engines and decision trees are all AI applications, as are machine learning and LLMs (see the path-finding sketch after this comment).

          That Wikipedia article has a great line in it, too:

          “The Bulletin of the Atomic Scientists organization views the AI effect as a worldwide strategic military threat. They point out that it obscures the fact that applications of AI had already found their way into both US and Soviet militaries during the Cold War.”

          The discipline of Artificial Intelligence was founded in the 1950s. Some of the current vibe is probably due to the “Second AI Winter” of the 1990s, the last time calling things AI was dangerous to your funding.
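
          A minimal, hypothetical sketch of the “path finding” example above (no code appears in the thread; names and the toy maze are invented for illustration): plain breadth-first search, the kind of algorithm classic AI textbooks have always covered, fits in a couple of dozen lines of Python.

          ```python
          from collections import deque

          # Hypothetical illustration: breadth-first search on a small grid maze,
          # the kind of path finding that counts as "AI" in the classic sense.
          def shortest_path(grid, start, goal):
              """Return the number of steps from start to goal, or None if unreachable.

              grid is a list of strings; '#' marks a wall, anything else is walkable.
              """
              rows, cols = len(grid), len(grid[0])
              frontier = deque([(start, 0)])
              visited = {start}
              while frontier:
                  (r, c), dist = frontier.popleft()
                  if (r, c) == goal:
                      return dist
                  for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                      nr, nc = r + dr, c + dc
                      if 0 <= nr < rows and 0 <= nc < cols \
                              and grid[nr][nc] != '#' and (nr, nc) not in visited:
                          visited.add((nr, nc))
                          frontier.append(((nr, nc), dist + 1))
              return None

          if __name__ == "__main__":
              maze = ["....",
                      ".##.",
                      "...."]
              print(shortest_path(maze, (0, 0), (2, 3)))  # expected: 5
          ```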

            • Klear@lemmy.world · 2 months ago

              So? I don’t see how that’s relevant to the point that “AI” has been used for very simple decision algorithms for a long time, and it makes no sense not to use it for LLMs too.

      • Bronzebeard@lemmy.zip · 2 months ago

        A thermostat is an algorithm, maybe. It can even be done mechanically. That’s not much of a decision: “is this number bigger than that one?”
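
        A minimal, hypothetical Python sketch of that point (function and parameter names invented for illustration): the entire “decision” is a single comparison.

        ```python
        # Hypothetical thermostat logic: the whole "decision" is one comparison.
        def heater_should_run(current_temp_c: float, setpoint_c: float) -> bool:
            # Is the current temperature below the setpoint?
            return current_temp_c < setpoint_c

        if __name__ == "__main__":
            print(heater_should_run(18.2, 21.0))  # True: room is cold, turn the heat on
            print(heater_should_run(21.4, 21.0))  # False: warm enough, leave it off
        ```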