• FaceDeer@fedia.io · 7 months ago

    You’re providing a prime example of the misunderstanding. The term “AI” has been in professional use for a wide variety of algorithms, including machine learning and neural nets like LLMs, since the Dartmouth conference in 1956. It’s the people whose only picture of AI comes from Star Trek and similar sources who are misinformed.

    • xmunk@sh.itjust.works · 7 months ago

      As a programmer intimately familiar with LLMs and training evaluation… would you mind if I rephrased your comment as “How dare you use the common meaning of our obscure industry jargon, which is mostly just marketing bullshit anyway!”?

      The ship for “What does AI mean?” has fucking sailed. AI is an awful term that, in my experience, is vanishingly rarely used by developers outside of “robots that will kill us” and “marketing bullshit.” The term needs to die - it implies something much closer to “AGI in a mechasuit with miniguns” than to “my Python code can recognize fuzzy numbers!”

      • howrar@lemmy.ca · 7 months ago

        Do we have a better word for what has historically been known as AI? I see lots of complaints about X not being AI, but no proposals for what to call these things instead.

        • FaceDeer@fedia.io · 7 months ago

          I don’t know what you mean by “historical”, because the stuff we’ve got now is what is historically known as AI.

          If you mean the Star Trek stuff, though, then the specific terms for those are AGI (Artificial General Intelligence, an AI that’s capable of doing basically everything a human can) and ASI (Artificial Super Intelligence, an AI that’s capable of doing more than what a human can).

          We don’t have AGI yet, but there’s no reason to assume we can’t eventually figure it out. Brains are made of matter, so by fiddling with bits of matter we should eventually be able to make it do whatever a brain can. We have an example showing what’s possible; we just need to figure out how to make one of our own.

          ASI is a little more speculative since we don’t have any known examples of naturally-occurring superintelligence. But I also think it’s a bit unlikely that humans just happen to be the smartest things that can exist.

          • Nik282000@lemmy.ca · 7 months ago

            If you mean the Star Trek stuff, though, then the specific terms for those are AGI

            Even in Star Trek, only Data, Lore (and Peanut-hamper) were intelligent; all the computers ran on what is being called ‘AI’ now: massive DBs and search algorithms.

            • FaceDeer@fedia.io · 7 months ago

              The ship’s computer could whip up an AGI (Moriarty) in response to a simple command. The Federation later systematized this in the form of emergency holographic officers.

            • xmunk@sh.itjust.works · 7 months ago

              Search algorithms are, depending on the specifics, potentially “AI” now. If we’re tokenizing out vectors and running a straight match query (i.e. Postgres full-text search), that’s not AI - that’s just string matching. Some of the offerings get into NN-guided or LLM-powered search… these tend to suck, though, because they’re unpredictable and inconsistent. That may just be the novelty of the technology; we’ve had decades to work on small-word exclusion and language-specific dictionary mapping, so it’s possible the consistency will improve. At least when it comes to searching, everything really good already uses weird heuristics anyway, so it’s not like we can reason about why specific results are preferred - we just know they’re consistent.
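
              To make the “just string matching” half of that concrete, here’s a rough pure-Python sketch of what a straight lexical match boils down to - a toy illustration only, not Postgres’s actual tsvector/tsquery machinery, which also normalizes lexemes and ranks results:

```python
# Toy lexical search: tokenize, lowercase, and score by shared terms.
# Nothing here is learned - it's deterministic string matching all the way down.
import re


def tokenize(text: str) -> set[str]:
    """Split text into a set of lowercase word tokens."""
    return set(re.findall(r"[a-z0-9]+", text.lower()))


def lexical_search(query: str, documents: list[str]) -> list[str]:
    """Return documents sharing at least one token with the query,
    ordered by how many tokens overlap (a crude relevance score)."""
    query_tokens = tokenize(query)
    scored = [(len(query_tokens & tokenize(doc)), doc) for doc in documents]
    return [doc for score, doc in sorted(scored, reverse=True) if score > 0]


docs = [
    "My python code can recognize fuzzy numbers",
    "Robots that will kill us",
    "Marketing bullshit",
]
print(lexical_search("recognizing fuzzy numbers in python", docs))
# -> ['My python code can recognize fuzzy numbers']
```

              An NN-guided or LLM-powered search swaps that token overlap for similarity between learned embeddings, which is where the unpredictability and inconsistency come from.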

          • howrar@lemmy.ca · 7 months ago

            the stuff we’ve got now is what is historically known as AI.

            Yeah, and people are complaining that we shouldn’t call it AI anymore because the colloquial usage of the word has changed, so I want to know what alternatives exist.

              • howrar@lemmy.ca · 7 months ago

                Yes, you’ve provided the terms that I’m familiar with. That’s not what I’m asking for though. I’m asking for alternatives from people who don’t agree with this terminology.

                • FaceDeer@fedia.io · 7 months ago

                  If you refuse to use the terms that have already gained traction and that you’re familiar with, then make something up and try to get it popular enough to matter. As far as I’m aware, there are just AGI and ASI.

          • howrar@lemmy.ca · 7 months ago

            Those are all very narrow subtopics within AI. A replacement term for “AI” would have to be more general and include the things you’ve listed.

            • Nik282000@lemmy.ca · 7 months ago

              Nondeterministic Computing. There is no intelligence in what is now called ‘AI’.

              • FaceDeer@fedia.io · 7 months ago

                That’s even more “wrong,” though. Plenty of AI is deterministic, and plenty of nondeterministic computing isn’t AI.

              • howrar@lemmy.ca · 7 months ago

                Counterexample: There exists an optimal deterministic policy for any MDP.
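
                For anyone outside RL, this is the standard result being referenced (notation mine, not from the thread): for any finite MDP, acting greedily with respect to the optimal value function yields a deterministic policy that is already optimal,

                $$
                \pi^*(s) \in \arg\max_{a} \Big[ r(s,a) + \gamma \sum_{s'} P(s' \mid s, a)\, V^*(s') \Big],
                $$

                so something produced by an AI method is not automatically nondeterministic at run time.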

        • corsicanguppy@lemmy.ca · 7 months ago

          Prep-cook. Magician’s Apprentice. Understudy. Artificial intern.

          I use these tools to barf code examples. It’s like asking the prep-cook to get the stock going so you can do other things.

    • cygnus@lemmy.ca · 7 months ago

      You’re technically correct, but these products are marketed as though they were like Star Trek’s computer. Do you think it’s a coincidence that Google Assistant’s codename was “Project Majel”?

      • FaceDeer@fedia.io · 7 months ago

        I don’t see what this has to do with the meaning of the term AI.

        If a marketing department somewhere starts trying to push the slogan “ice cream for your car!” as a way to sell gasoline, would it make sense to start complaining that the stuff Ben & Jerry’s is selling isn’t actually gasoline?

        • cygnus@lemmy.ca · 7 months ago

          This is more like all the petro-companies and gas stations coordinating to call gasoline “ice cream,” with the media picking up the term as well, so that everybody suddenly starts calling gasoline ice cream. Some of us are on the sidelines reminding people that “ice cream” has a distinct and different meaning.