Just heard this today - a coworker used an LLM to find him some shoes that fit. He basically prompted it to find specific shoes for wider/narrower feet, and it just scraped reviews and told him what to get. I guess it worked perfectly for him.

I hate this sort of thing - but this is why normies love LLMs. It's going to be the new way every single person uses the internet. Hell, on a new Win 11 install, the first thing that comes up is Copilot saying "hey use me i'm better than google!!"

Frustrating.

  • theunknownmuncher@lemmy.world · 1 day ago

    hey use me i’m better than google

    I don’t use copilot (or windows), nor do I believe that LLMs are appropriate for use as a search engine replacement, but to be fair, google is really bad now, and I wouldn’t be surprised if people are having a better experience using LLMs than google.

    • Grimy@lemmy.world · 1 day ago

      I have a feeling Google is slowly reducing its own quality so that using LLMs becomes the norm, since it's even easier to inject ads into them. Might be paranoia tho.

      • deliriousdreams@fedia.io · 8 hours ago

        Google has already been caught out doing this. They reduced the quality of search results and placed ads and SEO-gamed sites (companies that pay to climb the rankings) ahead of other results. This was happening before they had a Gen AI LLM.

        Its intent is to keep you on the search page longer, viewing ads, so they can get more ad revenue.

        They're an ad aggregation company first and foremost, and search (along with the rest of their product suite) is how they serve those ads.

      • nfreak@lemmy.ml · 1 day ago

        It's also easier to inject propaganda. Look at Grok - an extreme example, sure, but it shows exactly what these are designed to do.

        • theunknownmuncher@lemmy.world · 24 hours ago

          Nah, that's silly. Google's search ranking is definitely just as easy, if not easier, for them to manipulate to push specific content than wrangling a non-deterministic LLM.

          Both are propaganda machines, but traditional search algorithms are way more direct than LLMs.

    • thesohoriots@lemmy.world · 1 day ago

      I think the other half of this is the confidence with which it's programmed to give the answer. You don't have to weigh a couple of individual answers from different sites and make a decision. It just tells you what to do in a coherent way, eliminating any autonomy you have in the decision process/critical thinking, of course. "Fix my sauce by adding lemon? Ok! Add a bay leaf and kill yourself? Can do!"