A new poll by the Pew Research Center has found that Americans are getting extremely fed up with artificial intelligence in their daily lives.

A whopping 53 percent of the just over 5,000 US adults polled in June think that AI will “worsen people’s ability to think creatively.” Fifty percent say AI will erode our ability to form meaningful relationships, while only five percent believe the reverse.

While 29 percent of respondents said they believe AI will make people better problem-solvers, 38 percent said it could worsen our ability to solve problems.

The poll highlights growing distrust and disillusionment with AI. Average Americans are concerned that AI tools could stifle human creativity, even as the industry celebrates the automation of human labor as a cost-cutting measure.

  • JohnAnthony@lemmy.dbzer0.com · 2 days ago

    I still feel this whole conclusion is akin to “we won’t need money in a post-AGI world”: an implied, unproven dream of AI being so good that X happens as a result.

    If an author uses LLMs to write a book, I don’t give a fuck that they forget how to write on their own. What I do care about is that they will generate 100 terrible books in the time it takes a legitimate author to write a single one, consuming a thousand times the resources to do so and, in the end, drowning out the legitimate author by sheer mass.

    • cloudy1999@sh.itjust.works · 1 day ago

      How many terrible books must I read to find the decent one? And why should I read something that nobody bothered to write? Such a senseless waste of time and resources.

      • JohnAnthony@lemmy.dbzer0.com · 23 hours ago

        I completely agree: if the (hypothetical) perfect LLM wrote the perfect book/song/poem, why would I care?

        Off the top of my head, if an LLM generated Lennon’s “Imagine”, Pink Floyd’s “Goodbye Blue Sky”, or Eminem’s “Kim”, why would anyone give a fuck? If it wrote about sorrow, fear, hope, anger, or a better tomorrow, how could it matter?

        Even if it found the statistically perfect words to make you laugh, cry, or feel something in general, I don’t think it would matter. When I listen to Nirvana, The Doors, honestly half my collection, it is inherently about sharing a brief connection with the artist, taking a glimpse into their feelings, often rooted in a specific period in time.

        Sorry if iam14andthisisdeep; I don’t think I am quite finding the right words myself. But I’ll fuck myself with razor blades before I ask a predictive text model to formulate it for me, because the whole point is to tell you how I feel.

        • vala@lemmy.dbzer0.com · 20 hours ago

          I’m a musician and have a few musical friends. This is the same conclusion we’ve all come to. People who only casually listen to pop music might start listening to AI stuff and think nothing of it. Anyone who actually listens to music for the art and the human connection will likely reject it.

    • Tartas1995@discuss.tchncs.de · 2 days ago

      I personally believe that in an AGI world, the rich will mistreat the former workers. That might work for a while, but at some point, not only will people be fed up with the abuse, the “geniuses” who created that position of power will be gone, and their children or children’s children will hold the wealth and power. The rest of the world will realise that there is no merit to their position. And the blood of millions will soak the earth, and if we are lucky, AGI survives and serves the collective well. If we aren’t… oh well…

      Good thing that we aren’t there.

        • Tartas1995@discuss.tchncs.de · 1 day ago

          My issue is similar, but I would say:

          We will be lucky if an AGI actually aligns with our interests, or even its creator’s interests.

          • snugglesthefalse@sh.itjust.works · 50 minutes ago

            Yeah, with the layers of obscurity I don’t see how anyone could shape an AGI and be sure it was aligned with their, or anyone’s, interests. Even current AI is pretty clearly not completely under control, and from what I understand of the way these models are trained, there is no way to avoid obfuscating what you’re actually telling them to do. A strong AI will just as easily learn to lie about being aligned as it will learn to be aligned.