In my opinion, AI just feels like the logical next step in capitalist exploitation and the destruction of culture. Generative AI is (in most cases) just a fancy way for corporations to steal art on a scale that hasn’t been possible before. They then use AI to fill the internet with slop and misinformation, and actual artists are getting fired from their jobs because companies replace them with an AI that was trained on their original art. For these reasons and some others, it just feels wrong to me to use AI in such a manner, when this community should be about inclusion and kindness. Wouldn’t it be much cooler if we commissioned an actual artist for the banner, or found a nice existing artwork (where the licence fits, of course)? I would love to hear your thoughts!

  • patatas@sh.itjust.works · 2 days ago

    The products of artisanal labour and factory labour may indeed be equivalent in terms of the end product’s use value, but they are not equivalent as far as the worker is concerned: the loss of autonomy, and the loss of opportunity for thought and problem-solving and learning and growing, are part of the problem with capitalist social relations.

    I’m trying to say that AI has this social relation baked in, because its entire purpose is to have the user cede autonomy to the system.

    • Cowbee [he/they]@lemmy.ml · 2 days ago

      I’m sorry, but that doesn’t make any sense. AI is not intrinsically capitalist, and it isn’t about ceding autonomy. AI is trained on a bunch of inputs and spits out an output based on nudging. It isn’t intrinsically capital; it’s just a tool that can do some things and can’t do others. I think the way you view capitalism is fundamentally different from the way Marxists view capitalism, and that is the crux of the miscommunication here.

      • patatas@sh.itjust.works · edited 2 days ago

        Literally the only thing AI does is cause its users to cede autonomy. Its only function is to act as a (poor) facsimile of human cognitive processing and resultant output (edit: perhaps more accurate to say its function is to replace decision-making). This isn’t a hammer, it’s a political artifact, as Ali Alkhatib’s essay ‘Defining AI’ describes.

        • Cowbee [he/they]@lemmy.ml · 2 days ago

          AI is, quite literally, a tool that approximates an output based on its training and prompting. It isn’t a political artifact or anything metaphysical.

          • patatas@sh.itjust.works · 2 days ago

            AI is a process, a way of producing, it is not a tool. The assumptions baked into that process are political in nature.

            • Cowbee [he/they]@lemmy.ml · 2 days ago

              I really don’t follow. Something like Deepseek is quite literally a program trained on inputs that spits out an output depending on prompts. It isn’t inherently political, in that its relation to production depends on the social role it plays. Again, a hammer isn’t always capital; it is only if that’s the role it plays.

              • patatas@sh.itjust.works · edited 2 days ago

                And that social role is, at least in part, to advance the idea that communication and cognition can be replicated by statistically analyzing an enormous amount of input text, while ignoring the human and social context and conditions that actual communication takes place in. How can that not be considered political?

                • Cowbee [he/they]@lemmy.ml · 2 days ago

                  The social role of a tool depends on its relation to the overarching mode of production; it isn’t a static thing intrinsic to the tool. AI doesn’t care about advancing any ideas. It’s just a thing that exists, and its use is up to how humans use it. This all seems very idealist and not materialist of you.

                  • patatas@sh.itjust.works · edited 2 days ago

                    If I made a tool which literally said to you, out loud in a tinny computerised voice, “cognitive effort isn’t worth it, why are you bothering to try?”, would it be fair to say it was putting forward the idea that cognitive effort isn’t worth it and you shouldn’t bother trying?

                    If so, what’s the difference when that statement is implied by the functioning of the AI system?