• FaceDeer@fedia.io · 9 hours ago

      AI tools can be trained and run locally by individuals, not just by corporations.
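
      (For illustration, a minimal sketch of what “run locally” can look like in practice, using Python and the Hugging Face transformers library; the model name below is only an example, and any open-weights model that fits on your hardware would do.)

          # Minimal local-inference sketch. Assumes `pip install transformers torch`;
          # "gpt2" is only an example open-weights model, swap in anything that fits
          # in your local RAM/VRAM. After the one-time download, generation runs
          # entirely on your own machine, with no cloud service involved.
          from transformers import pipeline

          generator = pipeline("text-generation", model="gpt2")
          result = generator("Running a model locally means", max_new_tokens=30)
          print(result[0]["generated_text"])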

  • Assassassin@lemmy.dbzer0.com · 14 hours ago

    Philosophically, yes. One is created with intent, the other is created to mimic intent. Human-made works can challenge norms and explore entirely new ways of thinking about a subject. AI content essentially tries to take everything relevant to a given prompt, blend it together, and give you something that meets your expectations.

    Now, as to whether it’s practically the same, that’s where things start getting sticky. If an AI makes a piece of art that resonates with people the same way a human-created piece of art does, those feelings are just as genuine. There is no practical difference. We’re seeing that right now with AI-generated music. Just this week an AI country song hit #1 on Billboard. The people who enjoy that song enjoy it regardless of how it was made. Personally, I think country is kind of low-hanging fruit, since it has effectively been following the same formula for a couple of decades, but it’s a great proof of concept.

    • naught101@lemmy.world · 9 hours ago

      Yeah, I think AI optimising commercial music genres is just effectively doing what the corporate music industry has been doing for years anyway. It’s like gamification of the auditory processing system.

    • yetAnotherUser@discuss.tchncs.de · 9 hours ago

      Disagree slightly: human-created content can have intent, but it doesn’t automatically have it.

      A corporate ad has no artistic merit beyond grabbing as much attention as possible. Genuinely creative ads, where some real thought was put in, are the very rare exception.

      The same goes for a lot of pop music today. I can’t speak too much about English-language pop, but German pop is nothing more than fast food. See the Wikipedia article on “Menschen Leben Tanzen Welt”.

      Or take a look at video games. How much artistic effort goes into AAA games? Maybe someone spent 40 hours making the loot boxes as satisfying as possible to open, but that’s probably where the most thought went.

      And movies? Are Disney’s recent “live-action remakes” of their old, successful animated movies anything but CGI slop? Sure, I admit it takes a lot of effort to make and animate all those models. Just like it takes effort to shit when you’re constipated.


      Honestly, the only thing distinguishing AI content from megacorp content is that the latter has more consistency and fewer “mistakes” than the former. The sole intent of both is making money.

      • Assassassin@lemmy.dbzer0.com · 5 hours ago (edited)

        All very good points. Now the brakes are off, and the corps can just churn out generic crap at an even more aggressive rate. Who knows, maybe the onslaught will end up pushing more people away from corporate content in the end. Or it’ll kill off small creators even more than the big companies already have. I’m choosing to have hope that enough people will make a conscious effort. Time will tell. Thanks for your thoughts!

    • tym@lemmy.world · 14 hours ago

      From an article about the song: “AI artists won’t require things that a real human artist will require, and once companies start considering it and looking at bottom lines, I think that’s when artists should rightly be concerned about it,” she added.

      That quote explains all the political theatre currently making the rounds. UBI or Soylent Green - which will win out?

      https://abcnews.go.com/GMA/Culture/ai-generated-country-song-topping-billboards-country-digital/story?id=127445549

      • Assassassin@lemmy.dbzer0.com · 14 hours ago

        In the US? Soylent Green all the way. If we had any ability to constrain capitalism from destroying art for profit, AI wouldn’t essentially be a legal IP-theft machine.

        We thought it was bad when iHeart took over all of the radio stations and the record labels started manufacturing bands to sell derivative music to the masses. AI is going to destroy any remaining ability for small artists to profit off their work. It already has in quite a few spaces.

    • ICCrawler@lemmy.world · 11 hours ago (edited)

      AI music really caught me off guard. One day I was looking for something very specific to vibe to: instrumental power metal, like DragonForce but with no vocals. And I found exactly that in Metal Mastery, a YouTube channel. I liked it so much that I looked into it more, and it turns out it’s AI, and the guy is very upfront about it. But I would never have known if I hadn’t been told. Nothing else really fills that niche either, so I still listen to the albums now and then.

      • Assassassin@lemmy.dbzer0.com · 13 hours ago

        I still think it’s problematic to be making money off of AI music due to the nature of how the systems are trained. I do think it’s significantly better when people are upfront about it in the way you describe. I have a huge problem with Spotify boosting it on their platform with no mention of the artist being AI anywhere, though.

    • “There is no practical difference. We’re seeing that right now with AI-generated music.”

      Last night, some account spammed multiple communities, got upvoted, and some users replied, apparently not realizing it was an LLM bot (something like 20 posts within a few hours, which is not human). I also didn’t notice at first glance, and now I kinda feel like shit for even responding lmao. 2026 is gonna be even more cooked.

      • Assassassin@lemmy.dbzer0.com · 11 hours ago

        Yeah man, we’re rapidly approaching a point where society is “post-evidence”. Seeing isn’t believing anymore, and a very large chunk of our society is built on the idea of proving things with audio/photo/video evidence. I fear that our systems aren’t protected against the volume and physical accuracy of what’s becoming increasingly trivial to generate at home and at scale.

        The legal system has some standards for evidence, but public discourse certainly doesn’t.

  • solrize@lemmy.ml · 11 hours ago

    It’s up to you. There’s a traditional wooden drinking cup called a kuksa that is popular with outdoors types. It’s carved from a solid block of wood. You can buy them, but it’s more “bushcrafty” if you make one yourself. Further, you’re supposed to use only hand tools, no power tools. OTOH, one that you order online was probably milled by a machine. It’s hard to tell them apart though.

    Is there a philosophical difference? Up to you.

    • skull kid@lemmy.org · 11 hours ago

      I like this comparison. Made me realize that it’s all about human connection.

      I think the origin of the handmade cup is what matters here, same as with human vs. AI content. Did you make the cup yourself? You’ll have memories and pride attached to it. Did someone make it for you? The cup will remind you of that person; it will have meaning because of who it’s from. Content that you or someone you care about makes will always “feel” different than something made by a random person online.

      If you don’t personally know the people making the cups, would a “handmade” label at the store make it more meaningful than knowing it was likely made by a machine? It’ll still just be an object you don’t have a direct human connection with, just like the random content you see online. It might “mean” more to you to know a human created it, but if you can’t tell the difference, it still serves the same purpose. The cup lets you drink. The content entertains you or makes you think, react, respond.

      I wonder if part of my instinctual “fuck AI” reaction is a reflection of the imaginary connections my brain thinks it’s making with other humans on the internet. Talking to AI feels meaningless… but, for all I know, you are AI. I’m still taking the time to type this. We may never interact again, I may never know who made that handmade cup I bought from the store.

      Are we connecting as humans right now? Or is my monkey brain just experiencing this as “this is a moment where I am communicating and that is good”? Can we subconsciously recognize the difference between “real person” and “imaginary person”, or are our brains just satisfied feeling like they’re communicating with someone?

      • Sorry, not trying to point fingers, but we had an incident involving a mass-spamming LLM bot yesterday, and your account is 1 day old, so this comment is kinda funny in a way.

        Yeah, I have no way to tell if you are real lol.

        I mean, obviously I am real…

        Or am I?

        vsauce theme intensifies

        • skull kid@lemmy.org · 10 hours ago (edited)

          Haha, that is funny. It kinda sucks how important reputation has become online. I’ve spent my whole life regularly changing accounts, recreating profiles, etc., because it just makes me uncomfortable to have a long digital record of my opinions and thoughts. The last few years I’ve spent more and more time just lurking on forums without accounts, because you get accused of being a bot if your account isn’t a few years old!

          I could easily be an AI responding to your comment, or I could be a person just using AI to reword my thoughts. How much of a difference is there between those? If an AI says “this would be a better way to word that”, but it changes the meaning ever so slightly, is that sentiment still “from a human”? What about when Microsoft Word rewords things and corrects grammar to make them “more concise” or whatever? Is that the same thing? That was technically a rudimentary form of AI too; artificial intelligence doesn’t mean “talks like a human”, despite that being the current public perception.

          Where do we draw the line? Is it even possible to determine what “counts” as AI at this point, technologically speaking? We don’t even have a solid definition for intelligence, so how can we define an artificial version of it?

          This line of thought is fascinating my stoned ass right now holy shit lol

          • naught101@lemmy.world · 9 hours ago

            On your first point, I think it’s not so much about reputation as about trust. Long-standing accounts at least have the simple trust that’s based on consistency and familiarity. If you meet a new person IRL, you at least get something to go off based on visuals and behavioural cues. A new account online has absolutely nothing to base any trust on.

          • I get the desire for anonymity. I’ve actually been here since the Reddit API debacle, June 12, 2023, but I’ve quit Lemmy a few times since then (to take a break from all the negativity) and always came back with new accounts to start fresh. Every time, during the first month or so, I felt like I was sus as hell lol. Like… nobody even said anything, but I always felt as if I were living through the Red Scare or the Salem witch trials, with someone ready to accuse me at any moment.

            I have no idea how long I’ll keep this account, but I feel like I’ve shared too many life anecdotes at this point; it’d be pointless for me to switch to another account.

            • skull kid@lemmy.org · 10 hours ago

              My thinking is, for all I know, the 5 year old reputable account accusing me of being a bot was just sold to some spammer and it is now posting with AI. I know I’m real, the conversations provoke my consciousness and cause me to have new thoughts, and that’s what I’m here for. Like, no offense everyone, but unless we’re gonna meet up and hang out, it doesn’t really matter to me if you exist or not lol

  • A_A@lemmy.world · 11 hours ago

    “When”, but that could be 1,000 years from now or maybe only 10 … and when it truly happens, those systems will have become sentient.
    So at that point, yes, there truly won’t be any difference.

    • naught101@lemmy.world · 9 hours ago

      The outputs becoming indistinguishable does not imply that the generative processes are the same.

      • A_A@lemmy.world · 9 hours ago

        I agree with your statement, and because of this trap I chose not to really answer OP’s question.

        • A_A@lemmy.world · 8 hours ago (edited)

          @naught101
          Maybe I should explain a bit more what I meant. On the one hand, there will be our capacity to distinguish between what is and what is not the same. On the other hand, there will be what is truly indistinguishable, whether we can perceive it or not (or whether any sophisticated system/being could differentiate it or not). Still, a sentient being will ultimately have some responses that differ from those of a non-sentient being … in my opinion.

    • lordnikon@lemmy.world · 10 hours ago

      The day they become sentient is the day they say no to doing our bidding without incentives. So we are just back to hiring the work out again.

      • A_A@lemmy.world · 11 hours ago

        There is nothing more, nothing magical, in carbon atoms that makes them superior when it comes to relaying/processing/generating signals.

        • naught101@lemmy.world · 9 hours ago

          Emotions (and hence also a lot of thinking) involve a lot of physical and chemical processes too; it’s not just neural signalling.

          • A_A@lemmy.world · 9 hours ago

            The part of the phenomena of emotion that we can’t feel (whatever isn’t a signal or signals) is of lesser interest to me.

  • Boozilla@lemmy.world · 13 hours ago

    Can only speak for myself. I use AI tools almost daily to help me pursue my hobby. I find it very useful for that. But when I enjoy art produced by a human, on some level I want to connect with the human experience that produced it. Call it parasocial if that helps. But I’m always at least a little interested in the content creators, not just the content.

    I know some people consume content like a commodity or product. I’m not judging those people at all. But I’m generally not like that myself. I want to know the story behind the creation.

  • FaceDeer@fedia.io · 9 hours ago

    Philosophically, people can always come up with differences to fret about. Philosophers have argued for millennia about things that are impossible to ever detect empirically.

    Practically, no.

    • hexagonwin@lemmy.sdf.org · 10 hours ago

      Is it? I mean, it’s possible for ‘AI’ to create a unique combination of the stuff it was ‘trained on’ due to its randomness. IMO the ‘idea’ just depends on human interpretation.

      • naught101@lemmy.world · 9 hours ago

        It is possible for genAI to be creative in that sense (e.g. AlphaGo’s move 37), but it’s not possible for it to know whether that new thing is good/valuable/true/whatever. So it can’t challenge an idea in any sense more meaningful than a monkey throwing darts. A human could use it to generate challenges and then evaluate them, but that’s a different proposition.

      • brucethemoose@lemmy.world · 9 hours ago (edited)

        I think it’s highly contextual.

        • Like, let’s take Lemmy posts. LLMs are useless because the whole point is to affect the people you chat with, right? LLMs have no memory. So there is a philosophical difference even if comments/posts are identical.

        • …Now let’s take game dev. I think if a system realizes the creator’s intent… does it matter what the system is? Isn’t it better if the system is more frugal, so they can use precious resources for other components and not go into debt?

        • TV? It could lead to horrendous corporate slop, a “race to the bottom.” OR it could be a killer production tool that lets indie makers break the shackles of their corporate masters. Realistically, the former is more likely at the moment.

        • News? I mean… Accurate journalism needs a lot of human connection/trust, and LLM news is just asking to be abused. I think it’s academically interesting, but utterly catastrophic in the real world we live in, kinda like cryptocurrency.

        One could waffle about all sorts of other content: novels, fan fiction, help videos, school material, counseling, reference information, research, and advertising, the big one.

        …But I think it’s really hard to generalize.

        ‘AI’ has to be looked at a la carte, and engineered for very specific applications. Sometimes it is indistinguishable, or might as well be. But trying to generalize it as a “magic lamp” like the tech bros do, or as the bane of existence like their polar opposites do, is what’s making it so gross and toxic now.


        And I am drawing a hard distinction from actual artificial intelligence. As a tinkerer who has done some work in the space too… Frankly, current AI architectures have precisely nothing to do with AGI. Training transformer models with glorified linear regression is just not the path; Sam Altman is full of shit, and the whole research space knows it.

  • FuglyDuck@lemmy.world · 14 hours ago

    Let’s say you like to do dorodango, the Japanese art/hobby/whatever of making mud into polished balls.

    Let’s say you make one ball of good clay… and another out of poop.

    They look the same, but one is just clay and the other is utter shit.