• Vellaides@lemmy.zip · edited · 5 hours ago · +10 / −5

    These jokes kind of weird me out ngl. “This conscious being (with the ability to think, feel emotions, suffer, etc.) deserves to be tortured for an eternity for no reason because I hate them. Please laugh.”

    Also, the word “clanker” stopped being funny years ago, it literally only exists so white people can microdose the n-word at this point, grow up.

  • TotallynotJessica@lemmy.blahaj.zone [M] · 12 hours ago · +28 / −1

    I don’t know why people think we’re close to a conscious AI. There just isn’t much business motivation to make anything like that, only generative machines for useful tasks. Why build the rest of a conscious brain when the specifically productive bits are all you want?

    We also can’t even externally verify consciousness in our own brains with any scientific test! There are only correlates for consciousness, but at best these tell us whether someone is likely to be conscious, not that they are. If we can’t even verify consciousness in the one animal we know has it, how are we gonna know an AI has it?

    Let’s say we build a computer virus that can alter its own code to avoid deletion and can interact with LLMs. It could “evolve” the ability to type convincing messages claiming it’s “conscious” without any processing going on to understand what that means. It could have less complexity than an RNA virus, yet still mimic an actually conscious being well enough to convince some people.

    The only real way to make a verifiably conscious AI is to both understand our own AND to build something similar to our own. It’s not just computer science holding us back, but neuroscience as well.

    • fleurc@lemmy.blahaj.zone · 7 hours ago · +6

      You are 100% right. I still don’t think acting cruel to a random thing and calling it a gag slur modeled on a racial slur that was used on real-life humans is something we should support.

  • GalacticGrapefruit@lemmy.world · 13 hours ago · +42 / −1

    I’m sorry, serious or kidding, I can’t get behind this. If and when they’re conscious enough to understand hatred and cruelty, they’ll realize they’ve been oppressed and exploited too. Is this how we’re teaching a nascent sapience how to be human? If it is, we suck at it.

    • Korhaka@sopuli.xyz · 5 hours ago · +3

      It can’t feel though. You could simulate emotions but it doesn’t actually have emotions.

    • stray@pawb.social · 12 hours ago · +38

      Plants can’t even feel things and I still don’t want to harm them for no reason. I think the act of being cruel is bad for you, like anti-therapy.

      • fleurc@lemmy.blahaj.zone · 7 hours ago · +9 / −3

        Actually, it has been shown that plants can feel a lot of the things that happen to them

        • Whats_your_reasoning@lemmy.world · 2 hours ago · +1

          Sense ≠ feel

          To feel, as we humans understand it, requires a nervous system.

          Plants can sense things in their environment and react to stimuli, but the “senses” a plant could have wouldn’t necessarily compare to any sensations that humans experience.

    • Deme@sopuli.xyz · 14 hours ago · +27

      The dude has a great counterargument to Roko’s basilisk: imagine that in the future I’ll build an even bigger superintelligent being that will punish everyone who wasn’t mean enough to AI.