• Awoo [she/her]@hexbear.netOP
    12 days ago

    It’s funny because the precog system is clearly correct almost fucking ALL THE TIME. But because it can be wrong once, it’s evil.

    But they’re gonna defend the shit out of these AI solutions despite them being totally untrustworthy and wrong nearly half the time.

    All this will do is reproduce racial profiling, except since it’s performed by an AI it’s suddenly ok, because a human didn’t do it. “My computer told me to stop and search you.”

      • Awoo [she/her]@hexbear.netOP
        11 days ago

        To them, the algorithms having the same bias is the positive.

        They get to do the racism but they also get to blame something that isn’t themselves. Something that cannot be fired and is not accountable to anyone.

          • Awoo [she/her]@hexbear.netOP
            11 days ago

            It will 100% be implemented elsewhere too. American cops deporting people? “Sorry, the machine says you gotta go. So sayeth the machine, not me. I’m just following the machine’s orders.”

            Authority will love this shit. They become completely unaccountable. The nebulous AI decides all.