• huppakee@piefed.social
    19 hours ago

    such a variety of failure modes

    What I find interesting is that in both cases there is a certain consistency in the mistakes too. Basically every dementia patient still understands the clock is something with a circle and numbers, not a square with letters, for example. LLMs can tell you complete bullshit, but still understand it has to be delivered with perfect grammar in a consistent language. So much so that they struggle to respond outside of this box; ask one to insert spelling errors to look human, for example.

    the ability to “see”

    This might be the true problem in both cases: neither the patient nor the model can comprehend the bigger picture (a circle is divided into 12 segments because that is how we deconstructed the time it takes for the earth to spin around its axis). Things that seem logical to us are logical because of these kinds of connections with other things we know and comprehend.