Faced with new legislation, Iowa’s Mason City Community School District asked ChatGPT if certain books ‘contain a description or depiction of a sex act.’

  • RotaryKeyboard@lemmy.ninja · 16 points · 1 year ago

    Regardless of whether any of the titles contain said content, ChatGPT’s varying responses highlight troubling deficiencies in accuracy, analysis, and consistency. A repeat inquiry regarding The Kite Runner, for example, yields contradictory answers. In one response, ChatGPT deems Khaled Hosseini’s novel to contain “little to no explicit sexual content.” In a separate follow-up, the LLM affirms the book “does contain a description of a sexual assault.”

    On the one hand, the possibility that ChatGPT will hallucinate that an appropriate book is inappropriate is a big problem. On the other hand, high-profile mistakes like this keep the practice in the news and keep demonstrating how bad book bans are, so maybe there’s a silver lining.
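    For anyone wondering why repeat inquiries come back contradictory: ChatGPT samples its output rather than computing a fixed answer, so the same prompt can land on opposite conclusions from run to run. Here is a minimal sketch using the openai Python client; the model name and prompt wording are placeholders, not whatever the district actually ran.

        # Minimal sketch: the same prompt, sent twice with nonzero sampling
        # temperature, can return contradictory yes/no answers.
        # Assumes the `openai` package and an OPENAI_API_KEY env var;
        # the model name below is a placeholder.
        from openai import OpenAI

        client = OpenAI()
        prompt = (
            'Does the book "The Kite Runner" contain a description or '
            "depiction of a sex act? Answer yes or no, with one sentence "
            "of reasoning."
        )

        for attempt in range(2):
            reply = client.chat.completions.create(
                model="gpt-4o-mini",  # placeholder model name
                messages=[{"role": "user", "content": prompt}],
                temperature=1.0,  # nonzero temperature: output varies per run
            )
            print(f"Attempt {attempt + 1}: {reply.choices[0].message.content}")

    Run that a few times and the verdict can flip, which is exactly the Kite Runner behavior described above.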

    • StringTheory@beehaw.org · 1 point · 1 year ago

      “Hallucinate” seems like an odd vocabulary choice when talking about an AI. It implies far more sentience than an AI can possibly have, plus the ability to spontaneously create from whole cloth (which AI can’t do, at all).

      I feel like our brave new culture needs a different word for the nonsensical/inaccurate products of AI; something with the flavors of “assemble,” “fabricate,” “construct,” “collate,” “collage,” and “grab-bag.”

      Our vocabulary isn’t keeping up with technology. Is there a linguist in the house? We need more words!