• CubitOom@infosec.pub (OP) · 3 days ago

      It’s called the piss filter.

      Generative models (what people usually mean by “AI”) are opaque, and it’s almost impossible to trace exactly how a trait like this crept in or why it keeps showing up. Somewhere along the way the models picked it up, and nobody looking at the output thought it was bad enough not to post.

    • Tire@lemmy.ml · 3 days ago

      Yeah, AI images lean hard toward warm yellow hues. So much so that if you keep feeding AI output back into AI, it just gets more yellow over time. It’s probably down to training sets full of movies and social media posts where people prefer to show themselves in “golden hour” light.
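
      A rough way to sanity-check that drift yourself (a minimal sketch in Python, assuming Pillow and NumPy are installed; the “warmth” score is just a crude mean-of-R-and-G minus mean-blue proxy for the yellow cast, not any standard metric, and the filenames are hypothetical):

          import sys
          import numpy as np
          from PIL import Image

          def warmth(path):
              # Average of the warm channels (R and G) minus the blue channel.
              # Positive means the image leans yellow; larger = stronger cast.
              rgb = np.asarray(Image.open(path).convert("RGB"), dtype=np.float64)
              r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
              return ((r + g) / 2.0 - b).mean()

          if __name__ == "__main__":
              # Pass successive generations (e.g. gen0.png gen1.png gen2.png)
              # and watch whether the score creeps upward.
              for path in sys.argv[1:]:
                  print(f"{path}: warmth = {warmth(path):+.1f}")

      If the feedback-loop story holds, the score should trend upward across generations rather than hover around the same value.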

    • hasnt_seen_goonies@lemmy.world · 3 days ago

      I assume part of the prompt ties back to WW2-era propaganda, and a lot of those posters had yellowed with age by the time they made it into the dataset.

    • 6nk06@sh.itjust.works · 3 days ago

      Same question. It’s weird and instantly recognizable. I’d guess training data, but I’ve never seen a real explanation for it.