- cross-posted to:
- technology@lemmy.world
At one point it started hallucinating about being a real human and meeting someone in person.
Shows how useful and superior to a real human it is.
Haha it’s a bit too human apparently. Starts freaking out and having a nervous breakdown when things keep going wrong. Maybe the thing’s already sentient and we’re basically just torturing it for our amusement? That surely won’t backfire once it actually gains new abilities.
Also:
This presents… possibilities for all commercial entities who will eventually use AI agents for storefronts and online sales.
That’s the worst part.
It didn’t actually freak out. It’s trained on human-created material. It saw a situation where a human would freak out, and internally it was like “huh… well, it looks like the pattern here is to freak out. OK, I guess…”
These things are trained to emulate patterns in existing material. It only follows that pattern because most humans freak out when things go wrong, and it thinks that’s what the operators expected to see.
Dear valued customer,
Please enjoy this delicious tungsten cube to consume with your <bag of chips>.
Joke’s on you, I’m here for the cube.
And hiring.