Yeah, they probably had a system prompt set up to make sure it responded appropriately, something like: "We're about to make a Japanese BBQ sauce from the ingredients on the table. When we ask about it, use the camera to look at the ingredients and remark on them."
But AI output is noisy, and since it has memory it probably just mashed the gas this time. The part where it was supposed to recognise his actions was smoke and mirrors. It's fine at telling what's on the table; it's bad at recognising what he's doing.
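For anyone curious, here's roughly what that kind of priming could look like as code. Purely a guess: the prompt wording, payload shape, and model name are all made up, it's just a sketch of "stick a system prompt in front of the camera frame" rather than whatever they actually ran.

```python
# Hypothetical sketch of how a demo like this might be primed.
# The prompt text, model name, and payload schema are all assumptions --
# the point is only that a system prompt steers the camera commentary.

SYSTEM_PROMPT = (
    "We're about to make a Japanese BBQ sauce from the ingredients on the table. "
    "When the user asks about it, use the camera feed to identify the ingredients "
    "and remark on them."
)

def build_request(user_text: str, camera_frame_b64: str) -> dict:
    """Assemble a chat-style payload; the exact schema depends on the vendor's SDK."""
    return {
        "model": "some-multimodal-model",  # placeholder, not a real model name
        "messages": [
            {"role": "system", "content": SYSTEM_PROMPT},
            {
                "role": "user",
                "content": [
                    {"type": "text", "text": user_text},
                    {"type": "image", "data": camera_frame_b64},  # current camera frame
                ],
            },
        ],
    }

if __name__ == "__main__":
    payload = build_request("What do you see on the table?", "<base64 frame>")
    print(payload["messages"][0]["content"])  # the priming that shapes every answer
```

If something like that is in play, the model will happily describe the table on cue, but nothing in the prompt actually teaches it to track what the presenter is doing in real time.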
It probably worked fine earlier, or they wouldn't have done it.
I feel like him cutting it off while it was listing the things on the table might've kinda bricked it too.