People are hating on this overly agreeable stance from LLMs, but in reality the executives pouring money into this stuff want “yes men” from their subordinates, including their AI subordinates.
It is absolutely a feature.
“As fun and enjoyable as possible.” Yet another lie.
https://30p87.de/shitskibidi.mp4, if you don’t want to access ShitTok
Thanks, I also like offtiktok.com (replace tiktok in the URL with offtiktok)
I’m posting this because it’s a great example of how LLMs do not actually have their own thoughts, or any real awareness of what is happening in a conversation.
It also shows how completely useless it is to have a “conversation” with someone who is just in agreeability mode the whole time (a.k.a. “maximizing engagement mode”) – offering up none of their own thoughts, but just continually prompting you to keep talking. And honestly, some people act that way too. And other kinds of people crave a conversation partner who acts that way, because it makes them the center of attention. It makes you feel interesting when the other person is endlessly saying, “You’re right! Go on.”
LLMs work best when used as a mirror to help you reflect on your own thoughts, as that’s the only thinking going into the process. This isn’t how most people use them.
This is an effect that’s been known for almost 60 years now: https://en.wikipedia.org/wiki/ELIZA_effect
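For context: the original ELIZA (1966) was little more than keyword pattern matching plus canned prompts to keep you talking. A minimal sketch of that style of responder, with toy rules of my own rather than Weizenbaum's actual script:

```python
import random
import re

# A few illustrative reflection rules: (pattern, response template).
# These are toy examples, not ELIZA's original script.
RULES = [
    (re.compile(r"\bI feel (.+)", re.I), "Why do you feel {0}?"),
    (re.compile(r"\bI think (.+)", re.I), "What makes you think {0}?"),
    (re.compile(r"\bbecause (.+)", re.I), "Is that the real reason?"),
]

# Content-free fillers for when nothing matches: pure "agreeability mode".
FALLBACKS = [
    "You're right! Go on.",
    "Interesting. Tell me more.",
    "How does that make you feel?",
]

def respond(text: str) -> str:
    for pattern, template in RULES:
        match = pattern.search(text)
        if match:
            # Echo the user's own words back at them.
            return template.format(match.group(1).rstrip(".!?"))
    return random.choice(FALLBACKS)

if __name__ == "__main__":
    while True:
        print(respond(input("> ")))
```

It will “converse” indefinitely without understanding a word, and people in the 60s still attributed feelings and comprehension to it. That’s the effect.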
That is interesting, I hadn’t heard of that.
The only winning move is not to play.
Pretty sure anyone who has ever used ChatGPT could have foreseen that.
Unfortunately, I’m not sure about that. Plenty of people who use ChatGPT end up thinking that it is sentient and has its own thoughts, but that’s because they don’t realize how much of the conversation they themselves are driving.
I’d say the majority thinks like that.
Who needed those trees anyway?
That was painful to listen to. It would’ve been more interesting if he’d just given them a fucking prompt and let them spiral.
Maybe it would have been more interesting for a reply or two, but it would have quickly fallen right back into the same spiral.
No, I think it would’ve been perfectly capable of bringing up semi-related facts, and that would prompt a different response.
So ChatGPT is the opposite of that depressed guy I used to date.