

This further points to the solution being smaller models that know less and are trained for narrower tasks, instead of gargantuan models that burn an insane amount of resources to answer easy questions. Route each query to a smaller, more specialized model based on what it's asking, as in the sketch below. This was the motivation behind MoE models, but I think there are other architectures and paradigms to explore.
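Here's a minimal sketch of what that routing idea could look like, purely illustrative and not any particular system's design: a tiny dispatcher that picks a cheap specialized model per query instead of always invoking one giant generalist. The model names and the keyword heuristic are hypothetical stand-ins (a real router would likely be a small learned classifier).

```python
from typing import Callable

# Hypothetical specialized "models" -- in practice these would be
# small fine-tuned networks, each cheap to run.
def math_model(query: str) -> str:
    return f"[math-specialist] answering: {query}"

def code_model(query: str) -> str:
    return f"[code-specialist] answering: {query}"

def general_model(query: str) -> str:
    return f"[small-generalist] answering: {query}"

# The router itself could be a tiny classifier; a keyword
# heuristic stands in for it in this toy version.
def route(query: str) -> Callable[[str], str]:
    q = query.lower()
    if any(k in q for k in ("integral", "solve", "equation")):
        return math_model
    if any(k in q for k in ("python", "function", "compile")):
        return code_model
    return general_model

if __name__ == "__main__":
    for query in (
        "Solve this equation: 2x + 3 = 9",
        "Why does my Python function return None?",
        "What's the capital of France?",
    ):
        print(route(query)(query))
```

The appeal is the cost profile: easy questions never touch the expensive path, and each specialist can stay small enough to run cheaply.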
This feels like it would make people buy it more, because it's such a rad sticker to have on a box. It's like the Parental Advisory notice on CDs: it just made them way cooler and became a badge of honor.