AI could be such a neat, useful thing. But nope, capitalists can’t help themselves. Gotta use it to grift the internet with low quality content and steal from artists.
Using it to summarize large blocks of text sounds like a really great use. I'll have to remember that next time I'm trying to get through something and just want the Cliff's Notes.
Actually, there was a writeup posted not too long ago by a researcher who compared his own summary of a report to one made by ChatGPT, and the gist of his writeup is that AI doesn't effectively summarize things, it just shortens them. Which makes sense when you consider that it doesn't actually understand any of what it's consuming. So beware that in those summaries you can end up with irrelevant information, false conclusions, and a lot of missing vital information. If it's not something super important, that's fine, but it's a big pitfall to be aware of.
Ok, that does make some sense. It doesn't understand which portions of sentences or paragraphs are conveying something important, because it's not an actual brain. So I suppose if I'm using it to summarize a jam recipe I'll be okay, but I definitely shouldn't use it to summarize a molten salt reactor build.
I can easily see it getting confused and prioritizing the wrong information. "Use 2.5-3 cups of butter or 3 cups of margarine" gets cut down to "use 2.5-3 cups of butter or margarine," and then you're fucked if you use 2.5 cups of margarine, to use a silly example off the top of my head. But pretend it's a molten salt reactor instead.
100%. Any time I've tested it on a text I know well, I've been disappointed.