I’m genuinely surprised it’s not mentioned in the hover text … though I think you can see it in between the lines there.
A little bit of neuroscience and a little bit of computing


Glad to hear! And interested to know how you find it.


I’ll add to all the recommendations to read Hyperion … just do it, seriously.
But I’ll also add that the sequel, while it splits people, contributes on the internet & AI sci-fi front and is probably worth a read if you’re enjoying those aspects of the first book. Generally, together, I think they’re great commentaries on modern tech, especially for books from the 80s.


A shallow response I know, but … what the hell timeline is this!!??


I’m afraid what you wrote was not purely what you state here, and to the extent it was, poorly so.
I don’t think I’m the one here for whom the message is lost.


Ha, ok. Well I think your response is rude, meaningless, petulant, and a waste of space.
But I’m really glad we have people like you on social media to remind us what it’s all about.


Many, myself included, may not like to hear this, but I think it’s the bitter truth.
For better or worse, the majority like this technology. AI companies have stuck the landing in a sales sense.
For those who find it cringey or offensive or whatever, we may have to get used to being black sheep (even more).


Huh, didn’t know. Thanks!


Recently rewatched it and looked it up on Wikipedia afterwards … and was also surprised it was a box office flop.
Which just affirmed for me how real the “did well on VHS/DVD/TV” thing is … because I was too young to see it at the cinema but definitely knew all about it as a kid and always liked it.
Someone must have been showing it to me knowing there was an audience!
I was also pleasantly surprised at how much I enjoyed it on rewatch (after many years). The closing third drags a bit I think … but the opening half is really tight and interesting storytelling. The way it uses flashbacks to Scotland to explain what’s actually going on in the present worked really well for me.


Techno feudalism … seems plain and simple to me.
Our independent value and sustainability is no longer a given.
In a monopolised AI world (and how can it be anything other than a big tech monopoly) … you give yourself over, as training data, in exchange for permission to survive … and rely on the AI trained on your data.
Let’s be real … big tech cornered us over the past couple of decades. And now they’re trying to grab us by the balls. It’s happening fast. And most don’t have the philosophical agility to keep up with the implications.


From what I’m seeing, soooo many are naive to this dynamic. They think of it like it’s the latest nifty app and not the directed disruption of the labour market that it is.
Almost like thinking and social awareness have already been outsourced to big tech’s social media empire and this is just the next step.


Interesting. I saw it only once, in the cinemas, and liked it very much and appreciated its glowing reception. But I always wondered if it would fare poorly on rewatch and become a bit like American Beauty in making sense really only in its moment.


I think they mean in parallel, as in the government steps in and regulates with guarantees etc, not that these reforms would come from the AI itself.


I keep saying that AI is the death of the Internet as we know it. It’s just no longer the same thing at all.
Completely flipping and questioning how we do everything on it should probably be the default stance.


Also, Siri, Alexa and Cortana were seen as “intelligent” at the time, as well (or were supposed to be seen, depending on who you ask).
Intelligent for the time, sure, but were they ever pitched as doing more than a secretary who never encroaches on or gets involved with your actual job and cognitive skills? Because that’s the divide that’s being enforced: women for the menial dumb tasks and men for the serious, difficult and actually valuable and important stuff.


Not blaming anyone, this is social commentary.
But like the neutral “it” is right there.
In a world that’s both charged around gender and pronoun usage, and focused on the nature and value of LLMs … I think it’s weird that there isn’t more common pushback enforcing the non-human neutral, for the simple reason that it’s an objective fact amidst a swampy pool of (mis-)information synthesis.
A little like the Bechdel test, I feel like it’s the casualness and indifference around this gender bias (at least at the moment) that’s interesting and telling.


Couldn’t help but notice the casual gendering of Claude to “he” as well.
Someone somewhere made the important observation not long ago that computer assistants tended to be gendered female when more like a secretary (Siri and Alexa) but now that AIs are “intelligent” and powerful … Claude now has to be a male.
Especially weird (and telling?) when it is objectively gender neutral as it’s not human.


Most notable part for me in the article was not the AI stuff … but that Atlassian has never been profitable.
Not surprising for a tech company. But for one as big and kinda foundational in the service it provides … I found it surprising. Imagine if MS or Apple or Google were never profitable and companies were just entirely reliant on their services!
Couple that with how little love anyone has for Jira/Confluence … and yea … good luck with that, Atlassian.


I mean, it makes sense that it’s addictive, right?
I also suspect it’s one of those things that just naturally splits people. For some, the addictiveness and appeal just don’t make sense. For others it’s irresistible.
It’s part of the reason why I’m so doomer on the state of things, from a generally anti-AI/sceptical perspective. There’s just something compulsive that this kind of tool triggers in many people.
Yea this vibes.
For me, it felt like coding was a more attractive term for people who weren’t “proper” computer science and “engineering” types, who weren’t confident that they knew what “program” meant or even “algorithm”, as they were working things out as they went.
I’d guess that as computing involved more and more people with this non-standard background, coding became preferred. I certainly encountered people uncomfortable with my casual use of “algorithm” because it triggered their imposter syndrome, and my pointing out that they write algorithms with the code they write all the time certainly didn’t help.