• 0 Posts
  • 290 Comments
Joined 2 years ago
Cake day: June 11th, 2023

  • Just get Everything Search and you’ll be able to search just as fast as you could in XP, and with no Bing spam messing up the results.

    Funny¹ thing is that Everything (and similar programs like WizTree) can be that fast because Microsoft’s own NTFS file system has a built-in file index (the Master File Table), which is what Windows Search used back in XP; the search programs hardly have to do any work, since NTFS has already done it for them.
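    To illustrate why index-backed search is so fast, here is a minimal Python sketch (hypothetical helper names, not Everything’s actual implementation): the expensive tree walk happens once to build an in-memory index, and every subsequent search is just a scan of that index, which is essentially what NTFS’s Master File Table hands these tools for free.

```python
import os

def build_index(root):
    """Walk the tree once and cache every file path -- the expensive step.
    NTFS keeps an equivalent index (the Master File Table) up to date at
    all times, which is what tools like Everything read directly."""
    index = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            index.append(os.path.join(dirpath, name))
    return index

def search(index, term):
    """Searching the cached index is a substring scan in memory, so it
    returns near-instantly regardless of disk size or layout."""
    term = term.lower()
    return [p for p in index if term in os.path.basename(p).lower()]
```

    Without the prebuilt index, every query would have to repeat the `os.walk` over the whole disk, which is exactly the slow path later Windows versions fall back to (plus the Bing spam).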

    Of course, that’ll give you the results you want, not the results Microsoft wants, which explains the change in later, further-enshittified versions of Windows.

    ¹ And by funny I mean not funny at all. Sad, in fact. Tragic, even, maybe.


  • I think Trump genuinely wants to be peacemaker

    Trump’s never given a flying fuck about anything other than Trump, and never will.

    He is enough of a narcissistic idiot to believe he could achieve peace in the Middle East by bombing Iran, though, not because he cares about peace but because he genuinely believed it would get him the Nobel Peace Prize, which he needs because Obama of all people got one, and the extremely narcissistic Trump has never gotten over Obama making fun of him at the 2011 White House Correspondents’ Dinner.

    When both Israel and Iran immediately violated his unilateral ceasefire (to the surprise of no one but Trump), he saw his prospects for said Nobel Prize, which he already considered in his pocket, evaporate, which explains his tantrum.


  • in the unable-to-reason-effectively sense

    That’s all LLMs by definition.

    They’re probabilistic text generators, not AI. They’re fundamentally incapable of reasoning in any way, shape or form.

    They just take a text and produce the most probable word to follow it according to their training model; that’s all.
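    The next-word step above can be sketched in a few lines of Python. This is a toy with a hypothetical, hand-written frequency table standing in for a real trained model, but the control flow is the point: pick the statistically most likely continuation, append it, repeat. No step in the loop resembles reasoning.

```python
# Toy "training model": for each context word, the relative frequency of
# the words that followed it in some corpus. (Hypothetical numbers.)
MODEL = {
    "the": {"cat": 0.5, "dog": 0.3, "end": 0.2},
    "cat": {"sat": 0.7, "ran": 0.3},
    "sat": {"down": 1.0},
}

def next_word(context):
    """Pick the single most probable continuation -- just a lookup in
    the frequency table (greedy decoding), no understanding involved."""
    candidates = MODEL.get(context, {})
    if not candidates:
        return None
    return max(candidates, key=candidates.get)

def generate(start, max_words=5):
    """Chain next_word calls: each output word becomes the next context."""
    words = [start]
    while len(words) < max_words:
        word = next_word(words[-1])
        if word is None:
            break
        words.append(word)
    return " ".join(words)
```

    Real LLMs condition on the whole preceding text rather than one word, and sample from the distribution instead of always taking the maximum, but the mechanism is the same: probability lookup, not reasoning.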

    What Musk’s plan (using an LLM to regurgitate as much of its model as it can, expunging all references to Musk being a pedophile and whatnot from the resulting garbage, adding some racism and disinformation for good measure, and training a new model exclusively on that slop) will actually produce is a significantly more limited, hallucination-prone model that occasionally spews racism and disinformation.