• 0 Posts
  • 82 Comments
Joined 2 years ago
Cake day: March 3rd, 2024


  • I would love to see the werewolf play the pompous know-it-all: “Um, actually the idea that the moon causes the change is a superstition. It’s a body cycle that often coincidentally matches up with the full moon. People just remember the times during the full moon because of confirmation bias.”


  • I honestly couldn’t get very far because his points were not as clear-cut as he was trying to imply and the tone was confrontational. I have a hard time being told I’m wrong on a matter of personal preference that is individually configurable, and where my choices have no impact on others’ experience.

    If he’s venting about his own experience because the most common choices, which are the defaults, don’t match his preferences, go right ahead. But don’t phrase it as though anyone who disagrees can be demonstrated objectively wrong with a few simple examples.


  • the preceding anonymous immediately-invoked function that englobes the entire first code block/sample is now off-screen and the code blurb itself is different…

    That bothered me a lot. Then I noticed that in his second snippet, only function names were highlighted. What if I’m reviewing someone else’s code and I’m looking for magic strings/numbers that should be factored out as constants or parameters? The first block already shows literal values in a distinct color; does he expect me to change the syntax highlighting settings in my IDE for every task?
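
    For what it’s worth, here’s a tiny illustration (hypothetical names, nothing from his post) of what I’m actually scanning for, and why having literals in a distinct color helps:

    ```python
    # A "magic number" buried in the logic -- easy to miss without highlighting:
    def is_retryable(status_code, attempt):
        return status_code in (429, 503) and attempt < 5  # why 5?

    # The same values factored out as named constants, which is what I'd ask for:
    RETRYABLE_STATUS_CODES = (429, 503)
    MAX_RETRY_ATTEMPTS = 5

    def is_retryable_refactored(status_code, attempt):
        return status_code in RETRYABLE_STATUS_CODES and attempt < MAX_RETRY_ATTEMPTS
    ```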



  • It depends on your definition of “can”. Are his actions allowed by law? No. Will anyone stop Trump from doing them anyway? Probably not.

    I also want to make clear, these aren’t “Democrat agencies.” There aren’t formally “Democrat” and “Republican” agencies in the federal government. National political parties are formally private organizations, and local political parties are affiliated with national ones, with varying levels of control exerted by the national organizations depending on the specific parties involved and their relationships. It’s all complicated, but the salient point is that it’s all non-governmental. The agencies Trump is cutting funding from are governmental agencies that have greater approval/support among segments of the voting populace that generally lean more Democrat in their voting behavior. There are Democrats who don’t support these agencies, and there are Republicans who do. There are also likely people in both parties who support the general cause of the agencies but would prefer they be run differently or have different policies or regulations. Again, in reality it’s complicated and nuanced.

    Calling them “Democrat agencies” is Trump applying tribalistic language in his usual divisive way to drum up support from his base. The voting populations that broadly support these agencies generally lean Democrat, but that’s not catchy and won’t get people angry and vocally in support of Trump. So he calls them “Democrat agencies” to paint a picture that, despite the Republicans having control of literally all branches of the federal government, Democrats directly control these federal agencies (which is not true), and that therefore they are acting against the will of the public, whom he represents by definition (which is also not true), and therefore they should be shut down. It’s right out of the fascist playbook, and when the media even just quotes his language, they enable him to define the language of the discussion of his actions, and thus they further help Trump shape the narrative of the shutdown.

    Nothing in the shutdown gives him the power to do these things. He was in fact doing all of these things before the shutdown, and he had no legal authority to do any of it then either. He’s able to do it because his regime is authoritarian and does whatever it wants, and organizations that stand to benefit from this authoritarian regime have spent the last 50+ years systematically subverting the checks and balances that were built into the federal government to prevent this kind of authoritarianism. Complicit politicians in the legislative branch prevent impeachment and removal from office of anyone in the regime who breaks the law, and complicit Supreme Court judges prevent the judicial branch from delivering injunctions or other judicial relief or safeguards from these actions. There are coordinated (even if it’s just stochastic coordination) bad faith actors at all levels of power in all branches and offices of the US government. It didn’t happen overnight; it took decades, but no one stopped it, so here we are.

    From the legal definition of “can”, Trump in fact cannot do most of what he’s doing. But in America laws don’t matter anymore, so in practical terms he can do literally anything now.


  • “Users accustomed to receiving confident answers to virtually any question would likely abandon such systems rapidly,” the researcher wrote.

    While there are “established methods for quantifying uncertainty,” AI models could end up requiring “significantly more computation than today’s approach,” he argued, “as they must evaluate multiple possible responses and estimate confidence levels.”

    “For a system processing millions of queries daily, this translates to dramatically higher operational costs,” Xing wrote.

    1. They already require substantially more computation than search engines.
    2. They already cost substantially more than search engines.
    3. Their hallucinations make them unusable for any application beyond novelty.

    If removing hallucinations means Joe Shmoe isn’t interested in asking it questions a search engine could already answer, but it brings even 1% of the capability promised by all the hype, they would finally actually have a product. The good long-term business move is absolutely to remove hallucinations and add uncertainty. Let’s see if any of them actually do it.
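
    To make “evaluate multiple possible responses and estimate confidence levels” concrete, here’s a rough sketch of the idea (the `generate` function is a stand-in for whatever model you’d actually sample, not anything from the article): ask several times, measure agreement, and abstain when agreement is low. The extra samples are exactly where the “significantly more computation” comes from.

    ```python
    import random
    from collections import Counter

    def generate(prompt: str) -> str:
        """Stand-in for a model call; a real system would sample an LLM here."""
        return random.choice(["Paris", "Paris", "Paris", "Lyon"])

    def answer_with_confidence(prompt: str, samples: int = 10, threshold: float = 0.7):
        # Sampling multiple candidate responses is where the extra compute goes.
        responses = [generate(prompt) for _ in range(samples)]
        best, count = Counter(responses).most_common(1)[0]
        confidence = count / samples
        # Abstain instead of guessing when the responses don't agree.
        if confidence < threshold:
            return None, confidence
        return best, confidence

    print(answer_with_confidence("What is the capital of France?"))
    ```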


  • For as long as I’ve been paying attention, conservatives have only argued in bad faith. It’s always been about elevating their own speech and suppressing speech that counters theirs. They just couch it in terms that sound vaguely reasonable or logical in the moment if you don’t know their history and don’t think about it beyond the surface level.

    Before, platforms were suppressing their speech, so they were promoters of free speech. Now platforms are not suppressing speech that counters theirs, so it’s all about content moderation to protect the children, or whatever. But their policies always betray their true motive: they never implement what research shows supports their claimed position of the moment. They always create policies that hurt their out-groups and may sometimes help their in-groups (helping people is optional).


  • Can we be so sure such a stock market dip is due to the ongoing daytime TV drama that is AI?

    There’s also the undercurrent of the Trump administration steamrolling over decades- or century-old precedents daily, putting our country, and thus the economy, in new territory. Basic assumptions about the foundations of our economy are crumbling, and the only thing keeping it from collapsing outright is inertia. But inertia will only last so long. This is affecting every aspect of the real economy, goods and services that are moving around right now, as opposed to the speculative facets like the AI bubble.

    I’m waiting for the other shoe to drop and for Wall Street to realize Trump has really screwed over vast swaths of supply chains all across the economy.


  • My understanding is that digital computers rose to dominance not because of any superiority in capability but basically because of error tolerance. When the intended values can only be “on” or “off,” your circuit can be really poor due to age, wear, or other factors, but as long as the signal is within 40% of the expected “on” or “off” level, it functions basically the same as a perfect circuit. Analog computers don’t have anywhere near that kind of tolerance, which makes them more fragile, more expensive, and harder to scale in production.

    I’m really curious if the researchers address any of those considerations.
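
    Here’s a back-of-the-envelope sketch of what that tolerance buys you (assumed numbers, purely illustrative): push the same noise through a “digital” signal and an “analog” one, and only the analog value actually degrades.

    ```python
    import random

    random.seed(0)
    NOISE = 0.4  # up to 40% of full scale, per the tolerance claim above

    def read_digital(true_bit: int) -> int:
        # A noisy voltage still snaps to the nearest rail after thresholding.
        noisy = true_bit + random.uniform(-NOISE, NOISE)
        return 1 if noisy >= 0.5 else 0

    def read_analog(true_value: float) -> float:
        # The same noise lands directly in the value; there is nothing to snap to.
        return true_value + random.uniform(-NOISE, NOISE)

    bits = [random.randint(0, 1) for _ in range(1000)]
    errors = sum(read_digital(b) != b for b in bits)
    print("digital read errors:", errors)             # 0: every bit recovered exactly
    print("analog read of 0.25:", read_analog(0.25))  # off by up to +/- 0.4
    ```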


  • Vibe coding anything more complicated than the most trivial example toy app creates a mountain of security vulnerabilities. Every company that fires human software developers and actually deploys applications entirely written by AI will have their systems hacked immediately. They will either close up shop, hire more software security experts than the number of developers they fired just to keep up with the garbage AI-generated code, or try to hire all of the software developers back.
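
    A hypothetical but depressingly typical example of what I mean: generated code that builds SQL by string formatting, which “works” in the demo and is a textbook injection hole, next to the parameterized version any reviewer would insist on.

    ```python
    import sqlite3

    def get_user_unsafe(conn: sqlite3.Connection, username: str):
        # username = "x' OR '1'='1" returns every row in the table.
        query = f"SELECT * FROM users WHERE name = '{username}'"
        return conn.execute(query).fetchall()

    def get_user_safe(conn: sqlite3.Connection, username: str):
        # Parameterized query: the driver handles escaping, no injection.
        return conn.execute("SELECT * FROM users WHERE name = ?", (username,)).fetchall()
    ```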




  • Several years ago I created a Slack bot that ran something like a Jupyter notebook in a container: it would execute Python code that you sent to it and respond with the results. It worked in channels you invited it to as well as in private messages, and if you edited the message containing your code, it would edit its response to match the latest input. It was a fun exercise in learning the Slack API and in creating something non-trivial and marginally useful in that Slack environment. I knew the horrible security implications of such a bot, even with the Python environment containerized, and never considered opening it up beyond my own personal use.

    Looks like the AI companies have decided that exact architecture is perfectly safe and secure as long as you obfuscate the input pathway by having to go through a chat-bot. Brilliant.
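
    For the curious, the execution core of a bot like that boils down to something like this (a minimal sketch of how such a bot might work, not the original code, with the Slack plumbing and container setup omitted): run the submitted snippet in a child interpreter with a timeout and send back whatever it printed. Containerizing the child helps, but the input is still arbitrary code execution driven by whatever lands in the chat.

    ```python
    import subprocess
    import sys

    def run_snippet(code: str, timeout_s: float = 5.0) -> str:
        """Execute untrusted Python in a child interpreter and capture its output."""
        try:
            result = subprocess.run(
                [sys.executable, "-c", code],
                capture_output=True,
                text=True,
                timeout=timeout_s,
            )
        except subprocess.TimeoutExpired:
            return "error: execution timed out"
        return result.stdout if result.returncode == 0 else result.stderr

    print(run_snippet("print(sum(range(10)))"))  # -> 45
    ```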


  • A balloon full of helium has more mass than a balloon without helium, but less weight

    That’s not true. A balloon full of helium has more mass and more weight than a balloon without helium. Weight is dependent only on the mass of the balloon+helium and the mass of the planet (Earth).

    The balloon full of helium displaces far more air than the balloon without helium, since it is inflated. That displaced air weighs more than the balloon and the helium inside it combined, so the balloon floats due to buoyancy from the atmosphere. Its weight is the same regardless of the medium it’s in, but the net force it experiences is not.
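
    Putting rough numbers on it (assumed values for an ordinary party balloon, just for illustration): adding the helium barely changes the weight, but the buoyant force from the displaced air dwarfs both.

    ```python
    G = 9.81              # m/s^2
    RHO_AIR = 1.2         # kg/m^3, roughly room temperature at sea level
    RHO_HELIUM = 0.17     # kg/m^3
    VOLUME = 0.014        # m^3, roughly a 30 cm diameter party balloon
    BALLOON_MASS = 0.002  # kg, ~2 g of latex

    helium_mass = RHO_HELIUM * VOLUME          # ~0.0024 kg
    weight = (BALLOON_MASS + helium_mass) * G  # downward force on balloon + gas
    buoyancy = RHO_AIR * VOLUME * G            # weight of the displaced air

    print(f"weight of balloon + helium: {weight:.3f} N")    # ~0.043 N
    print(f"buoyant force from the air: {buoyancy:.3f} N")  # ~0.165 N
    print("net force is upward:", buoyancy > weight)
    ```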



  • ignirtoq@fedia.io to Technology@beehaw.org · The rise of Whatever · 4 months ago

    The thing is it’s been like that forever. Good products made by small- to medium-sized businesses have always attracted buyouts where the new owner basically converts the good reputation of the original into money through cutting corners, laying off critical workers, and other strategies that slowly (or quickly) make the product worse. Eventually the formerly good product gets bad enough there’s space in the market for an entrepreneur to introduce a new good product, and the cycle repeats.

    I think what’s different now is, since this has gone on unabated for 70+ years, economic inequality means the people with good ideas for products can’t afford to become entrepreneurs anymore. The market openings are there, but the people that made everything so bad now have all the money. So the cycle is broken not by good products staying good, but by bad products having no replacements.


  • The technological progress LLMs represent has come to completion. They’re a technological dead end. They have no practical application because of hallucinations, and hallucinations are baked into the very core of how they work. Any further progress will come from experts learning from the successes and failures of LLMs, abandoning them, and building entirely new AI systems.

    AI as a general field is not a dead end, and it will continue to improve. But we’re nowhere near the AGI that tech CEOs keep promising LLMs are so close to achieving.


  • Oppenheimer was already really long, and I feel like it portrayed the complexity of the moral struggle Oppenheimer faced pretty well, as well as showing him as the very fallible human being he was. You can’t make a movie that talks about every aspect of such an historical event as the development and use of the first atomic bombs. There’s just too much. It would have to be a documentary, and even then it would be days long. Just because it wasn’t the story James Cameron considers the most compelling/important about the development of the atomic bomb doesn’t mean it’s not a compelling/important story.


  • The first statement is not even wholly true. While training does take more power, executing the model (called “inference”) still takes much, much more power than non-AI search algorithms, or really any traditional computational algorithm besides bogosort.

    Big Tech weren’t doing the best they possibly could transitioning to green energy, but they were making substantial progress before LLMs exploded on the scene because the value proposition was there: traditional algorithms were efficient enough that the PR gain from doing the green energy transition offset the cost.

    Now Big Tech have for some reason decided that LLMs represent the biggest game of gambling ever. The first to find the breakthrough to AGI will win it all and completely take over all IT markets, so they need to consume as much as they can get away with to maximize the probability that the breakthrough comes from their engineers.