• 1 Post
  • 149 Comments
Joined 1 year ago
Cake day: July 5th, 2024

• For the longest time I refused to watch the Halo show because I heard that Master Chief takes off his helmet. But then I gave it a shot, and it’s a really, really good show, and they did the adaptation solid justice.

They made changes where it (mostly) made sense and stayed truthful to everything else.

They set up a backstory that explains how we got the John-117 of the games: someone who is socially reserved, doesn’t talk much, never takes off his helmet, and prefers to work alone. The ending of the second season set up season 3 to start exactly where Halo 1 started.

The music was phenomenal, the cinematography was on point, the acting was great, and the storyline was compelling.

    I’m normally the person who’s a stickler for not changing a story at all, but the Halo universe was originally told through a game that was more about story beats than actual literary writing. So there’s a ton of room for the in-between conversations and events.

I think the show got an undeserved bad rap. If more people gave it a chance, they might actually have liked it.

Halo fans got a genuinely decent show, whereas Wheel of Time and Tolkien fans got abominations.





• CeeBee_Eh@lemmy.world to Selfhosted@lemmy.world · goodbye plex · 2 points · 2 days ago

Not really useless; it’s an extra layer of management (a good thing). The Proxmox system can stay nearly static while giving you out-of-band management of the OS that manages the containers.

I have a 3-server Proxmox cluster running various VMs doing different things. Some of those VMs are my container hosts.

    Besides, you can run containers directly on Proxmox itself.
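For anyone curious, a rough sketch of what that looks like: Proxmox ships the `pveam` and `pct` tools for running LXC containers directly on a node. The template name, container ID, and resource values below are just illustrative.

```shell
# Refresh the template index and grab a container template
# (list what's actually available with `pveam available`)
pveam update
pveam download local debian-12-standard_12.7-1_amd64.tar.zst

# Create and start an unprivileged LXC container (ID 101 is arbitrary)
pct create 101 local:vztmpl/debian-12-standard_12.7-1_amd64.tar.zst \
  --hostname media-host \
  --memory 2048 --cores 2 \
  --net0 name=eth0,bridge=vmbr0,ip=dhcp \
  --unprivileged 1
pct start 101
```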



• CeeBee_Eh@lemmy.world to Selfhosted@lemmy.world · goodbye plex · 2 points · 2 days ago

Sure, ZFS snapshots are dead simple and fast. But you’d need to ensure that each container and its volumes are created in their respective datasets.
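As a sketch of that setup (pool, dataset, and container names are made up): give each container’s state its own dataset, mount the container’s volumes from it, and then snapshots and rollbacks stay scoped to that one container.

```shell
# One dataset per container's persistent data
zfs create tank/containers/jellyfin-config
zfs create tank/containers/jellyfin-cache

# Point the container's volumes at those datasets, e.g. with docker run:
#   -v /tank/containers/jellyfin-config:/config

# Snapshot and roll back a single container's state independently
zfs snapshot tank/containers/jellyfin-config@pre-upgrade
zfs rollback tank/containers/jellyfin-config@pre-upgrade
```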

    And none of this is implying that it’s hard. The top comment was criticizing OP for using VMs instead of containers. Neither one is better than the other for all use cases.

    I have a ton of VMs for various use cases, and some of those VMs are container/Docker hosts. Each tool where it works best.



• CeeBee_Eh@lemmy.world to Selfhosted@lemmy.world · goodbye plex · 1 point · 2 days ago

It’s not the same. You then need to manage volumes separately from images, or, if you’re mounting a host folder for the Jellyfin files, you have to manage those separately via the host.

Container images are supposed to be stateless, so if you’re only backing up the volumes, you need to somehow track which Jellyfin version each backup is tied to, in case you run into any issues.
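A sketch of that bookkeeping (the image tag, volume name, and paths are illustrative, not a recommendation): pin the image to an explicit version instead of `:latest`, and record that version alongside each volume backup.

```shell
# Pin the image to a specific version rather than :latest
docker run -d --name jellyfin \
  -v jellyfin-config:/config \
  jellyfin/jellyfin:10.9.7

# Record which image version this backup belongs to
docker inspect --format '{{.Config.Image}}' jellyfin > jellyfin-image.txt

# Archive the named volume via a throwaway container
docker run --rm -v jellyfin-config:/data -v "$PWD":/backup alpine \
  tar czf /backup/jellyfin-config.tar.gz -C /data .
```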

    A VM is literally all of that but in a much more complete package.


• CeeBee_Eh@lemmy.world to Selfhosted@lemmy.world · goodbye plex · 5 points · 2 days ago

I can back up an entire VM snapshot very quickly and then restore it in a matter of minutes. Everything from the system files, database, and Jellyfin version and configs is backed up and restored in one easy-to-manage bundle.
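On Proxmox specifically, that workflow is roughly `vzdump` plus `qmrestore`; the VM ID, storage name, and archive path below are placeholders for whatever your setup uses.

```shell
# Snapshot-mode backup of VM 100 to the "local" storage, zstd-compressed
vzdump 100 --mode snapshot --storage local --compress zstd

# Restore the whole VM (OS, database, app version, configs) in one step,
# overwriting the existing VM 100
qmrestore /var/lib/vz/dump/vzdump-qemu-100-2025_01_01-00_00_00.vma.zst 100 --force
```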

    A container is not as easy to manage in the same way.








• I’m not wrong. There are mountains of research demonstrating that LLMs encode contextual relationships between words during training.

There’s so much more happening beyond “predicting the next word”. This is one of those unfortunate cases of science communication being dumbed down: it was said once and now it’s just repeated non-stop.

    If you really want a better understanding, watch this video:

    https://youtu.be/UKcWu1l_UNw

    And before your next response starts with “but Apple…”

    Their paper has had many holes poked into it already. Also, it’s not a coincidence their paper released just before their WWDC event which had almost zero AI stuff in it. They flopped so hard on AI that they even have class action lawsuits against them for their false advertising. In fact, it turns out that a lot of their AI demos from last year were completely fabricated and didn’t exist as a product when they announced them. Even some top Apple people only learned of those features during the announcements.

    Apple’s paper on LLMs is completely biased in their favour.