• samus7070@programming.dev · 1 year ago

The only reason I can think of is more on-device AI. LLMs like ChatGPT are extremely greedy when it comes to RAM. There are optimizations that squeeze them into a smaller memory footprint at the expense of accuracy/capability. Even some of the best phones out there today are barely capable of running a stripped-down generative AI, and when they do, the output is nowhere near as good as when the full uncompressed model runs on a server.
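To put the RAM point in numbers, here is a rough sketch of the memory needed just for a model's weights at different precisions. The model size (7B parameters) and byte widths are illustrative assumptions, not figures from the comment:

```python
# Back-of-envelope RAM needed for an LLM's weights alone,
# ignoring KV cache and runtime overhead (illustrative numbers).
def weight_memory_gb(num_params: float, bytes_per_param: float) -> float:
    """Gigabytes of memory for the model weights."""
    return num_params * bytes_per_param / 1e9

params_7b = 7e9  # assumed "small" open-model size

fp16 = weight_memory_gb(params_7b, 2.0)   # full 16-bit precision
int4 = weight_memory_gb(params_7b, 0.5)   # 4-bit quantized

print(f"fp16: {fp16:.1f} GB")  # 14.0 GB -- too big for most phones
print(f"int4: {int4:.1f} GB")  # 3.5 GB -- fits, at a quality cost
```

This is why quantization is the usual trick for squeezing a model onto a phone: 4-bit weights cut the footprint by 4x versus fp16, which is exactly the accuracy-for-memory trade the comment describes.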

    • ThankYouVeryMuch@kbin.social · 1 year ago

      For the user? Not at all. For the companies that want their spying/tracking apps running and taking your precious data 24/7? Yes. This way dozens of apps can keep tracking you even after you open a hundred more and forget about them; they can live forever deep down in those 24 GB.

    • OfficerBribe@lemmy.world · 1 year ago

      It will allow future developers to create even less-optimized apps without worrying about how resources are used.