The Picard Maneuver@piefed.world to Microblog Memes@lemmy.world · English · 18 hours ago
Did you know GIMP could do that instead? (image via media.piefed.world)
Where can I find open source friends? I’d like to compile them myself
You can fork them too 😏
I’m already available as an executable. A few known bugs, but they won’t accept pull requests for patches on oneself; something about eternal life or something.
I mean uhhhhhh beep boop.
Ah neat, so I can just curl you?
🥺👉👈
Calm down, Dr. Frankenstein.
OpenLlama. Alpaca. Run a local friend model, today!
I… actually futzed around with this just to see if it would work, and… yeah, there actually are models that will run on a Steam Deck, with Bazzite.
EDIT: Got my angry spitting long necked quadrupeds mixed up.
Alpaca is a Flatpak; it literally could not be easier to set up a local LLM.
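For anyone curious what talking to the local model looks like once it's running: here's a minimal sketch that queries an Ollama-style HTTP API, which Alpaca uses under the hood. The port 11434, the model name "llama3.2", and the prompt are assumptions for illustration, not details from the thread.

```python
# Minimal sketch: query a locally running model over an Ollama-style HTTP API.
# Assumptions: the local server listens on port 11434 (Ollama's default) and a
# model called "llama3.2" has already been pulled; swap in whatever you have.
import json
import urllib.request

payload = json.dumps({
    "model": "llama3.2",
    "prompt": "Say hi to your new open source friend.",
    "stream": False,  # ask for one complete response instead of a token stream
}).encode()

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=payload,
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    # the non-streaming reply carries the generated text under "response"
    print(json.loads(resp.read())["response"])
```

Nothing beyond the Python standard library needed, so it should run on a Steam Deck or anywhere else the local model is already serving.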