ChatGPT suffered a worldwide outage starting at 06:36 UTC Tuesday morning. The servers weren’t totally down, but queries kept returning errors. OpenAI finally got it mostly fixed later in the day. [OpenAI…
So 12GB is what you need?
Asking because my 4GB card clearly doesn’t cut it 🙍🏼‍♀️
A 4GB card can run smol models; bigger ones require an NVIDIA card plus lots of system RAM, and performance degrades roughly in proportion to how much of the model spills out of VRAM into DRAM.
Big models work great on MacBooks, or on AMD GPUs and AMD APUs with unified memory.
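For a rough feel of the VRAM/DRAM split being described, here's a back-of-envelope sketch (my own made-up helper functions and numbers, not from any library): it estimates a quantized model's weight size and how many layers fit on a 4GB card versus spilling into system RAM.

```python
# Back-of-envelope only: ignores KV cache, activations, and runtime overhead,
# which all eat extra VRAM on top of the weights.

def model_size_gb(n_params_b: float, bits_per_weight: float) -> float:
    """Approximate size of the weights in GB for a quantized model."""
    return n_params_b * 1e9 * bits_per_weight / 8 / 1e9

def split_layers(total_gb: float, vram_gb: float, n_layers: int):
    """Naive even split: layers that fit in VRAM vs. spill into system RAM."""
    per_layer = total_gb / n_layers
    gpu_layers = min(n_layers, int(vram_gb // per_layer))
    return gpu_layers, n_layers - gpu_layers

# A 7B model at 4-bit quantization is ~3.5 GB of weights:
# it (barely) fits on a 4GB card, so a "smol" model is fine.
print(split_layers(model_size_gb(7, 4), vram_gb=4, n_layers=32))   # (32, 0)

# A 13B model at 4-bit is ~6.5 GB: a chunk of it spills to DRAM,
# and that spilled fraction is what tanks performance.
print(split_layers(model_size_gb(13, 4), vram_gb=4, n_layers=40))  # (24, 16)
```

This is the same tradeoff runtimes like llama.cpp expose when you choose how many layers to offload to the GPU: the more layers left in system RAM, the slower each token.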