• chicken@lemmy.dbzer0.com
      1 year ago

      Same, at least for anything but the tiny ones that will fit in my limited VRAM. Hoping a GPU that's actually good for LLMs comes out in the next few years that isn't $15k and made for servers.