• DarkThoughts@fedia.io
    3 months ago

    I could see a use for local text generation, but apparently that demands quite a bit more than what desktop PCs can offer if you want genuinely good results and speed. Generally, though, I’d rather have separate expansion cards for this. Building it into other processors is just going to raise their price, even for those who have no use for it.

    • BlackLaZoR@kbin.run
      3 months ago

      There are local models for text generation - not as good as ChatGPT, but at the same time they’re uncensored - so they may or may not be useful.