• Honytawk@lemmy.zip · 3 months ago
      • The ones who have investments in AI

      • The ones who listen to the marketing

      • The ones who are big Weird Al fans

      • The ones who didn’t understand the question

    • x0x7@lemmy.world · 3 months ago

      Maybe people doing AI development who want the option of running local models.

      But baking AI into all consumer hardware is dumb. Very few want it. SaaS AI is a thing, and to the degree SaaS AI doesn’t offer the privacy of local AI, networked local AI on devices you don’t fully control offers even less. So it makes no sense for people who value convenience, and it offers no value to people who want privacy. It only offers value to people doing software development who need more playground options, and I can go buy a graphics card myself, thank you very much.

    • barfplanet@lemmy.world · 3 months ago

      I’m interested in hardware that can better run local models. Right now the best bet is a GPU, but I’d be interested in a laptop with dedicated AI chips that would work with PyTorch. I’m a novice, but I know it takes forever on my current laptop.

      Not interested in running Copilot better, though.
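
      For anyone poking at the same thing, here’s a minimal sketch of how you might check which accelerator backends your PyTorch build can actually use before committing to hardware. The pick_device helper is just an illustration, and the CUDA/MPS checks only succeed if PyTorch was built with that backend and the hardware/driver is present:

      ```python
      # Minimal sketch: probe which PyTorch accelerator backends are usable on this machine.
      # Assumes a recent PyTorch build; availability depends on how PyTorch was compiled
      # and on the drivers installed.
      import torch

      def pick_device() -> torch.device:
          """Return the best available device for running a local model."""
          if torch.cuda.is_available():          # NVIDIA GPUs (ROCm builds also report as 'cuda')
              return torch.device("cuda")
          if torch.backends.mps.is_available():  # Apple-silicon GPU
              return torch.device("mps")
          return torch.device("cpu")             # Fallback: everything still runs, just slowly

      device = pick_device()
      print(f"Running on: {device}")

      # Quick sanity check that tensor math actually executes on the chosen device.
      x = torch.randn(1024, 1024, device=device)
      y = x @ x
      print(y.shape)
      ```

      If that prints "cpu", the laptop has nothing PyTorch can offload to, which is exactly why training or inference feels so slow.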