There’s a game called Suck Up that is basically that, you play as a vampire that needs to trick AI-powered NPCs into inviting you inside their house.
That sounds amazing - OMW to check it out!
Now THAT is the AI innovation I’m here for
LLMs are in a position to make boring NPCs much better.
Once they can be run locally at a good speed, it’ll be a game changer.
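Just to sketch what that could look like: a persona prompt plus a rolling dialogue memory, fed to whatever local model you run. The `local_llm()` function here is a stub standing in for a real backend (llama.cpp, GPT4All, etc.), and the `NPC` class is purely hypothetical, not from any actual game:

```python
def local_llm(prompt: str) -> str:
    # Stub: a real backend would generate a reply from the prompt here.
    return "Begone, stranger. The door stays shut."

class NPC:
    """Hypothetical LLM-driven NPC with a persona and short-term memory."""

    def __init__(self, name: str, persona: str, max_turns: int = 8):
        self.name = name
        self.persona = persona
        self.history: list[str] = []   # rolling dialogue memory
        self.max_turns = max_turns

    def say(self, player_line: str) -> str:
        self.history.append(f"Player: {player_line}")
        # Keep only the most recent turns so the prompt stays small
        # enough for a local model to handle quickly.
        recent = self.history[-self.max_turns:]
        prompt = (
            f"You are {self.name}. {self.persona}\n"
            + "\n".join(recent)
            + f"\n{self.name}:"
        )
        reply = local_llm(prompt)
        self.history.append(f"{self.name}: {reply}")
        return reply

villager = NPC("Greta", "A suspicious villager who never lets strangers in.")
print(villager.say("Good evening! Mind if I come inside?"))
```

Swap `local_llm()` for a real local inference call and every villager gets its own personality and memory instead of a canned dialogue tree.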
I reckon we’ll start getting AI cards for computers soon.
We already do! And on the cheap! I have a Coral TPU running presence detection on some security cameras. I’m pretty sure they can run LLMs, but I haven’t looked around.
GPT4All runs rather well on a 2060, and I’d imagine it runs a lot better on newer hardware.
that sounds so cool ngl, finally an actually good use for ai