Phoenix3875@lemmy.world to Programmer Humor@programming.dev · 1 month ago
Work of pure human soul (and pure human sweat, and pure human tears)
passepartout@feddit.org · 1 month ago
If you have a supported GPU, you could try Ollama (with Open WebUI); it works like a charm.

bi_tux@lemmy.world · 1 month ago
You don't even need a supported GPU, I run Ollama on my RX 6700 XT.

BaroqueInMind@lemmy.one · 1 month ago
You don't even need a GPU, I can run Ollama with Open WebUI on my CPU with an 8B model fast af.

bi_tux@lemmy.world · 1 month ago
I tried it with my CPU (with Llama 3 7B), but unfortunately it ran really slow (I have a Ryzen 5700X).
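For anyone curious what the Ollama setup mentioned above looks like in practice, here is a minimal sketch using the official `ollama` Python client. It assumes the Ollama server is already running locally and that an 8B model has been pulled (e.g. with `ollama pull llama3`); the model name and prompt are just illustrative.

```python
# Minimal sketch: chat with a locally running Ollama server via the
# official Python client (pip install ollama).
# Assumes `ollama serve` is running and "llama3" has been pulled already.
import ollama

response = ollama.chat(
    model="llama3",  # illustrative; swap in whatever model you pulled
    messages=[
        {"role": "user", "content": "Explain in one sentence why a GPU speeds up LLM inference."}
    ],
)

# Print the assistant's reply text
print(response["message"]["content"])
```

Whether this feels "fast af" or "really slow" depends mostly on whether the model fits in VRAM (or how many CPU cores and how much memory bandwidth you have when running CPU-only).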