Uh, depends on your hardware and model, but probably TabbyAPI?
Text-generation-webui is cool, but also kinda crufty. Honestly, a lot of its features are holdovers from what's now ancient history in LLM land, and (for me) it has major performance issues at longer context.
I have an old Lenovo laptop with an NVIDIA graphics card.
@Maroon@lemmy.world The biggest question I have for you is what graphics card, but generally speaking this is… less than ideal.
To answer your question, Open Web UI is the new hotness: https://github.com/open-webui/open-webui
I personally use exui for a lot of my LLM work, but that’s because I’m an uber minimalist.
And on your setup, I would host the best model you can on kobold.cpp or the built-in llama.cpp server (just not Ollama) and use Open Web UI as your front end. You can also use llama.cpp to host an embeddings model for RAG, if you wish.
This is a general ranking of the “best” models for document answering and summarization: https://huggingface.co/spaces/vectara/Hallucination-evaluation-leaderboard
…But generally, I prefer to not mess with RAG retrieval and just slap the context I want into the LLM myself, and for this, the performance of your machine is kind of critical (depending on just how much “context” you want it to cover). I know this is !selfhosted, but once you get your setup dialed in, you may consider making calls to an API like Groq, Cerebras or whatever, or even renting a Runpod GPU instance if that’s in your time/money budget.
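If you want to go the same "manual RAG" route, here's a rough sketch of what that looks like against a local llama.cpp or kobold.cpp server, both of which expose an OpenAI-compatible `/v1/chat/completions` endpoint. The URL, port, and model name are assumptions; match them to however you launched your server:

```python
# Minimal sketch: skip the retriever and stuff the whole document into the
# prompt, then query a local llama.cpp / kobold.cpp server over its
# OpenAI-compatible chat endpoint. Server URL and model name are assumptions.
import json
import urllib.request

SERVER = "http://localhost:8080/v1/chat/completions"  # llama.cpp's default port

def build_payload(document: str, question: str, max_tokens: int = 512) -> dict:
    """Pack the full document text into the prompt instead of using retrieval."""
    return {
        "model": "local",  # llama.cpp serves whatever model it loaded at startup
        "messages": [
            {"role": "system",
             "content": "Answer using only the provided document."},
            {"role": "user",
             "content": f"Document:\n{document}\n\nQuestion: {question}"},
        ],
        "max_tokens": max_tokens,
        "temperature": 0.2,  # keep it low for factual document Q&A
    }

def ask(document: str, question: str) -> str:
    """POST the stuffed prompt to the local server and return the answer text."""
    req = urllib.request.Request(
        SERVER,
        data=json.dumps(build_payload(document, question)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

The catch, as mentioned: the whole document has to fit in your model's context window, which is where your hardware (and context length settings) become the bottleneck.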
Sam is actually a liar though.
Everyone in open source AI has been calling him a snake ever since llama1 came out. If you want a more authoritative source, look to the CEO of huggingface, oldschool AI researchers and such.
A letter seen by Reuters, sent by Vivaldi, Waterfox, and Wavebox, and supported by a group of web developers, also supports Opera’s move to take the EC to court over its decision to exclude Microsoft Edge from being subject to the Digital Markets Act (DMA).
OK…
Shouldn’t they be fighting Chrome, more than anything? Surely there’s a legal avenue for that, though I guess there’s a risk of getting deprioritized by Google and basically disappearing.
I somehow didn't get a notification for this post, but that's a terrible idea lol.
We already have AI horde, and it has nothing to do with blockchain. We also have APIs and GPU services… that have nothing to do with blockchain, and have no need for blockchain.
Someone apparently already tried the scheme you are describing, and absolutely no one in the wider AI community uses it.
This is true for sooo many games, especially CPU heavy simulation games.
As long as devs officially support and test the Proton version, I don’t have a problem with it. Sure it seems convoluted… but it’s also a hundred times simpler for the dev, and I don’t think the linux community should shame them for it.
Twitter screenshot of this linked in slack that evening.
The modern internet in a nutshell, lol.
The movement to X isn’t universal, it’s more of a “last resort” where a few communities flounder.
Discord is the dominant destination though.
Where's a Johnny cab when you need it?
Or a Delamain.
I would only use the open source models anyway, but it just seems rather silly from what I can tell.
I feel like the last few months have been an inflection point, at least for me. Qwen 2.5, and the new Command-R, really make a 24GB GPU feel “dumb, but smart,” useful enough so I pretty much always keep Qwen 32B loaded on the desktop for its sheer utility.
It’s still in the realm of enthusiast hardware (aka a used 3090), but hopefully that’s about to be shaken up with bitnet and some stuff from AMD/Intel.
Altman is literally a vampire though, and thankfully I think he’s going to burn OpenAI to the ground.
Discord is even worse, as you need to find an invite to a specific Discord, and sometimes go through a lengthy sign up process for each Discord.
Some won’t let you sign up without a phone #.
Matrix.
And… Lemmy.
It doesn’t matter though, the problem is the critical mass is migrating to Discord and shunting everything out of view. Honestly that’s much worse than being on Reddit, even now.
I’m a bit salty this was apparently announced through Discord. Was it even posted anywhere else?
The future of social media is fragmented siloes, I guess.
Ideally they would subscribe and then watch a different service.
That's so cynical and self-defeating. "They'll use our competition and save us money." But you're not wrong, they could totally be thinking that rofl.
Or maybe it’s a retroactive contract negotiation tactic. Basically negotiate or you won’t get any residuals.
Very possible. I guess all that is even more behind-the-curtain than cable, as when shows disappear there is no reason given, no “protest” like some channels will do.
I feel like streaming has made all this stuff even more opaque.
But… wouldn’t that “take up” the views of other shows?
In other words, the consumer has X amount of time for HBO Max, so they’d either be paying for more views on another show or a potential lost subscriber who can’t find anything to watch, right?
And, with all due respect, there’s no way cartoons are more expensive-per-view than other shows.
Israel’s “Special Military Operation”
Even Russia learned their lesson in this part of the world.
I mean, as long as the bots click on ads, everyone is happy? Riiight?
Yes, but this is an offline game, and I’ve never seen such a warning without some plausible justification. There’s no basis for interfering with an online component here, so what would Larian even say as they sent warnings?
Using a legally purchased offline game "illegally" would be quite a precedent, no?
My guess is that it won’t get shut down because WOTC can’t make Larian bully people into shutting it down.
Yeah. Larian didn’t seem very interested in blocking this capability (they left all this stuff in the executable), like they did the absolute minimum they were contractually obligated to do lol.
Plot twist for me:
The mother also has ADD.