If you run out of RAM, it's better for the OS to kill something outright than to slow to a crawl or freeze.
That very much depends on what you're doing. Even with rather random access patterns (e.g. compiling) swapping out doesn't crash performance terribly, mostly because my RAM isn't exactly under-dimensioned (I used the rule of thumb "one gig per hardware thread, round up"). For more regular access patterns such as merging SDXL models (which definitely eats all my 16G) the impact is even less. For, dunno, highly complex and irregular Datalog queries over a database four times as large as your RAM – yeah, that won't be nearly as fast.
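That rule of thumb can be sketched as a tiny shell snippet. Reading "round up" as rounding to the next common module size (a power of two) is my own interpretation, though it does match 12 threads → 16G:

```shell
#!/bin/sh
# Sketch of the "one GiB per hardware thread, round up" rule of thumb.
# "Round up" is interpreted here as the next power-of-two module size.
threads=$(nproc)            # number of available hardware threads
ram=1
while [ "$ram" -lt "$threads" ]; do
    ram=$((ram * 2))        # 1, 2, 4, 8, 16, ...
done
echo "Suggested RAM: ${ram} GiB"
```

For a 6-core/12-thread Ryzen 3600 this lands on 16 GiB, i.e. exactly the setup described above.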
What you also want to do (under Linux) is enable the earlyoom daemon. Those freezes are Linux being way too nice and not killing processes until literally everything cached and buffered has been purged, including heavily used stuff.
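On a systemd distro, enabling it typically comes down to something like the following (assuming the package is named `earlyoom`, as it is in the Debian, Fedora, and Arch repos; adjust the install command for your package manager):

```shell
# Install and start earlyoom so it kills the biggest offender
# before the kernel grinds the machine to a halt.
sudo apt install earlyoom            # or: dnf install / pacman -S earlyoom
sudo systemctl enable --now earlyoom

# Optionally tune the thresholds via earlyoom's own flags
# (on Debian-style systems, put these in /etc/default/earlyoom):
# EARLYOOM_ARGS="-m 5 -s 10"         # act when <5% RAM and <10% swap remain free
```

By default earlyoom sends SIGTERM first and only escalates to SIGKILL, so well-behaved processes get a chance to exit cleanly.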
Then you'll see behaviour such as switching tabs in the browser, or bringing up a minimised terminal or something, actually taking a second or two because they got swapped out. But it's nowhere close to unusable and, thanks to a 3 GB/s SSD, a way better experience than in the 90s with a couple megabytes of RAM and swapping to a glorified flywheel.
I could buy more RAM, DDR4 prices have pretty much tanked after all; OTOH I only hit swap for a couple of minutes every other month. Not worth it.
While an interesting read, my comment isn't directed at advanced users who know their stuff. Also, not everyone owns a beast; most users have a medium- or low-tier PC. Funnily enough, 16GB of RAM for your usage seems low. Double that and you'll never need swap unless you want to play with the biggest local LLMs via llama.cpp. But then, you likely have a GPU beast.
Nah, I'm only dabbling around with SDXL, I have a 4G RX 5500. Four years ago a Ryzen 3600 was certainly respectable (and slightly more expensive than the GPU) but it's nowhere near high end; it was a slam dunk in the middle of the price/performance optimum and is generally sufficient for my workloads.
I don't think I'll upgrade this box (short of an SSD or such) before either GPU prices are sane again, and/or CPUs actually become noticeably faster. None of the AM4 ones really seem to be worth it. See, I'm an old-fart millennial: I'm used to a "two years later, get twice the performance at half the price" kind of cadence, and the two years have been steadily getting longer.
Sadly, I don't think GPU prices will ever drop. The only thing that would change that is a new competitor coming up with something on par with or better than the current tech. For CPUs, we've reached the limit already. Nonetheless, there's still some hope with a tech called optical computing.
Increases in computing power became meaningless because companies just use them as an excuse to stop optimizing their software. The best example is how everyone basically ships a whole browser (Chromium) just to show some GUI. The Steam client and Discord are two big pieces of software that come to mind, but it's spreading fast: I found a GUI for aria2c that weighs more than 60 MB while aria2c itself is a few megs, and even worse, some manufacturers are doing it for their mouse and keyboard drivers, for god's sake.