Cargo is a pretty good tool, but it’ll happily fill up your disk with cached copies of every crate you’ve ever downloaded, stashed somewhere in your user folder. Using a modern filesystem like BTRFS with extent deduplication helps save space, but it doesn’t solve the underlying problem.
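For anyone wondering where that is: with a default setup the downloaded sources live under ~/.cargo/registry, so something like this shows the damage:

du -sh ~/.cargo/registry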
pip and npm are about equally bad, at least in their maintained versions. If you’re on old versions that haven’t been supported for years (Python 2, etc.), you’re in for a world of pain with both.
There are much worse alternatives. Anaconda, for example, takes half an hour to resolve dependencies while also autoloading itself into your shell, adding up to half a second of latency (I timed this! I thought zsh was bugged!) between shell prompts.
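If the prompt latency is the part that bothers you, conda can at least be told not to activate itself in every new shell (assuming a reasonably recent conda):

conda config --set auto_activate_base false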
The worst tool is probably what many C(++) projects seem to do: use Git as a dependency manager by including entire git repos as submodules. I’m pretty annoyed at having to keep multiple versions of tokio around when building a Rust project, but at least I’m not downloading the entire commit history for boost every time I clone a project!
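To be fair, git can be told to skip the history when pulling in submodules; a sketch, with a made-up repo URL:

git clone --recurse-submodules --shallow-submodules https://example.com/some-project.git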
Meh, what else is it supposed to do, delete sources all the time? Then people with slow connections will complain.
Also, size-wise that’s actually not much (though they could take the care to compress it); what actually takes up space with Rust is compile artifacts, per workspace. Have you heard of kondo?
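For the unfamiliar: kondo scans a directory tree and offers to delete build artifacts (target/, node_modules, and so on). The manual version, assuming your projects live under ~/code:

du -sh ~/code/*/target    # see which target dirs are the hogs
cargo clean               # run inside a project to drop its artifacts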
Idk, maybe you can share the common packages across projects. (That can never go wrong, right? /s)
Sources are shared; sharing compile-time artefacts is done within workspaces.
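For context, “within workspaces” just means crates that share a root Cargo.toml along these lines (member names made up):

[workspace]
members = ["app", "lib-a"]

All members then compile into one shared target/ directory at the workspace root.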
Oh… I did mean sharing comptime artifacts
You can share compile artifacts across projects by setting a shared target directory in your global Cargo config.
In $HOME/.cargo/config.toml:
[build]
target-dir = "/path/to/dir"
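The same thing works per-invocation via the CARGO_TARGET_DIR environment variable, if you’d rather try it out before making it permanent (path is just an example):

export CARGO_TARGET_DIR="$HOME/.cache/cargo-target"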
The only problems I had when I did it were some cargo plugins and some dependencies with build.rs files that expected the target folder in its usual location.
I actually vastly prefer this behavior. It allows me to jump to (readable) source in library code easily in my editor, as well as experiment with different package versions without having to redownload, and (sort of) work offline too. I guess I don’t really know what it would do otherwise. I think Rust requires you to have the complete library source code for everything you’re using regardless.
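The offline part really does work, by the way: once the sources are cached, cargo will build against them without touching the network if you ask:

cargo build --offline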
I suppose it could act like NPM, and keep a separate copy of every library for every single project on my system, but that’s even less efficient. Yes, I think NPM only downloads the “built” files (if the package uses a build system & is properly configured), but it’s still just minified JS source code most of the time.
With Python and virtualenv you can also keep the entire source of your libraries in your project.
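The standard venv workflow, for reference (package name is just an example):

python -m venv .venv
source .venv/bin/activate
pip install requests    # sources end up under .venv/lib/python*/site-packages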