I’d be interested in the second part in particular. Now that conda is pretty reliable, fast and easy to install, I’m not convinced about using yet another tool that will again confuse end-users, but I’m happy to be convinced by good arguments!
Comments
And @guiwitz.bsky.social I hear you on that. For me, one carrot here is that these are both compiled binaries that have no need for base envs and little ability to muck up other stuff. They are simple installs that can otherwise be ignored (after running a single command) if the end user wants.
They both are so fast at creating environments that they finally make it possible to think of envs as ephemeral, on-demand things. (And long-lived envs are a big source of breakage and confusion for many users.) So it can remove another footgun from the process
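For reference, the installs really are one command each; something like the following (check each project's docs for the current installer, these are from memory):

curl -LsSf https://astral.sh/uv/install.sh | sh
curl -fsSL https://pixi.sh/install.sh | bash

And the ephemeral-env idea in practice, e.g. with uv (the package names are just examples):

uv run --with requests python -c "import requests; print(requests.__version__)"
uvx ruff check .   # run a tool in a throwaway, cached env; nothing to activate or clean up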
Looking forward to testing out uv shortly. Building and sharing conda environments has been a massive hurdle for handing over pipelines to non-computationally savvy researchers
👍 I see these tools as possible hopes for a scheme in which we developers spend a bit more time learning and fixing dependencies, and then end users just “run somescript”
Which can fully spin up a working env (including python itself) and run. No more “first create an env, then…”
We're getting there bit by bit
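That “just run somescript” flow already exists via inline script metadata (PEP 723), which uv understands; a minimal sketch (the script and its dependency here are made up):

cat > somescript.py <<'EOF'
# /// script
# requires-python = ">=3.11"
# dependencies = ["requests"]
# ///
import requests
print(requests.get("https://example.org").status_code)
EOF
uv run somescript.py   # uv resolves the deps, builds a throwaway env, and runs the script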
Yeah I think containerization is excellent for a subset of problems (particularly when you need to communicate across multiple mutually unsolvable envs). But I agree it puts a bit more burden and complexity on the end user. These env builders are a bit “closer to the metal” I suppose.
This sounds almost too good to be true, which is why I haven’t tested those tools yet for fear of breaking something 😅 I’ll definitely try now! Thanks for the clarifications!
Basically, if you've ever been in a workflow where you clone a git repo and then run notebooks, scripts, or tests etc. from it, then uv or pixi project management can make a ton of sense.
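Concretely, the clone-and-run flow looks something like this (the repo URL is hypothetical, and it assumes the project declares the relevant dependencies):

git clone https://github.com/example/analysis-project
cd analysis-project
uv run pytest          # uv builds the env from pyproject.toml/uv.lock, then runs the tests
# or, for a pixi project:
pixi run jupyter lab   # pixi installs from pixi.lock first, then launches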
I still have the impression this is more for power users and doesn’t solve very common issues. For example, there’s sort of a claim that it solves complex installs for any platform. But the given example of JAX+CUDA would still not run on a Mac (unless there’s some magic I don’t understand).
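For context on the cross-platform point: pixi lets a project declare per-platform dependency tables, so the CUDA bits can be scoped to Linux while a Mac still resolves a CPU build. A rough sketch, not a tested config (the cuda-version pin is illustrative):

cat >> pixi.toml <<'EOF'
[dependencies]
jax = "*"

[target.linux-64.dependencies]
cuda-version = "12.*"
EOF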
It's actually even more strict than conda: I had issues with an environment because pixi complained about incompatibilities between dependencies, while conda just went ahead and installed it...
The big thing is lock files.
Last year I helped with a workshop. We had a conda env.yml or req.txt. The night before, a transitive dependency was updated. It broke something in the env. We realized the issue quickly, but pushing a new yml and getting the word out was 😬
A project lock file solves that.
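In tool terms, something like this (flags from memory, so double-check):

uv lock            # resolve the full tree, incl. transitive deps, into uv.lock
uv sync --frozen   # recreate the env exactly as locked; no surprise updates
# pixi maintains pixi.lock automatically; pixi install --frozen is the analogous step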
But I’ll give it a try in my next projects and see how it turns out! Since almost no popular package or course has instructions for this, though, I’ll definitely wait before using it for teaching…
I'm an enthusiastic end-user, investing time to stay up to date on image analysis... Spreading the word to my colleagues works only because I install everything for them... and give them assistance to use the software... We cannot change at that pace unless we provide web apps
I hear ya.
But, new tools do make life easier. I think project-based workflows are more intuitive if you are sharing notebooks/scripts.
Both uv & pixi are small, simple executables: easy to install, and if you never use them again, no harm done.
But, any transition to new tooling will have friction.
The main points I took away: 1. project-oriented tools make sense for things like scripts and notebooks; 2. for reproducibility, non-libraries should use lock files or similar, so that updates to transitive dependencies can’t break things even when you pin direct dependencies.
But yes, introducing new tools is always a trade-off. If some workshops use uv, some pixi, and some miniforge, you start to get a lot of needless cognitive load. That said, I've seen an end user with 3 different conda envs on their prompt:
(foo)(bar)(baz) $
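To make point 2 above concrete (the version numbers are just for illustration):

echo "requests==2.31.0" > requirements.txt
pip install -r requirements.txt
# requests is pinned, but urllib3, certifi, idna, etc. still resolve to whatever
# is newest today, so tomorrow's install may differ from today's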
Yup! If you want to stick with pip then pip-tools is the way.
uv replaces pip & pip-tools. Pixi uses uv for PyPI dependencies.
For classic conda you have conda-lock.
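For anyone who hasn't used them, the basic commands look roughly like this (flags from memory):

pip-compile requirements.in -o requirements.txt   # pip-tools: pin the whole tree
pip-sync requirements.txt                         # install exactly that set
conda-lock -f environment.yml -p linux-64         # conda-lock: per-platform lock file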
Yeah, to be clear my point wasn’t specifically to use one of those two tools. It was to emphasize the underappreciated problem of transitive dependencies and encourage explicit awareness.
uv and pixi are great all-in-one tools that are both faster than all the alternatives in their domains (PyPI and conda respectively), and they also control an (all-important) lock file. So they’re not fundamentally doing something no one has done; I just think they’re doing it better.
From an end-user perspective I find pixi very simple, especially if one uses pixi-tasks. A standard execution pipeline now boils down to:
1. WD=runs/dataset pixi run build_config
2. WD=runs/dataset pixi run workflow
3. WD=runs/dataset pixi run visualization
No conda env create, conda activate...
https://fmi-faim.github.io/example-project/
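For anyone wondering what backs those commands: tasks are just named commands in pixi.toml, roughly like this (the script paths are assumed, not taken from the linked project):

cat >> pixi.toml <<'EOF'
[tasks]
build_config = "python scripts/build_config.py"
workflow = "python scripts/workflow.py"
visualization = "python scripts/visualization.py"
EOF
WD=runs/dataset pixi run workflow   # pixi creates/updates the env on demand, then runs the task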