I discovered uv thanks to an article of yours and I am so grateful for it. It enabled proper dependency management for beginners and helped so much in learning about the Python packaging ecosystem.
If you depend on setuptools behaviour, e.g. building of Cython modules, then it becomes a pain to move away from setup.py. So it's more of an indirect issue.
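For what it's worth, uv fronts whatever PEP 517 backend the project declares, so a setup.py-based Cython build can usually stay in place. A hedged sketch, with illustrative build requirements:

```sh
# Keep setuptools as the build backend so existing setup.py logic
# (e.g. Cython ext_modules) still runs; uv only drives the build.
cat >> pyproject.toml <<'EOF'
[build-system]
requires = ["setuptools", "cython"]
build-backend = "setuptools.build_meta"
EOF
uv build   # invokes the setuptools backend, which executes setup.py as before
```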
I just finished getting my "deep learning stack" working in a consistent way with all the projects I need (it took some work). And by the time I finally get it working there is a new python package manager????????? (uv looks cool)
I've historically used Poetry in professional projects and can't spend the time to migrate them. Rephrasing: I can't charge my customers for an enhancement whose benefits they really can't see. On the other hand, I use ruff in every one of them.
I saw mentions of it for a while before finally reading about it. It sounds great, but Pipenv has been working well enough for me. I'll probably check it out once my current project is more mature. I don't feel like throwing another new, unfamiliar thing into the mix right now.
In my case, it's "why aren't I solely using uv", which is because I haven't gone through the effort of migrating my poetry-powered repos yet. But all my Net New stuff is.
In the company: I need to figure out a way to mirror the standalone python builds (do I?). And at work I'm always three times more conservative before I jump on a hype train, even though this time I am fully convinced there is no risk - only benefits
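One possible answer to the mirroring question, sketched under the assumption that an internal artifact server hosts copies of the python-build-standalone archives (the URL is a placeholder):

```sh
# Point uv's interpreter downloads at an internal mirror instead of GitHub.
export UV_PYTHON_INSTALL_MIRROR="https://artifacts.example.internal/python-build-standalone"
uv python install 3.12   # now fetched from the mirror
```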
Python packaging is suddenly a profound mess. I used uv yesterday to try to get browser-use working, and it ended up installing two sets of everything. I'll probably just use devcontainers.
The long history of predecessors makes me sceptical of the N+1st.
I've tried uv and like it so far but I also manage with just pip, virtual, and the like. Speed is wonderful but so is not having to migrate if N+2 ends up winning.
After looking into it, the speedup alone makes it worth trying out!
Outside of that, it doesn't offer any solutions to problems I've hit. virtualenv and requirements files handle my python versions and lockfile. Things vanishing from pypi or unsolvable dependencies in large projects are more common.
That's only an issue if you use dynamic metadata. You can force it to reinstall every time, or stop using dynamic metadata (which I recommend!!!)
Mercurial uses dynamic metadata (for its version) and I cannot change that. I guess I can force uv to do something reasonable, but PDM is very good for this application. I don't see real advantages of uv in this case.
I strongly recommend manually bumping versions and using static metadata. Python is an odd ecosystem in that dynamic metadata is a thing at all. The real advantage of uv here is faster resolve times than PDM.
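A minimal sketch of what static metadata looks like; the project name and version are illustrative:

```sh
# The version is written literally in pyproject.toml and bumped by hand
# (or by a release script), rather than derived from VCS state at build time.
cat > pyproject.toml <<'EOF'
[project]
name = "my-package"
version = "1.2.3"   # static: edit this on each release
EOF
```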
But I have to say the dependency caching has me sold. Most of the time in my cycle I do `uv run …`
However, I also have a reliable workflow with pyenv+pip locally and container+pip for deploys.
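For comparison, a rough sketch of the uv equivalents of that workflow (versions and file names are illustrative):

```sh
uv python install 3.12                # roughly pyenv's job
uv venv --python 3.12                 # create .venv with that interpreter
uv pip install -r requirements.txt    # pip-compatible install into .venv
uv run pytest                         # run a command inside the environment
```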
Worked on a project with rye; it was fine.
Or better yet, use `juv` and define the notebook package requirements in the notebook!
https://github.com/manzt/juv
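A hedged sketch of what that looks like; the subcommand names follow the juv README and may differ across versions:

```sh
uvx juv init analysis.ipynb         # new notebook with inline metadata
uvx juv add analysis.ipynb pandas   # record pandas inside the notebook itself
uvx juv run analysis.ipynb          # open it in an isolated environment
```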
- it doesn't read the pip.conf that we've rolled out everywhere to use our internal pypi mirror (a workaround sketch follows this list)
- because you need to install it before you can use it (in contrast to pip, which is part of the python distribution)
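For the mirror point above, a hedged workaround sketch: uv ignores pip.conf but reads its own environment variable for the index URL (the URL is a placeholder; recent uv versions can also carry index settings in uv.toml):

```sh
export UV_INDEX_URL="https://pypi.example.internal/simple"
uv pip install requests   # resolves against the internal mirror
```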
https://github.com/SadeghPouriyanZadeh/dev-env-setup
As with Poetry, I don't see the point. Setuptools is good enough for me, I guess.
Very easy transition from Poetry.
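One hedged sketch of such a migration, done by hand with illustrative package names (uv init refuses to run over an existing pyproject.toml, so move the Poetry one aside first):

```sh
mv pyproject.toml pyproject.poetry.bak   # keep the old Poetry metadata around
uv init                                  # write a PEP 621 pyproject.toml skeleton
uv add requests numpy                    # port deps from [tool.poetry.dependencies]
uv lock                                  # uv.lock replaces poetry.lock
uv sync                                  # build .venv from the lockfile
```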