by harryvederci on 3/9/25, 9:23 AM with 231 comments
by quickslowdown on 3/12/25, 9:18 PM
I've used 'em all: pip + virtualenv, conda (and all its variants), Poetry, and PDM (my personal favorite before switching to uv). Uv handles everything I need in a way that makes it so I don't have to reach for other tools, or really even think about what uv is doing. It just works, and it works great.
I even use it for small scripts. You can run "uv init --script <script_name.py>" and then "uv add package1 package2 package3 --script <script_name.py>". This adds an oddly formatted comment to the top of the script and instructs uv which packages to install when you run it. The first time you run "uv run <script_name.py>," uv installs everything you need and executes the script. Subsequent executions use the cached dependencies so it starts immediately.
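That "oddly formatted comment" is PEP 723 inline script metadata. A minimal sketch of what a script looks like after those two commands (the file name and the "requests" dependency are illustrative; the body here uses only the stdlib):

```python
# demo.py -- the header that "uv init --script" / "uv add --script" writes
# (PEP 723 inline metadata; "requests" is an illustrative dependency)
# /// script
# requires-python = ">=3.11"
# dependencies = [
#     "requests",
# ]
# ///

# "uv run demo.py" reads the block above, installs the listed packages into
# a cached environment, then executes the script body below.
import sys

message = f"hello from Python {sys.version_info.major}.{sys.version_info.minor}"
print(message)
```

Since the metadata lives in a comment, the file is still a plain Python script and runs unchanged under a regular interpreter; uv only uses the block to decide what to install.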
If you're going to ask me to pitch you on why it's better than your current preference, I'm not going to do that. Uv is very easy to install and test, so I really recommend giving it a try on your next script or pet project!
by IshKebab on 3/12/25, 10:17 PM
I think all the other projects (pyenv, poetry, pip, etc.) should voluntarily retire for the good of Python. If everyone moved to Uv right now, Python would be in a far better place. I'm serious. (It's not going to happen though because the Python community has no taste.)
The only very minor issue I've had is that once or twice the package cache invalidation didn't work correctly and `uv pip install` installed an outdated package until I ran `uv cache clean`. Not a big deal, though, considering it solves so many Python clusterfucks.
by kubav027 on 3/12/25, 10:07 PM
I have already set up CI/CD pipelines for my programs and Python libraries. Using uv would probably save some time on dependency updates, but it would require changing my workflow and CI/CD, and I do not think that is worth the time right now.
But if you are on an older setup without a proper lock file, I would recommend switching immediately. Poetry v2 supports a pyproject.toml close to the format used by uv, so I can switch any time it looks more appealing.
Another thing to consider in the long term is how Astral's tooling will change once they need to make money.
by TheIronYuppie on 3/13/25, 12:37 AM
E.g.:
#!/usr/bin/env python3
# /// script
# requires-python = ">=3.11"
# dependencies = [
#     "psycopg2-binary",
#     "pyyaml",
# ]
# ///
Then - uv run -s file.py
by vslira on 3/12/25, 9:52 PM
Uv makes python go from "batteries included" to "attached to a nuclear reactor"
by selimnairb on 3/13/25, 11:17 AM
In what I’ve done, I’ve never found things like pipenv, let alone uv, to be necessary. Am I missing something? What would uv get me?
by BiteCode_dev on 3/12/25, 10:07 PM
It replaces a whole stack, and does each feature better, faster, with fewer modes of failure.
by rsyring on 3/12/25, 9:13 PM
I also use mise with it, which is a great combination and gives you automatic venv activation among other things.
See, among other mise docs related to Python, https://mise.jdx.dev/mise-cookbook/python.html
See also a Python project template I maintain built on mise + uv: https://github.com/level12/coppy
by jillesvangurp on 3/13/25, 8:34 AM
It sounds like uv should replace the combination. Of course there is the risk of this being another case of the python community ritually moving the problem every few years without properly solving it. But it sounds like uv is mostly doing the right thing; which is making global package installation the exception rather than the default. Most stuff you install should be for the project only unless you tell it otherwise.
Will give this a try next time I need to do some python stuff.
by xenophonf on 3/13/25, 2:45 AM
Right now, the only thing I really want is dependency pinning in wheels but not pyproject.toml, so I can pip install the source and get the latest and greatest, or pip install a wheel and get the frozen dependencies I used to build the wheel. As it stands, if I want the second case, I have to publish the requirements.txt file and add the wheel to it, which works but is kind of awkward.
by xucian on 3/9/25, 1:31 PM
I'm very comfortable with pyenv, but am extremely open to new stuff
by bigfatfrock on 3/13/25, 2:41 AM
IMO no really hard problem is ever truly solved, but as can be seen in other comments, this group of people really crushed a pain point for me and *many* others, so bravo on that alone. You have truly done humanity a service.
by aequitas on 3/13/25, 8:48 AM
I've tried almost every Python packaging solution under the sun in the past 15 years, but they all had their problems. Finally I just stuck with pip/pip-tools and plain venvs, strung together with a complicated Makefile to optimize the entire workflow for iteration speed (rebuilding .txt files when the .in requirements change, rebuilding the venv if requirements change, etc.). I've been able to reduce it to basically one Make target calling uv to do it all.
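For reference, uv exposes drop-in equivalents for that pip-tools workflow; a rough sketch of what the Make machinery collapses to, assuming uv is installed (file names follow the pip-tools convention the comment mentions):

```shell
# compile requirements.in -> requirements.txt (the pip-tools "lock" step)
uv pip compile requirements.in -o requirements.txt

# create the venv if missing, then make it match the .txt exactly
uv venv .venv
uv pip sync requirements.txt
```

Because these commands are fast and idempotent, the timestamp-checking Make logic largely stops being necessary; you can just rerun them.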
by ashvardanian on 3/12/25, 10:38 PM
At this point, just thinking about updating cibuildwheel images triggers PTSD: the GitHub CI pipelines become unbearably slow, even for raw CPython bindings that don’t require libc or pybind11. It’s especially frustrating because Python is arguably the ultimate glue language for native libraries. If Astral’s tooling could streamline this part of the workflow, I think we’d see a significant boost in the pace of both development and adoption for native and hardware-accelerated tools.
by zahlman on 3/13/25, 3:28 AM
FWIW, I was able to confirm that the listed primary dependencies account for everything in the `pip freeze` list. (Initially, `userpath` and `pyrsistent` were missing, but they appeared after pinning back the versions of other dependencies. The only project for which I couldn't get a wheel was `python-hglib`, which turned out to be pure Python with a relatively straightforward `setup.py`.)
by mafro on 3/13/25, 10:24 AM
Has anyone used both hatch and uv, and could comment on that comparison?
EDIT: quick google gives me these opinions[1]
[1]: https://www.reddit.com/r/Python/comments/1gaz3tm/hatch_or_uv...
by lmeyerov on 3/12/25, 11:33 PM
In the data science world, conda/mamba was needed because of this kind of thing, but there was a lot of room for improvement. We basically want lockfiles, incremental + fast builds, and multi-arch support for these tricky deps.
by theogravity on 3/12/25, 8:32 PM
I just want to create a monorepo with python that's purely for making libraries (no server / apps).
And is it normal to have a venv for each library package you're building in a uv monorepo?
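For what it's worth, uv has first-class workspace support aimed at exactly this monorepo case: one lockfile and one shared venv at the root by default, rather than one venv per library. A minimal sketch (the `packages/*` layout is illustrative):

```toml
# pyproject.toml at the repository root
[project]
name = "monorepo-root"
version = "0.1.0"
requires-python = ">=3.11"

# every directory matching packages/* with its own pyproject.toml
# becomes a workspace member, resolved together in a single uv.lock
[tool.uv.workspace]
members = ["packages/*"]
```

With this in place, `uv sync` from the root builds a single `.venv` containing all members and their dependencies, and members can depend on each other by name via `[tool.uv.sources]` entries marked `workspace = true`.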
by 77ko on 3/13/25, 2:54 AM
Which would fit in with existing uv commands[1] like `uv add plotly`.
There is an existing `uv lock --upgrade-package requests`, but this feels a bit verbose.
[1]: https://docs.astral.sh/uv/guides/projects/#creating-a-new-pr...
by stuaxo on 3/13/25, 1:16 AM
I'm still very keen on virtualenvwrapper; I hope uv's fast dependency resolution and installs can come to it and to Poetry.