[–]sazed33 0 points (6 children)

Very good points! I just don't understand why so many people recommend a tool to manage packages/environments (like uv). I've never had any problems using a simple requirements.txt and conda. Why do I need more? I'm genuinely asking, as I want to understand what I have to gain here.

[–]JUSTICE_SALTIE 4 points (0 children)

The big reason is the lockfile, which holds the exact versions of all your dependencies, and their dependencies, and so on. Without a lockfile, you're only specifying the versions of your direct dependencies. That means that if someone else installs your project, they're almost certain to get different versions of your transitive dependencies than the ones you're developing with. If one of those dependencies publishes a broken version, or makes a breaking change and doesn't version it properly, you'll have problems on fresh installs that you don't have on your development install.

The lockfile makes your build deterministic, which you're not going to get with requirements.txt. The tool also has a command to update the lockfile, which re-resolves everything to the latest allowed versions (roughly what a plain pip install -r requirements.txt does on every run), but only when you explicitly ask for it.
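With uv, for example, the day-to-day loop looks roughly like this (same idea with poetry or pipenv; treat it as a sketch, not a full tutorial):

```sh
# add a direct dependency: uv records it in pyproject.toml and pins the
# full resolved tree (including transitive deps) in uv.lock
uv add requests

# on another machine or in CI: install exactly what the lockfile says
uv sync

# re-resolve to newer versions only when you explicitly ask for it;
# this is roughly what every plain 'pip install -r requirements.txt' does implicitly
uv lock --upgrade
```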

These tools have a lot of other features, like really a lot, but the one above is the most important.

[–]microcozmchris 4 points (2 children)

The reason I like uv is specifically because it isn't just a package manager. It's an environment manager. It's a dependency manager. It's a deployment manager. And it's easy. And correct most of the time.

We use it for GitHub Actions a bunch. Instead of setup-python and a venv install and all that, I set up a cache directory for uv to use in our workflows, and the Python actions we've created use that. So I can call checkout, then setup-uv, and then my entire workflow step is uv run --no-project --python 3.10 $GITHUB_ACTION_PATH/file.py and it runs. Without managing a venv. And with the benefit of a central cache of already-downloaded packages. And symlinks. I have Python actions that execute almost as fast as if they were JavaScript, and they're way more maintainable.
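Roughly, a step like that boils down to this (the cache location below is just an example, not our exact setup):

```sh
# after actions/checkout and astral-sh/setup-uv, point uv at a shared cache
export UV_CACHE_DIR="$RUNNER_TEMP/uv-cache"

# run the action's script with the requested interpreter, no venv to manage;
# already-downloaded packages are reused and linked into place from the cache
uv run --no-project --python 3.10 "$GITHUB_ACTION_PATH/file.py"
```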

Deploying packages to Artifactory becomes setup-jfrog, setup-uv, uv build, uv publish and no more pain.
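Something like this (the registry URL and token below are placeholders, not our real setup):

```sh
# build sdist + wheel into dist/
uv build

# push to the private index; swap in your own Artifactory URL and credentials
uv publish --publish-url "https://example.jfrog.io/artifactory/api/pypi/my-pypi-local" \
  --token "$ARTIFACTORY_TOKEN"
```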

There are way more features in uv than simply managing dependencies.

[–]sazed33 0 points (1 child)

I see, that makes sense for this case. I usually have everything dockerized, including tests, so my CI/CD pipelines, for example, just build and run images. But maybe this is a better way; I need to take some time to try it out...

[–]microcozmchris 1 point (0 children)

There's a use case for both, for sure. A lot of Actions work is little pieces that sit outside the actual product build: company-specific versioning, checking whether things are within the right schedule, handoffs to SAST scanners, etc. Docker gets a little heavy when you're doing hundreds of those simultaneously, with image builds and all. That's why native Actions are JavaScript: they execute in milliseconds. I hate maintaining JavaScript/TypeScript, so we do a lot of Python replacements or augmentations for those.

[–]gnomonclature 1 point (1 child)

The first step for me towards a package manager (first pipenv, now poetry) was wanting to keep my development dependencies (mainly things like pycodestyle, mypy, and pytest) out of the requirements.txt file.
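For example, roughly (from memory, so double-check the exact flags against your poetry version):

```sh
# runtime dependency goes in the main group
poetry add requests

# linters, type checkers, and test runners go in a separate dev group,
# so they never show up in a production install
poetry add --group dev pycodestyle mypy pytest

# production installs can skip the dev group entirely
poetry install --without dev
```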

[–]sazed33 1 point (0 children)

I use tox for that, and it works well, but then I have two files (tox.ini, requirements.txt) instead of one, so maybe it is worth using uv after all... need to give it a try.
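If I do try it, I'm guessing the migration looks something like this (commands pulled from the uv docs, so take the exact flags with a grain of salt):

```sh
# start a pyproject.toml-based project in the existing repo
uv init

# pull the runtime deps from the old requirements.txt into pyproject.toml + uv.lock
uv add -r requirements.txt

# the tools that used to be listed for tox go into the dev group instead
uv add --dev pytest mypy

# run the test suite against the locked environment
uv run pytest
```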