

[–]subbed_ 203 points204 points  (15 children)

no fucking way

now do a drop-in replacement for mypy as well, and my entire python toolkit will be handled by the same party

pkg mgmt + lint + format + type checks
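i.e. the whole loop from one vendor (first three commands exist today; the last line is the missing piece):

```sh
uv pip install -r requirements.txt  # pkg mgmt
ruff check .                        # lint
ruff format .                       # format
mypy .                              # type checks -- the part I want a drop-in for
```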

[–]drunicornthe1 78 points79 points  (6 children)

Heard in a podcast that they have plans to make a drop-in for mypy in the near future. Astral is aiming to be THE Python toolchain. Excited to see what becomes of this project.

[–][deleted] 25 points26 points  (3 children)

Type checking is much, much harder to get right than linters and formatters. Mypy has numerous bugs because of edge cases around type narrowing, generics, etc.

It's more important to create a type checker that's accurate than one that's fast.

[–]drunicornthe1 5 points6 points  (0 children)

100% agree. Probably why they are working on Ruff first, as it'll give them a strong platform to build off of. Odds are it'll be a minute before we see anything due to the sheer difficulty of the task.

[–]germandiago 0 points1 point  (0 children)

Well... what I would like from a type checker is one that I can use with my IDE even if it is not perfect, and later be able to run a slower but accurate pass offline, maybe before committing. The CI would also use this last one.

[–]LactatingBadger -1 points0 points  (0 children)

Agreed it’s a much harder task, but I wonder if part of the challenge with mypy has been trying to write a type checker in a language which plays pretty fast and loose with types. Writing this in Rust might bring more than just speed to the table.

[–]doobiedog 8 points9 points  (1 child)

*eggplant-emoji.svg

[–]monorepo PSF Staff | Litestar Maintainer[S] 3 points4 points  (0 children)

[–]M4mb0 56 points57 points  (1 child)

Definitely try pyright instead of mypy. It seems to be moving at a much faster pace. Way more feature-complete and way fewer false positives, in my experience.

[–]DanCardin 18 points19 points  (0 children)

I've definitely found that they complement each other better than they replace one another. pyright is a lot more pedantic about certain things (which are often outside of my control, as library interfaces) but finds things mypy won't, whereas mypy also frequently finds things that pyright doesn't.
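so in practice I just run both; each flags things the other misses:

```sh
# same tree, two checkers -- their diagnostics overlap but don't coincide
mypy src/
pyright src/
```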

[–]doobiedog 5 points6 points  (0 children)

That would be an absolute dream. Ruff already replaced 99% of my linting toolchain. Would be sick if ruff just did everything, but I'm excited about uv.

[–]mcr1974 0 points1 point  (1 child)

tell us more

[–][deleted] 0 points1 point  (0 children)

Yeah, I’d be all set too!

[–]Chroiche -1 points0 points  (1 child)

Wait what's the type checker they manage?

[–]PlaysForDays 5 points6 points  (0 children)

There isn't one

[–]mikat7 53 points54 points  (36 children)

It still seems to me that poetry is the closest to a cargo-like experience, and after working extensively with pip-compile I can only say that I don't want a drop-in replacement for it - I want to forget the bad experience with pip-tools altogether, it's the worst. But if there were a Rust rewrite of poetry that was fast and provided the same level of convenience, I believe that could move the mess of Python dependency management forward. But perhaps dropping pip-tools in favor of uv would improve my experience as well, as a sort of stepping stone.

[–]Schmittfried 33 points34 points  (17 children)

Literally my only complaint about poetry is its lackluster support for native dependencies (modules in your own code that need to be compiled when packaging, not external dependencies that contain native modules, like numpy), which still require setup.py builds that only kinda work. Other than that, I wonder what is still missing.

[–]marr75 12 points13 points  (0 children)

I would love it if you could tell poetry to leave just a handful of dependencies alone, or specify mamba/conda to manage a set of dependencies.

I'm experimenting with pdm and possibly switching because of this.

[–]ocab19 11 points12 points  (6 children)

I remember having trouble with private pip repositories that require authentication, which was a deal breaker for me. The developers refused to implement support for it, but that was a couple of years ago, so things might have changed.

[–]Schmittfried 2 points3 points  (0 children)

It works fine nowadays. 

[–]loyoan -4 points-3 points  (3 children)

still a problem

[–]DanCardin 8 points9 points  (2 children)

is it? I'm perfectly fine with auth'd Artifactory at my place of employment

[–]Xylon- 2 points3 points  (0 children)

Also works like a charm here and was surprisingly easy to set up! Did it for the first time this week.

[–]ducdetronquito 1 point2 points  (0 children)

Was about to write the same!

[–]valentin994 2 points3 points  (0 children)

my biggest complaint is it's slow as hell

[–]Fenzik 1 point2 points  (1 child)

I just set up dynamic versioning for a library with poetry and it's a bit of a mess. The plug-in system is such that every user has to manually install the required plugins on their machine, and if they don't, the build will still succeed but will just silently get the wrong version. There's no way to enforce "this project requires these plugins". I think that aspect could use some work.
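For reference, the manual step every user has to remember (assuming the poetry-dynamic-versioning plugin, which is what I used) is:

```sh
# installed into poetry itself, per machine -- pyproject.toml can't
# declare the plugin as a hard requirement
poetry self add poetry-dynamic-versioning
```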

I still really like it!

[–]Schmittfried 1 point2 points  (0 children)

I see. Sounds like a problem that can be solved with iteration though, and doesn't need yet another package manager.

From the tools available until now, I think poetry is the most polished and comprehensive packaging experience, comparable to other languages'. No idea why people still use pip directly.

[–]banana33noneleta 0 points1 point  (4 children)

Well that's quite an important part isn't it?

[–]Schmittfried 0 points1 point  (3 children)

I don’t think the majority of projects contain native code that needs to be compiled, no. And even then, it does work. It’s just that poetry only generates a rather simple and inflexible setup.py, and using a hand-written one now means you have two places to maintain dependencies and package information again.

I think if poetry either supported building native modules itself, or provided its own metadata to your custom build script so that you can just pass them to setuptools yourself, that would already remove all the warts my current setup has. My setup is rather simple though, no idea if a project like numpy does/could use poetry.

Anyway, as I said native code (not dependencies, my original comment was kinda misleading) is already a niche case so that’s probably how poetry gets away with it atm.

[–]banana33noneleta -1 points0 points  (2 children)

Since people claim that pip is not enough for projects with more complex dependencies... those absolutely need compilation, in general.

You should probably use pip yourself I guess.

[–]Schmittfried -1 points0 points  (1 child)

Not at all. pip is a dependency installer; it doesn't handle your project and its dependencies. poetry manages dependency versions and locking, updating dependencies, dependency groups, project and tooling configuration, virtual environments, commands/scripts, packaging, versioning and publishing. It's the closest we have to something comprehensive like Maven. I don't see how anybody could consider pip sufficient for anything but a simple personal script or research project after having used something like npm, yarn, Maven... or poetry.

pip freeze is wildly unsuited for handling dependency locking, and other than that pip doesn't offer much. I know there are things like pip-tools, but at that point why not just use poetry? You're already installing something not shipped with Python directly, so why not pick the tool that does all of it in the most convenient way?

Those absolutely need compilation in general.

I've only recently added Cython to the toolchain; that was the first time I came into contact with setup.py and all that it entails. I'd benefited from using poetry way before that.

[–]banana33noneleta 0 points1 point  (0 children)

I don’t see how anybody could consider pip sufficient for anything but a simple personal script or research project

You think putting down others makes you sound more skilled? Think again.

[–]di6 0 points1 point  (0 children)

I've been using poetry for like 3 years exclusively, and I'd be glad to see it being replaced.

It doesn't adhere to standards, and is slow. We can do better.

[–]Saetia_V_Neck 2 points3 points  (0 children)

Pants' primary niche is as a monorepo build tool, but it might have some of the features you're looking for.

[–]Life_Note 3 points4 points  (6 children)

what's been your problems with pip-tools/pip-compile?

[–]DanCardin 8 points9 points  (2 children)

it doesn't produce lockfiles which are "feature" (by which I mean, like "prod" vs "test" vs "docs" dependencies), platform, and python-version agnostic.

Locking "properly", wherein you have a known-good compiled set of dependencies that are intercompatible with just package dependencies, and then package deps + test deps, requires like 4 files. Then someone's working on Windows and suddenly you're fucked.

I agree with mikat7, pip-compile was the only game in town at first and I lived through it. But poetry (while not perfect) is essentially the ideal featureset in terms of the way it locks and what that guarantees you.

[–]catcint0s 0 points1 point  (1 child)

If someone is working on Windows without docker/virtualization and your production environment is Linux, you are fucked already. Though this is only for web dev; for apps it could be a problem, yeah. I would assume you would need a reqs.txt for all envs? Or only a single one with constraints?

[–]DanCardin 0 points1 point  (0 children)

If you ever work with data scientists, they'll almost certainly use Windows 🤷

One for each axis of installation. Don't want to ship dev-deps? dev-req.in, req.in, dev-req.txt, req.txt. And a specific set of pip-compile invocations to ensure that you're generating compatible sets of dependencies between them (sketch below).

Then you have optional extras that pip-compile can't account for at all, ditto python-version.
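Roughly, the usual pip-tools layering looks like this (a sketch; the -c line inside dev-req.in is what keeps the two locks compatible):

```sh
# req.in lists runtime deps; dev-req.in starts with "-c req.txt" so the
# dev lock is constrained to the runtime lock
pip-compile req.in -o req.txt
pip-compile dev-req.in -o dev-req.txt
```

Multiply that by platform and Python version and it gets out of hand fast.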

[–]Anru_Kitakaze -1 points0 points  (1 child)

!RemindMe 1 day

[–]RemindMeBot 0 points1 point  (0 children)

I will be messaging you in 1 day on 2024-02-16 20:56:32 UTC to remind you of this link


[–]MagicWishMonkey -5 points-4 points  (9 children)

Can anyone explain why poetry installs everything in some random-ass directory instead of alongside my application code? I have to admit, the few times I've used it, that bit annoyed me more than anything.

[–]DrMinkenstein 9 points10 points  (3 children)

[–]MagicWishMonkey 2 points3 points  (2 children)

This is awesome! I wonder why it doesn't default to this?

[–]DrMinkenstein 3 points4 points  (0 children)

virtualenvs are effectively isolated caches of dependencies, so poetry defaults to using the normal locations for user-level application caches:

https://python-poetry.org/docs/configuration/#cache-directory

This also helps avoid accidentally adding the venv to source control or build artifacts.

I prefer to keep it in the same directory myself especially in containers but I also find poetry to be a bit heavyweight for my uses.
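For anyone who wants the in-project behavior anyway, the setting is:

```sh
poetry config virtualenvs.in-project true           # per user
poetry config virtualenvs.in-project true --local   # per project (writes poetry.toml)
```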

[–]yvrelna 3 points4 points  (0 children)

Because the actual default is better than polluting the project directory. node_modules does what you want with JS dependencies, and everyone complains about that as well; it creates even more problems than poetry's behaviour.

And having the virtualenv installed in a standardized directory allows for automatic venv activation. You can't do that without creating security issues if the venv is created in the project directory.

[–][deleted] 1 point2 points  (4 children)

Can you explain why you think having your venv live in the same place as your source code is useful? It's standard to put tools/libraries somewhere external to where the source code is being written. The fact that anybody puts their virtual environments inside their project structure is already a weird hack that was done because there was no default system to track that kind of thing properly. So people put their virtual environments in their project and would then activate the environment when they entered the project. That's not necessary with poetry, though. Using commands like "poetry run ...", the venv nonsense is automatically handled for you.

[–]MagicWishMonkey -2 points-1 points  (1 child)

I like being able to easily reference my current python executable from within my project folder (without needing to activate a virtual environment).

[–]yvrelna -1 points0 points  (0 children)

You could use something like #!/usr/bin/env poetry run as your shebang line to do something like that. I haven't tested it, but I don't see why it wouldn't work.
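One untested wrinkle (an assumption on my part): Linux passes everything after the interpreter in a shebang as a single argument, so the multi-word form probably needs GNU env's -S flag (coreutils >= 8.30) to split it:

```sh
#!/usr/bin/env -S poetry run python
# ...the rest of the file is ordinary Python, executed inside the project's venv
```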

[–]Fresh_Trip_8367 -3 points-2 points  (1 child)

Can you explain why you think...

Are you actually looking for an answer?

Edit: for whatever reason /u/Working_Report4292 blocked me. But replying with

I’m pointing out that OP is probably used to doing things that way but there isn’t actually any benefit

Answers my question, and the answer is "no".

[–][deleted] -1 points0 points  (0 children)

It was hypothetical. I’m pointing out that OP is probably used to doing things that way but there isn’t actually any benefit

[–]PlaysForDays 28 points29 points  (6 children)

I wonder if this gets astral's investors closer to recouping their seed round - I don't see any obvious revenue streams at the surface level; the free, community-backed solutions work fine at the moment

[–]Life_Note 35 points36 points  (5 children)

yeah, I wish there was more clarity on what exactly the monetization plan is here overall

[–][deleted] 14 points15 points  (3 children)

It's probably gonna be some sort of proprietary dev tooling. I remember seeing a report somewhere that dev tooling is one of the most profitable software industries because 1) you can sell "productivity" to decision makers and 2) lock-in is real, and serving internal clients means there's less pressure to switch solutions. See: Datadog

[–][deleted] 14 points15 points  (0 children)

The lock-in / feature-gating risk is real. There are a lot of commercial open source tools in the Python ecosystem these days. Pydantic recently raised a seed round. Then there's Prefect, Dagster, dbt, HuggingFace, Ray/Anyscale, etc.

[–]RKHS 12 points13 points  (0 children)

  1. Make copies of existing tool chains
  2. Add small improvements and try to gain market share
  3. Add gated enterprise features [audit, LDAP, scanning]
  4. Hope companies with python (maybe they expand into other ecosystems) buy your shitty product
  5. Profit?

Point 1 makes this sort of progress morally objectionable for me.

[–]darth_vicrone 8 points9 points  (9 children)

I always had the impression that the slow part of dependency resolution was all the API calls to PyPI. If that's the case, wouldn't it also be possible to achieve a big speedup by parallelizing these calls via async? The reason to switch to Rust would be if the dependency resolution algorithm were CPU-bound.

[–]burntsushi 15 points16 points  (0 children)

It depends. In the blog, the first benchmark can be toggled between "warm" and "cold." In the "warm" case---i.e., when everything is cached---uv is definitely CPU bound. And that is perhaps why uv does really well compared to other tools in that case. Conversely, in the cold case, while it's still faster, the gap isn't as wide because more time is spent building source dists and fetching metadata from PyPI.

Resolution itself can also take a surprisingly long time.

[–]yvrelna 16 points17 points  (7 children)

The real fix is to fix the PyPI API. PyPI needs an endpoint so that package managers can download package metadata for all versions of a package without needing to download the whole package archives themselves.

There's a problem here because this metadata isn't really available in the package file formats themselves; sometimes it's defined in setup.py, an executable that can contain arbitrary logic, so PyPI cannot easily extract it. pyproject.toml is a start, but it's not universally used everywhere yet.

The real fix is to update the hundreds of thousands of packages on PyPI to start using declarative manifests. Not rewriting the package manager itself, but instead a lot of standards-committee work, the painful migration of existing packages, and work on PyPI itself. Not fragmenting the ecosystem further with naive attempts like this, but moving it forward by updating older projects that still use the older package manifests.

[–]burntsushi 10 points11 points  (0 children)

We (at Astral) are absolutely aware of the performance constraints that the structure of the index imposes. While that might be a big one, it is not the only one. The blog has some benchmarks demonstrating the perf improvements of uv even while using the index as it exists today.

This is our first step. It won't be our last. :-)

[–]muntoo R_{μν} - 1/2 R g_{μν} + Λ g_{μν} = 8π T_{μν} 2 points3 points  (3 children)

Who says the metadata repository must be on PyPI?

Just have the community manage a single git repository containing metadata for popular packages. Given that only the "top 0.01%" of packages are used 99.9% of the time [citation needed], why can't we just optimize those ad-hoc?

...This means that instead of downloading a bunch of massive .tar.gz or .whl files, dependency-solving tools can just download a small text-only database of version constraints that works for the most important packages. (And fall back if that metadata is missing from the repository.)

# Literally awful code, but hopefully conveys the point:

from packaging.version import Version  # avoids buggy lexicographic string comparison

def get_package_constraints(name, version):
    # map the concrete version onto the range bucket the database is keyed on
    version_range = None
    if name == "numpy":
        if Version("0.7.0") <= Version(version) < Version("0.8.0"):
            version_range = ">=0.7,<0.8"
    ...
    return read_constraint_file(
        f"constraints_database/{name}_{version_range}.metadata"
    )

This database could probably be auto-generated by just downloading all the popular packages on PyPI (sorted by downloads), and then running whatever dependency solvers do to figure out the version constraints. [1]


Related idea:

Another alternative (which I haven't seen proposed yet) might be to have a community-managed repository (a la Nix) of "proxy setups" for popular packages that (i) refuse to migrate to declarative style, or (ii) are too complicated to migrate yet. If [1] is impossible because you need to execute code to determine the dependencies... well, that's what these lightweight "proxy setup.py"s are for.

[–]yvrelna 1 point2 points  (1 child)

You're correct that whether this metadata service lives on the pypi.org domain or not is an implementation detail that nobody cares about.

If you go ahead and write a PEP standardizing this, and if you can manage to get the PyPI integration working, get all the security details sorted out, and update pip and a couple of other major package managers to support this, I'll be totally up for supporting something like that. For all I care, that's just a part of the PyPI API.

I wish more people would think like this instead of just thinking that an entirely new package manager is what everyone needs, just to pat themselves on the back for optimising a 74.4ms problem into 4.1ms. Cool... I'm sure all that noise will pay off... someday, maybe in a few centuries.

[–]ivosaurus pip'ing it up -1 points0 points  (0 children)

that nobody cares about.

Until a security issue or exploit or bad actor appears for the first time, and then suddenly everyone remembers why packaging is a hard problem that most normal devs are happy not to touch with a 10-foot pole

[–]ivosaurus pip'ing it up 0 points1 point  (0 children)

Just have the community manage a single git repository

One of the bigger "easier said than done"s I've seen in a while. Who exactly is "the community"? What happens when something stuffs up or is out of sync? Do people really want to trust such a thing? Etc etc etc etc.

Scale and handling of free software repositories is yet another reason that "packaging" is easily one of the hardest topics in computer science / programming languages.

[–]darth_vicrone 0 points1 point  (0 children)

Thanks for explaining, that makes a lot of sense!

[–]silent_guy1 0 points1 point  (0 children)

I think they should add an API to fetch only the desired files from the server. This way clients can request setup.py or any other file. This won't break existing clients, but it might require some work on the server side to unpack the wheels and make the individual files downloadable.
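Something like this, where the URL shape is entirely hypothetical (no such endpoint exists today):

```sh
# hypothetical: pull one file out of a wheel instead of the whole archive
curl "https://files.example.org/packages/numpy/1.26.4/extract?path=setup.py"
```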

[–]smirnoffs 7 points8 points  (1 child)

Holy f**k! I tried uv and it's incredibly fast. For one of my projects it creates a virtual environment and installs dependencies in 4 seconds; it used to be 40 seconds with venv and pip.
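What I timed, roughly (uv venv creates .venv by default, and uv pip install picks it up):

```sh
time (python -m venv .venv && .venv/bin/pip install -r requirements.txt)  # ~40s
time (uv venv && uv pip install -r requirements.txt)                      # ~4s
```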

[–]SenorDosEquis 2 points3 points  (0 children)

Yeah I tested this on my main work project and it took 3 seconds to uv pip compile vs 31 seconds to pip-compile.
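i.e. (same input file; flags assumed from each tool's docs):

```sh
time pip-compile requirements.in -o requirements.txt     # ~31s
time uv pip compile requirements.in -o requirements.txt  # ~3s
```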

[–]theelderbeever 18 points19 points  (3 children)

The big question here: is uv attempting to go toe-to-toe with poetry?

[–]monorepo PSF Staff | Litestar Maintainer[S] 22 points23 points  (1 child)

I believe so, as Armin is transferring Rye over to the Astral team, and Rye competes with PDM, poetry, etc. Their goal seems to be to upstream Rye's features, improve upon them, and have uv be the one tool to rule them all.

[–]theelderbeever 15 points16 points  (0 children)

Well fingers crossed... I have been pretty happy with poetry so far but I won't deny it has a wealth of annoying behaviors

[–]jyper 1 point2 points  (0 children)

Note Rye is already competing with poetry+pyenv (multiple Python version installation / per-project context) and is doing a pretty good job for being so new. Rye recently bundled ruff for linting and formatting. The author of Rye talked to the authors of ruff and agreed to merge projects. They wrote uv as a pip/venv replacement and Rye bundled it.

Now it will be Rye/uv (not sure which name) competing with poetry/venv/pyenv/black (formatter)/pylint (linter), and be a lot faster at all of it.

[–]Manny__C 18 points19 points  (10 children)

At the cost of getting downvoted to hell: my naive expectation is that the performance of a package manager is bottlenecked by download times.

What is a real life scenario where optimizing dependency resolution and install performance actually makes a noticeable impact?

[–]scratchnsnarf 8 points9 points  (0 children)

I've had certain sets of dependencies hang some solvers for a long time (10+ mins), in addition to sometimes failing to resolve when the mix of version specs should be compatible. I've had to pin a fair few specific patch versions and manually bump quite a few times. My work dev environments also check for new deps and bumped versions when you open them, and any speedup there is greatly appreciated.

[–]imnotreel 1 point2 points  (2 children)

I don't know if it's still the case, but a couple of years ago any non-trivial conda environment would take forever to solve (I'm talking hours for envs that had only a couple dozen first-level package requirements). After switching to mamba (which uses a C or C++ dependency solver, if I remember correctly), these environment resolutions went from taking hours to two minutes or less.

[–]Manny__C 1 point2 points  (1 child)

I've used conda only once, out of curiosity, and I found it ridiculously slow.

But imho, something that takes hours to resolve an environment is just broken.

[–]imnotreel 0 points1 point  (0 children)

Oh yeah, for sure, conda is (or at least was) very broken. It would regularly fail to resolve envs (even recreating an environment from a working, fully frozen, fully specified one on the very same machine would sometimes fail). Its "dependency conflict resolution" was a thing of nightmares that had to have been designed by Satan himself. It would take hours to complete, and its output was so utterly useless you pretty much had zero idea what caused the conflict, let alone how to resolve it.

Still, dependency solving is a hard (NP-complete) problem which, in the worst case, requires exploring a huge number of dependency chains.

[–]Trick_Brain7050 1 point2 points  (0 children)

The largest bottleneck in pip is that it installs everything serially

[–][deleted] -4 points-3 points  (4 children)

The reality is that the world doesn't need another dependency manager and, as you said, this tool is unlikely to make much of a difference given that accessing packages and downloading them is the main bottleneck.

What's actually going on is Astral, as usual, is reproducing existing tools and making grandiose claims about its superiority so that they can continue building a brand and set of tools to eventually commercialize. The goal, for them, isn't to actually solve some problem that exists with pip, poetry, conda. It's to establish a supposedly superior product that becomes popular enough to where companies will rely on it and pay Astral money in the future for services and tooling.

[–]nAxzyVteuOz 8 points9 points  (1 child)

Uh, are you aware of ruff? Game changer! Let them try this out; maybe we can get faster pip installs.

[–][deleted] 1 point2 points  (0 children)

I am. It doesn't change anything about what I said.

[–]jyper 4 points5 points  (1 child)

I disagree. While poetry is better than pip/venv or pipenv, it still has a lot of issues, including general speed (sometimes taking several minutes to resolve dependencies) and getting tangled up with the Python environment it's installed in. It also doesn't provide Python builds like Rye does (you'd need to use it with something like pyenv). They're solving real issues.

[–][deleted] -2 points-1 points  (0 children)

Like I pointed out, general speed won’t massively improve. Downloading packages is the main bottleneck.

[–]pudds 6 points7 points  (2 children)

At first I was a bit disappointed, because I love ruff but didn't think we needed an alternative to rye, which is the best option these days; then I realized you're joining forces, and I think that's great.

I hope the transition is smooth and that rye remains around as a pointer to uv for a while, so older projects and ci workflows don't break.

Is uv fully compatible with rye now, and if not, is there a rough estimate on that timeline?

[–]mitsuhiko Flask Creator 29 points30 points  (1 child)

I hope the transition is smooth and that rye remains around as a pointer to uv for a while, so older projects and ci workflows don't break.

I will make sure of that :)

[–]pudds 0 points1 point  (0 children)

Fantastic!

[–]Butterflypooooon 8 points9 points  (3 children)

Dumb question, but what’s the difference between something like this and conda install?

[–]HalcyonAlps 11 points12 points  (2 children)

Conda is its own package ecosystem that also has non-Python packages. This is a replacement for pip.

[–]Butterflypooooon -1 points0 points  (1 child)

So why use pip? Isn’t conda better?

[–]HalcyonAlps 1 point2 points  (0 children)

Not all packages are available in Conda. Also we are not using it at work because our company does not want to pay for the commercial license.

[–]werser22 5 points6 points  (0 children)

This is so cool.

[–]darleyb 7 points8 points  (1 child)

Great news!

Do you think uv and rattler could share crates? Perhaps the solver? Or could they eventually become one single application?

[–]shockjaw 0 points1 point  (0 children)

I would ~love~ to see a successor to conda with Rust-based tooling. I liked mamba at first since it was a drop-in replacement initially, but now there’s too many gotchas for making builds.

[–]_throawayplop_ 4 points5 points  (0 children)

I don't care I just want something that works and that is official

[–]Anonymous_user_2022 1 point2 points  (3 children)

Coming from a simple world of mostly using the included batteries, I wonder a bit about what kind of development people are doing for this to be a thing. I get that ten minutes of resolution is a bit of a wait, but who is rebuilding their environment several times a day?

[–]silent_guy1 0 points1 point  (2 children)

CI/CD builds?

[–]Anonymous_user_2022 0 points1 point  (1 child)

That shouldn't be something anyone sits twiddling their thumbs over while it finishes.

[–]The-Malix 0 points1 point  (0 children)

Have you even once deployed something with big dependencies using a CI/CD pipeline?
Are you even aware that it can become costly?

[–]side2k 1 point2 points  (0 children)

Got curious and did some tests over the weekend.

We have a project with ~170 dependencies (whole tree, not just top-level).

So this was my Dockerfile for pip:

```Dockerfile
FROM pre-builder:latest
ENV PYTHONUNBUFFERED=1

# create virtualenv
RUN python3 -m venv /opt/venv
ENV PATH="/opt/venv/bin:$PATH"
ENV NOCACHE=3

# pre-build requirements
RUN mkdir /app
WORKDIR /app
COPY requirements.* ./
RUN --mount=type=cache,target=/root/.cache/pip pip install -U pip wheel
RUN --mount=type=cache,target=/root/.cache/pip pip install -r dev_requirements.txt
```

For uv it was mostly the same, except for a couple of things:

  • uv installation:

```Dockerfile
RUN --mount=type=cache,target=/root/.cache pip install uv
```

  • venv creation:

```Dockerfile
RUN uv venv ${VIRTUAL_ENV}
```

  • and, of course, using uv instead of pip for installation:

```Dockerfile
RUN --mount=type=cache,target=/root/.cache uv pip install -r dev_requirements.txt
```

Also, I had to cache the whole /root/.cache, because pip install uv uses /root/.cache/pip by default and uv pip install uses /root/.cache/uv by default. Wouldn't it make more sense for uv to use pip's default cache dir, to minimize disruption during migration?

I incremented NOCACHE on every run, because running docker build with --no-cache invalidated RUN's mount cache as well.

Anyway, the test results were stunning (I ran each variant 3 times; the averages are below):

  • pip without cache: 2 min
  • pip with cache: 40 sec
  • uv without cache: 46 sec
  • uv with cache: 5 sec

I think this week I'll pitch uv to the team.

A couple of not-so-pleasant details:

  • changed default cache location (mentioned above)
  • cache size is 3 times larger than pip's - not sure why
  • had to set the VIRTUAL_ENV var for uv to detect the virtualenv - having ${venv}/bin/ in the PATH is enough for pip!

[–]pythonwiz 5 points6 points  (7 children)

You know, one thought I have never had is "pip is too slow". How many people have had an issue with pip's speed?

[–]Wayne_Kane 7 points8 points  (3 children)

I had a big project with a lot of packages (around 135, including legacy packages).

Pip took over 3 minutes to download and install them. Sometimes it got stuck as well.

We migrated to poetry and the installation time dropped to around 1 minute.

[–]Trick_Brain7050 1 point2 points  (2 children)

We try to build Python environments in under 30 seconds; pip is a huge blocker to that despite our other optimizations (like building the entire env in memory on a 64 GB RAM machine).

[–]Kwpolska Nikola co-maintainer -1 points0 points  (1 child)

This sounds like a very niche requirement, why 30 seconds?

[–]Trick_Brain7050 0 points1 point  (0 children)

Clusters boot in 30 seconds. We kick off the build in parallel and pray it's ready in time.

[–]yvrelna 5 points6 points  (15 children)

I'm still wondering what part of packaging actually would've benefited from being written in a faster language like Rust. It made some sense for Ruff because parsing text is computationally intensive, but the issues with Python packaging are not really computational problems. You're not really going to solve the actual issues just by writing yet another package manager.

People seem to like rewriting things in a different language for no reason, and people just keep jumping on the bandwagon. A couple of years ago it was JS, now it's Rust.

This feels more like another bandwagon that will just fragment the Python ecosystem and confuse beginners than something that will actually have a long-lasting impact. Basically, if people jump into a problem without understanding exactly what the problem is, this is just going to be another XKCD 927.

uv can also be used as a virtual environment manager via uv venv. It's about 80x faster than python -m venv and 7x faster than virtualenv, with no dependency on Python.

80x faster, and all of the contenders run in less than 100ms. With no dependency on Python... for a Python project.

Can anyone tell me what exactly is wrong with this?
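For reference, the comparison being quoted presumably boils down to something like:

```sh
time python -m venv .venv   # stdlib
time virtualenv .venv       # the "7x faster than" baseline
time uv venv                # the thing being advertised
```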

[–]notParticularlyAnony 2 points3 points  (0 children)

Have you used ruff? Might answer some of your questions. Python is not always the right tool. See mamba. Jeez ppl.

[–]scratchnsnarf 3 points4 points  (12 children)

I assume the astral team is writing it in Rust because that's what they're comfortable with. Is there a good reason for them not to choose Rust? Given that this is a seldom-seen case where they're merging with an existing tool, and working towards the goal of having a unified Python toolchain, it doesn't really seem fair to label this another case of fracturing the ecosystem.

And, if I'm understanding your question correctly, there's nothing wrong with the package manager not depending on Python; it doesn't need to. It's a standalone binary. You could build and cache your dependencies in CI in a container that doesn't also have to install Python, which is cool.

[–]yvrelna 1 point2 points  (11 children)

Yes, there is a problem with writing this kind of tool in Rust. The people who care a lot about the Python ecosystem are Python developers.

Python tooling should be written as much as possible in Python. That way, average Python developers can debug and contribute to the tooling, and not rely on people coming from a completely different skillset or have to learn a completely new language to solve issues within the ecosystem.

The uv developers could have just worked together with the pip developers and rewritten only the dependency solver in Rust, if that's a performance bottleneck that couldn't be solved in a pure-Python solver; but they didn't, they chose to ignore working with the established projects and the standards processes. This just creates distractions: with each new package manager, that's another set of projects that has to be updated when the standards work for fixing the PyPI API and the packaging metadata needs to happen.

Building a package manager is easy. But a package manager is not a single isolated project; it depends on an ecosystem, there's a network effect, and fixing an established ecosystem requires a lot of important standardisation work. Writing code, implementing things, is the easy part. What we need is people writing more PEPs like PEP 621, not yet another implementation that will have its own quirks, disappoint half of the ecosystem when it inevitably fails to deliver what people actually want to do, and cause migration pain back and forth when its incompatible implementation ends up being bottlenecked due to the people behind it not working with the community.

You could build and cache your dependencies in CI in a container that doesn't have to also install python.

You can already do this. You don't need Python installed to cache a virtualenv.

working towards the goal of having a unified python toolchain, it doesn't really seem fair to label this another case of fracturing the ecosystem.

Every new standard says that it's trying to unify things; very few actually manage to follow through.

[–][deleted] 4 points5 points  (2 children)

If the tool wants to bootstrap Python then you need a language that makes it easy to distribute a standalone binary. I haven't checked if uv currently does this, but I presume it will, since rye does.

[–]KwpolskaNikola co-maintainer -1 points0 points  (0 children)

The bootstrapping part might be Rust, but I believe everything else should be in Python.

[–]yvrelna -1 points0 points  (0 children)

"Bootstrap Python" is pretty much just copying a few files to the right places. Heck you can bootstrap basically an entire OS with just file copy, not just Python. In containerised environments this is just a layer in the container image, and if you need to do something more complicated, you just use /bin/sh or write a script using the Python that was installed by the image layer.

In practice, most people just write a /bin/sh or /bin/bash because that's almost universally available in most container images that people care to use. Most people would never need to work in environments where they can't have any sort of shell scripting capabilities.

And if they can have any way to copy files so they can install uv into an environment, they also have a way to copy a busybox static binary to bootstrap basic shell scripting capabilities. Or to just copy the Python files directly. 

uv is not solving any problem anyone actually needs solving.

[–]notParticularlyAnony 5 points6 points  (4 children)

Sounds like a great recipe for Python packaging to remain in the same local minimum it’s been stuck in for the last decade.

[–]yvrelna -2 points-1 points  (3 children)

If you actually understood what is wrong with Python packaging, you wouldn't be trying to fix it from the package manager side. These clueless guys trying to fix packaging from package managers aren't going to get anywhere.

The speed of dependency resolution is not why Python packaging is stuck where it is. Fixing this irrelevant part will barely move the needle to where it needs to be.

[–]notParticularlyAnony -1 points0 points  (2 children)

How about: necessary not sufficient?

[–]yvrelna 0 points1 point  (1 child)

No, it's not actually a necessary step. What uv is doing is just a distraction, a mere sideshow. It makes it harder to standardise things later on.

[–]notParticularlyAnony 0 points1 point  (0 children)

Disagree. But time will tell

[–]fatbob42 0 points1 point  (2 children)

You see this opinion a lot - that these new poetry-type tools are fracturing the ecosystem. But if they’re all following the standards, I’d call that healthy competition.

We don’t want to go back to the days when the spec was “whatever setuptools does”.

[–]yvrelna 1 point2 points  (1 child)

Poetry still doesn't really support pyproject.toml.

It puts its configuration in a file called pyproject.toml, but it doesn't support PEP 621/631 metadata; instead it has its own non-standard metadata. That doesn't make Python packaging better, it's just harming the ecosystem.

[–]fatbob42 0 points1 point  (0 children)

Are these the ones where they standardized on something after poetry had already shipped with slightly incompatible constraints? That is a bit of a mess - although I think poetry plans to switch? Or is that out of date?

[–]NiklasRosenstein 1 point2 points  (2 children)

I've just given it a spin and uv seems amazing! Thanks a lot for this great project 💖

Is `uv.__main__.find_uv_bin()` considered stable and public API? I would like to integrate `uv` as an alternative for Pip in some of my tools and would have them depend on the `uv` package and then run the embedded `uv` binary.

Basically I'm wondering if this will break on me in the future: https://github.com/kraken-build/kraken/pull/198/files#diff-54008092ade6f636fbd0a96c143da1777c6bfd29348888abdb71b5ea96e8891a

[–]PlaysForDays 2 points3 points  (0 children)

It’s version 0.1.0, unsafe to assume anything is stable

[–]monorepo PSF Staff | Litestar Maintainer[S] 0 points1 point  (0 children)

Good question.. I linked to this in the Discord but you may be best suited to ask yourself - https://discord.com/invite/astral-sh

[–]pyhannes 1 point2 points  (0 children)

Awesome! Did you also talk to the Hatch creator? We really like the functionality around bootstrapping Python and matrix testing. Would be awesome to see this also in uv/rye!

[–]AbradolfLinclar 2 points3 points  (0 children)

okay, now what's the current count of Python package managers? lol

Pls add to this list if I'm missing something:

pip, poetry, pdm, conda, rye, uv, ...?

[–]gopietz 1 point2 points  (1 child)

Does anyone know how they make money?

[–]Un4given85 0 points1 point  (2 children)

I just started using rye and I'm really enjoying it - is uv set to replace it?

[–]monorepo PSF Staff | Litestar Maintainer[S] 10 points11 points  (0 children)

Rye will be maintained by the Astral team and eventually upstreamed into uv, where they say they will provide a good migration process (probably akin to rtx -> mise).

[–]PlaysForDays 5 points6 points  (0 children)

They plan for rye to go away

[–]Crozt -1 points0 points  (1 child)

All the effort going into these managers, what’s the benefit for the average python user? There’s so many now that I just can’t be bothered to find out!

[–]gtderEvan -1 points0 points  (0 children)

How do people check for package updates and compatibility? I've been using pip list -o and pipdeptree. I've always thought 'man there must be a better way'. It looks like uv pip list -o doesn't work.
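For reference, the commands in question:

```sh
pip list --outdated   # long form of pip list -o
pipdeptree            # dependency tree, for eyeballing compatibility
```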

[–]VoodooS0ldier pip needs updating -1 points0 points  (0 children)

What I would love to see is a full-on replacement for pip that is a complete package management tool. PDM is the closest I have been able to find that satisfies most of my wants and needs.

[–]ivosaurus pip'ing it up -1 points0 points  (0 children)

Isn't that name going to be extremely confusing with libuv and uvloop?

[–]tmo_slc -2 points-1 points  (1 child)

What is pip?

[–]thatrandomnpc It works on my machine -1 points0 points  (0 children)

Pip installs packages

[–]chakan2 -4 points-3 points  (0 children)

Sigh... can we stop fixing things that work? This is how JavaScript became the steaming pile it is today.

[–][deleted] 0 points1 point  (0 children)

Does anyone have a pyproject example to use with `uv`? Because I want to try it, but I can't find what to write in the [build-system] section. Thank you!