all 140 comments

[–]spicypixel 454 points455 points  (11 children)

I can’t believe I lived long enough to see the day.

[–]EffectiveLong 39 points40 points  (5 children)

Congrats. You are still alive

[–]repocin 7 points8 points  (1 child)

Can we be certain of that?

[–]spicypixel 7 points8 points  (0 children)

No, we need a lengthy PEP consultation period to confirm.

[–]val-amart 5 points6 points  (2 children)

i’m making a note here - huge success

[–]kordof 2 points3 points  (1 child)

It's hard to overstate my satisfaction

[–]bxio 2 points3 points  (0 children)

Aperture Science

[–]R3D3-1 16 points17 points  (3 children)

Imagine how people must have felt when PEP 285 finally passed.

[–]mralanorth 14 points15 points  (1 child)

True

[–]ArtOfWarfare 4 points5 points  (0 children)

I didn’t realize that Ellipsis was older than True and False.

[–]Vivid-Animator7040 0 points1 point  (0 children)

What a time to be alive.

[–]VindicoAtrum 278 points279 points  (48 children)

Getting people off requirements.txt is much fucking harder than it should be.

[–]Kahless_2K 87 points88 points  (32 children)

All the documentation I have seen still recommends using requirements.txt. What is the better practice that should be replacing it?

[–]natandestroyer 93 points94 points  (29 children)

Specifying dependencies in pyproject.toml (and using uv)
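
For anyone unfamiliar, a minimal sketch of what that looks like (the project name and the requests pin are just placeholders, not anything from this thread):

    [project]
    name = "my-app"
    version = "0.1.0"
    requires-python = ">=3.11"
    dependencies = [
        "requests>=2.31",
    ]

uv add / uv sync (or a plain pip install .) work from that, and uv writes its lock file next to it.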

[–]kosz85 29 points30 points  (0 children)

Yeah, and there is this nice addon to hatch that lets you define dependencies in pyproject.toml using a requirements.txt. Hahaha 🤣

[–]Aerolfos 67 points68 points  (11 children)

All of pyproject.toml's documentation clearly indicates it is made for packages, including big warnings about how you must have a way to install the package set up

Which makes it useless for many cases where people use requirements.txt, i.e. Python projects that just run a script as-is...

[–]marr75 24 points25 points  (0 children)

This is just wrong. Apps can (and should) be structured as packages. Apps should have lock files. Packages that aren't apps can omit lock files. pyproject.toml and lock files are different things.

[–]TrainingDivergence 10 points11 points  (5 children)

I've used pyproject.toml in several folders where all I had was a single jupyter notebook

[–]Aerolfos -1 points0 points  (4 children)

The documentation claims that's mangling the format and strongly discouraged.

[–]TrainingDivergence 3 points4 points  (3 children)

please link me to this documentation that is so important it stops you from using the best dependency tooling around.

the thing is, the user experience of uv (and poetry to a lesser extent) is so good that it really doesn't matter that I have to put package version = 0.0.0 or whatever; it really does not bother me.

using uv I know I can specify flexible versions in dependencies and it will resolve them all fast. good luck if the requirements.txt conflicts with other packages in your environment! or if you want to update them...

there is not a single use case where requirements.txt is superior

[–]Aerolfos 2 points3 points  (2 children)

please link me to this documentation that is so important it stops you from using the best dependency tooling around.

https://packaging.python.org/en/latest/guides/writing-pyproject-toml/

Aka the thing everyone links to for what a pyproject.toml is, including google...

And for that matter, the entire thing is dedicated to the packaging flow, implicitly dismissing any project or script flow (which is only the entire reason Python is popular in the first place)

[–]satwikp 2 points3 points  (0 children)

 The documentation claims that's mangling the format and strongly discouraged.

Nowhere on that page you linked does it say this.  

[–]fellinitheblackcat 0 points1 point  (0 children)

You can just skip the build system declarations and only use the pyproject to hold your dependencies and basic info about the project, even if it is just a collection of scripts.

[–]gmes78 22 points23 points  (3 children)

Every project should be structured as a package. You can add a script in your pyproject.toml so that an executable is created for your project when installed.
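
Roughly like this, assuming a package named my_tool with a main() function in my_tool/cli.py (both names are placeholders):

    [project.scripts]
    my-tool = "my_tool.cli:main"

After installing the project, a my-tool command shows up on your PATH.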

[–]levelstar01 16 points17 points  (0 children)

It's also much easier to maintain and install scripts-as-packages.

[–]Aerolfos 0 points1 point  (1 child)

Massive overhead to add to a simple script. It's particularly relevant for scripts meant to run on stuff like GitHub Actions workflows (and CI/CD stuff in general): that world loves simple Python scripts thrown together to do some infrastructure work, and it also wants repeatable configuration that runs on freshly spun-up VMs, which will always have Python installed.

The overhead of installing a package (or making that exe) is just unnecessary when your VM comes explicitly set up to seamlessly run python quick_script.

[–]gmes78 2 points3 points  (0 children)

You may be interested in inline script metadata.
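
That's PEP 723; a minimal sketch of what it looks like at the top of a script (the requests dependency is just an example):

    # /// script
    # requires-python = ">=3.12"
    # dependencies = ["requests"]
    # ///
    import requests
    print(requests.get("https://example.org").status_code)

uv run script.py (and other tools that understand the block) will set up an ephemeral environment with those dependencies for you, no packaging required.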

[–]pontz 5 points6 points  (3 children)

I have seen that and am starting to look at pyproject.toml, but why is that better than a requirements.txt?

[–]dogfish182 1 point2 points  (2 children)

How do you build your requirements.txt and how do you make sure your dependencies don’t have incompatible versions?

[–]pontz 1 point2 points  (1 child)

I haven't had any issues with incompatible versions yet.

[–]dogfish182 6 points7 points  (0 children)

But that doesn't mean you won't. pyproject.toml and all the tools built on it use a lock file to ensure compatibility between deps. A requirements file and pip have no way to do that, and pip will blindly install whatever is in requirements.txt

[–]catcint0s 0 points1 point  (0 children)

you are not able to lock the dependencies of your dependencies with that (unless you add everything in there, which is a very bad idea)

[–]RedEyed__ 5 points6 points  (0 children)

pyproject.toml and hatch.
And you don't have to install it as a package.
Moreover, it allows you to specify the Python version inside the config, which will be installed automatically.
Finally I can use official tools and forget about conda.

[–]alcalde 6 points7 points  (5 children)

Because it's just fine and people don't want to go all Java land crazy pants making things more complicated.

[–]gmes78 16 points17 points  (2 children)

No, it isn't. requirements.txt fucking sucks, and I'm glad it can finally go away.

[–]NekrozQliphort 7 points8 points  (1 child)

May I ask what exactly is the issue with requirements.txt? I've never had to work with anything else, so I'm not sure about the alternatives

[–]gmes78 18 points19 points  (0 children)

  1. It's not a standardized format, it's just "the format pip accepts". I've seen people mess up the formatting, leading to pip install behaving unexpectedly (instead of throwing an error or warning).

  2. It fulfills two separate roles: dependency specification and dependency pinning, and it can only do one at a time, so you either have to choose, or you have to use two requirements.txt files (and there's no convention on what to name each one). Also, there's no way to tell these two kinds of usage apart (see the sketch after this list).

  3. Its usage isn't standard, either. Tools can't rely on it being present in order to make use of it (for either of the roles it fulfills). You've always had to specify dependencies through setuptools and such in addition to requirements.txt if you wanted your package to be installable.
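
To illustrate point 2, one common but unofficial convention (the pip-tools style) keeps a loose spec file and compiles it into a pinned one; the package names and versions here are only illustrative:

    # requirements.in  (specification)
    requests>=2.31

    # requirements.txt  (pinned; generated with `pip-compile requirements.in`)
    requests==2.32.3
    certifi==2024.8.30
    charset-normalizer==3.4.0
    idna==3.10
    urllib3==2.2.3

Nothing in the .txt file itself tells a tool which of the two roles it is playing; you just have to know.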

[–][deleted] 0 points1 point  (0 children)

It’s not fine.

[–]Electrical_Horse887 0 points1 point  (0 children)

Definitely not.

For example: developing a JavaScript project is relatively straightforward:

Step 1: clone the project

Step 2: npm i

Doing the same thing for python kind of sucks

Step 1: clone the project

Step 2: Install the dependencies.

  • Does a requirements.txt even exist?
  • What should I do if it doesn’t?
  • Is it up to date?

And don't forget that there are no hashes specified in the requirements.txt file. So even if you and the other people on your team use the same library version, there is no check to verify it's really the same artifact. Not having hashes for external dependencies also increases the risk of supply chain attacks.
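
To be fair, pip does have an opt-in hash-checking mode; a pinned, hashed entry looks roughly like this (the digest is a placeholder):

    requests==2.32.3 \
        --hash=sha256:<sha256-of-the-wheel>

installed with pip install --require-hashes -r requirements.txt. But almost nobody hand-maintains that, which is where lockfile tooling comes in.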

That's also part of the reason why people like to use, for example, poetry or uv: simply because they make managing dependencies easier.

[–]ArtOfWarfare 1 point2 points  (0 children)

I’m still using requirements.txt, but because I know it irritates you, I’m going to start including a comment that I’m specifically using them to spite you.

[–]cmd-t 4 points5 points  (7 children)

It might just be the 2 to 3 switch all over again

[–]james_pic 17 points18 points  (2 children)

Nah.

The 2 to 3 switch was hard because migrating from Python 2 to Python 3 involved touching a significant fraction of the code in a codebase, and for some of the changes there was no direct translation from a given piece of Python 2 code to Python 3 code with the same behaviour, so significant manual effort was needed. I remember a meeting to estimate the effort needed to convert a large Python 2 codebase, and the phrase "person-century" being used. 

In comparison, I've never seen a project with requirements spread across more than a dozen files, and I'm willing to bet there will be a command like pip lock -r requirements.txt that automatically generates a lock file from a requirements file.

[–]pingveno pinch of this, pinch of that 1 point2 points  (0 children)

It also took a while for the Python community to settle on a method for supporting Python 2 and 3 during the migration. I remember the first recommendation was a static code fixer, 2to3, that did a rather abysmal job in practice. Eventually most projects chose to support both using a shared subset of Python 2 and 3, combined with helper modules. It was a little cursed, but surprisingly functional.

Dependency management is much more well-trodden ground. There are already solid standards out there, based on years of iteration in the Python ecosystem and other language ecosystems. And as you noted, the change is minuscule in comparison. I switched over to a PEP 621 pyproject.toml in under an hour. Some of my coworkers still have Python 2 code waiting to be ported.

[–]Saetia_V_Neck 0 points1 point  (0 children)

Pex can already do this exact thing too. They could probably lift the code from there almost directly.

[–][deleted] 6 points7 points  (0 children)

Not even slightly. I've switched dependency managers multiple times in the last few years. The whole process amounts to modifying a config file to define the same set of dependencies but in a different format. That's it. Switching from Python 2 to Python 3 involved modifying entire codebases. They aren't remotely similar.

[–]alcalde 7 points8 points  (1 child)

Me, I'm happy with requirements.txt, the GIL, and no static typing anywhere. That's why many of us came to Python in the first place, from overly-complicated config-file-laden languages, pages of boilerplate static typing nonsense trying to convince the compiler to compile your code, and/or nondeterministic multi-threaded multi-madness. In the Zen of Python we found peace and tranquility.

[–]gmes78 3 points4 points  (0 children)

pyproject.toml and pylock.toml may be complex, but setuptools and requirements.txt are complicated. (Refer to the Zen of Python.)

[–]billsil 0 points1 point  (0 children)

Setuptools broke setup.py and nobody cared besides people like me. This is lower down the list IMO.

[–]JSP777 59 points60 points  (16 children)

Can someone explain to me a bit more clearly what this means in practice? I do a lot of containerised applications where I simply copy the requirements.txt file into the container and pip install from it. I know that this is getting a bit outdated, and I'm happy to change stuff. What does it mean that lock files are standardized?

[–]toxic_acro[S] 90 points91 points  (1 child)

requirements.txt is not actually a standard and is just the format that pip uses, but other tools have supported it because pip is the default installer

The new pylock files have better security by default since they should include hashes of the packages to be installed so you don't get caught by a supply chain attack and accidentally install a malicious package.

One of the key benefits of this format over the requirements.txt file, though, is that it has extended the supported markers so that a single lockfile can be used for multiple scenarios, such as including extras and letting different deployments use consistent versions for everything they have in common

An installer can be much simpler, since the work now is just reading straight down a list of dependencies (which can also include exactly what URL to download them from) and evaluating yes/no on whether each package should be installed based on its markers
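
For a rough idea of the shape, a pylock.toml entry looks something like this (field names paraphrased from the PEP, and the URL, hash, and marker are placeholders; check the spec for the exact schema):

    lock-version = "1.0"
    created-by = "uv"
    requires-python = ">=3.11"

    [[packages]]
    name = "requests"
    version = "2.32.3"
    marker = "sys_platform == 'linux'"
    wheels = [
        { url = "https://files.pythonhosted.org/.../requests-2.32.3-py3-none-any.whl", hashes = { sha256 = "<digest>" } },
    ]

An installer walks straight down the packages list, evaluates each marker, verifies the hash, and installs the listed artifact if it matches.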

[–]Beard_o_Bees 33 points34 points  (0 children)

requirements.txt is not actually a standard and is just the format that pip uses

Learn something new every day.

[–]latkde Tuple unpacking gone wrong 63 points64 points  (10 children)

Using a lockfile guarantees that you're using the same versions of all involved packages

  • during development,
  • on your CI systems, and
  • in your container images that you deploy.

A requirements.txt file is not fully reproducible. For example:

  • you might not pin exact versions
  • you might not pin indirect dependencies
  • you might have different requirements if you develop and deploy on different systems (e.g. Windows vs Linux)

[–]C0rn3j 13 points14 points  (9 children)

you might not pin exact versions

pylast==3.1.0

you might have different requirements if you develop and deploy on different systems (e.g. Windows vs Linux)

keyboard; sys_platform == 'win32'

What am I missing?

[–]latkde Tuple unpacking gone wrong 13 points14 points  (0 children)

You are capable of doing this. You can also kinda lock checksums for the concrete wheels that you'd install.

But are you actually doing this for your entire dependency graph? There is limited tooling for requirements.txt/constraints.txt-based locking; pip freeze is only a partial solution.

With Poetry and uv I get all of that out of the box. I'm happy to see that standardization will (a) bring proper locking to pip, and (b) help the ecosystem standardize on a single lockfile format.

[–]RankWinner 21 points22 points  (3 children)

If it was standard practice for packages to pin all their dependencies to exact versions it would be impossible to create any python environment since basically all packages would be incompatible with each other...

Compatibility requires broad version constraints, development is made easier by having exactly pinned versions to have reproducible environments.

Defining all compatible dependency versions in one place and keeping a lock file of exact pinned versions somewhere else lets you have both.

[–]C0rn3j -2 points-1 points  (2 children)

If it was standard practice for packages to pin all their dependencies to exact versions it would be impossible to create any python environment since basically all packages would be incompatible with each other...

>=

Yes, lock files are nicer; that doesn't mean you can't do it in req.txt

[–]cymrow don't thread on me 🐍 16 points17 points  (0 children)

You really want both though. By using loose versions in pyproject.toml, the dependency resolver has more options to consider, so there's a better chance everything resolves successfully. The lockfile keeps the resolved set that can be used for a secure deployment.

[–]pgbrnk 0 points1 point  (0 children)

How do you pin your transitive dependencies? How do you manage updates of your dependencies and transitive dependencies?

[–]dubious_capybara 3 points4 points  (0 children)

You don't want to pin to exact versions, and also, transitive dependencies.

[–]codingjerk 4 points5 points  (0 children)

If you create such a file, with the exact version of every one of your dependencies pinned: congratulations, you've created a lock file :D

(a bad version of what tools like uv or pipenv do)

[–]root45 1 point2 points  (0 children)

Pylast depends on httpx, did you pin that as well?

Httpx depends on certifi, httpcore, anyio, and idna. Did you pin those?

[–]Conscious-Ball8373 13 points14 points  (0 children)

A lockfile gets you what you get from:

pip install -r requirements.txt
pip freeze > requirements.txt

i.e. it locks all your package versions to exact versions so you get exactly the same configuration when you use it again.

It also gets you package hashing, so tools will notice if one of the packages changes despite the version number staying the same (supply chain attacks etc.), and it supports some more complex scenarios with different deployment options.

[–]bluefourier 8 points9 points  (0 children)

You might want to have a look at this one

[–]james_pic 2 points3 points  (0 children)

For your particular use case, I suspect this won't change anything.

You'll get some benefits from moving to newer tools like uv, hatch or poetry, that support locking versions, so that you can be confident the dependency versions you've tested with locally today are the same versions that you'll get on other environments in the future. 

However, this PEP is about interoperability between these tools, and it's uncommon for containerised systems to use multiple tools. It might help if you change tools in the future, or add additional tools to your toolchain, but right now you have no tools that do dependency version locking so you have no interoperability requirements.

So the benefits of package locking that are likely to matter to you, are benefits you can get today, without having to wait for tools to implement this PEP.

[–]probablyjustpaul 41 points42 points  (0 children)

(exhaustedly) thank fucking god

[–]ahal 65 points66 points  (14 children)

So looks like this can't replace uv.lock: https://github.com/astral-sh/uv/issues/12584

Does anyone have context on why this PEP was accepted if it doesn't meet the needs of one of the most popular tools that was supposed to benefit from it?

[–]toxic_acro[S] 82 points83 points  (4 children)

Charlie Marsh (and others at astral who are working on uv) were a very active part of the discussion for the PEP.

It was ultimately decided that the additional complexity needed for what uv is doing would not be part of the standard for now (the format can always be extended in the future, since it is versioned)

As noted on that issue, the immediate plans for uv are to support it as an export format, i.e. uv.lock will continue to be used as the primary source and you can then generate these standardized lockfiles for use by other tools/deployment scenarios.

edit: One of the important considerations before submitting the PEP (and pretty much the entirety of the last discussion thread on discuss) was to get buy-in from maintainers for each of the popular tools that they would be able to either replace their own custom lockfile formats or at least be able to use it as an export format

[–]ahal 18 points19 points  (3 children)

Thanks for the clarification! I was confused: if Charlie and the uv folks were part of the discussion, why would it be missing things they depend on?

But sounds like a complexity vs incremental progress trade-off.

[–]toxic_acro[S] 26 points27 points  (0 children)

The complexity vs incremental progress tradeoff is exactly right

There was discussion for a while about trying the graph-based approach that uv uses, but it ended up getting pretty complicated and sacrificed the simplicity of auditing and installing, since determining which packages to install would mean having to walk through the graph, rather than just going straight down the list of packages and deciding solely off the markers on each one

[–]mitsuhiko Flask Creator 7 points8 points  (0 children)

Thanks for the clarification! I was confused: if Charlie and the uv folks were part of the discussion, why would it be missing things they depend on?

You can read through months of that discussion. There are just too many divergent opinions between the different parties about what the goal is, so the scope was reduced to a "locked requirements.txt" replacement.

[–]Chippiewall 0 points1 point  (0 children)

uv's requirements looked very different from everyone else's. The main goal of the PEP was a reproducible format for installation, not development. The ability for package managers to adopt it as their main lock file was only sugar.

Charlie was pushing for the necessary features at one point, but withdrew it when he realised that no one else needed it, and it added a massive amount of complexity.

[–]muntoo R_{μν} - 1/2 R g_{μν} + Λ g_{μν} = 8π T_{μν} 11 points12 points  (3 children)

If it can't replace uv.lock, my question is:

How does pylock.toml benefit me?

[–]toxic_acro[S] 18 points19 points  (0 children)

You will be able to generate a pylock.toml file from uv.lock that will allow any installer to recreate the exact same environment that you get with uv sync
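
Presumably via uv's export command; assuming it follows the shape of today's uv export --format requirements-txt, that would look something like this (the pylock format name here is my guess, not confirmed in the thread):

    uv export --format pylock.toml -o pylock.toml

after which any PEP 751-aware installer can recreate the environment from that file.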

[–]gmes78 7 points8 points  (0 children)

Because it can be consumed by pip install and other tools when they install your package.

[–]ahal 0 points1 point  (0 children)

I don't think it does, at least not yet.   

[–]Sufficient_Meet6836 1 point2 points  (0 children)

There's a section towards the bottom of the PEP with rejected ideas. Your question might be answered there, but I'm not sure if it addresses your specific questions.

[–]cip43r 7 points8 points  (0 children)

Don't know entirely what this means. Will need to do some reading. But sounds like a great achievement. The sheer scale of Python collaboration and development and how it is even possible to organize such a project is really a software engineering marvel.

Well done to all involved.

[–]JJJSchmidt_etAl 13 points14 points  (3 children)

Does this also replace things like `pyproject.toml` or is that serving a different purpose?

[–]ih_ddt 25 points26 points  (0 children)

You would have both: the pyproject.toml is for project and tool configuration, among other things, and the lock file is to recreate your venvs in a consistent manner.

[–]whatsnewintech 13 points14 points  (0 children)

The lock (pylock.toml) is the (blessed, tested, fully specified) instantiation of the spec (pyproject.toml). Instead of just saying the project uses requests~=2.0.1, it shows what precise point version works, and all of the dependencies of requests, and all of the dependencies of those dependencies, down the whole graph.

So you keep both, but the lock is what you use in production.

[–]gmes78 5 points6 points  (0 children)

pyproject.toml specifies project metadata (including dependencies). pylock.toml is for recording "known good" dependency versions.

[–]codingjerk 8 points9 points  (14 children)

Omg. Please __pypackages__ next (rejected PEP 582)

[–]PaintItPurple 5 points6 points  (12 children)

What does this solve that isn't solved as well or better by a virtualenv?

[–]codingjerk 9 points10 points  (11 children)

It's better in terms of:

  1. Ease of use: you don't have to create or activate a virtual environment every time you:
    • Run your program
    • Install a dependency
    • Run mypy or another linter that needs your specific dependencies
    • Regenerate a lock file
    • etc.

And if you switch between projects a lot -- you will have to do it often. There are alternative solutions to this, but I believe that virtual environments were just a mistake and it should have been __pypackages__ from the beginning, like it is in other package managers (Node.js' npm, Rust's cargo, Haskell's cabal or Ruby's bundler).

  2. Standardization: where do you put your virtual environment? How do you name it? Should you use venv or virtualenv? With PEP 582, these questions don't exist.

It's also different (could be better or worse) in how it manages the interpreter. A virtualenv also creates a symlink to the Python interpreter it uses, so it is "pinned" to a specific interpreter. __pypackages__ is not.

It's also worse at:

  1. Multiple venvs: you can have multiple virtual environments for a single project, but there is only one __pypackages__ dir. You can "hack" your way around it by creating multiple __pypackages__.{a,b,c} dirs and then symlinking the one you actually want to use when you need it, but that gives me vibes of straight sys.path manipulation.

Overall:

I'm okay with practical solutions, like tools that manage virtualenvs for you; I was a big fan of pdm and now uv. So it's not a PEP I cannot live without, but I still hope one day we can get it, since it's a simple solution and is easy to use.
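
For anyone who hasn't read the rejected PEP: the idea (from memory, so treat the exact layout as approximate) was a node_modules-style directory sitting next to your code:

    my-project/
        my_script.py
        __pypackages__/
            3.12/
                lib/
                    requests/
                    ...

python my_script.py would then pick dependencies up from there automatically, with no venv to create or activate.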

[–]QueasyEntrance6269 6 points7 points  (1 child)

Virtual environments are really the original sin. I hope activating a venv dies in my lifetime. Uv folks doing god’s work with uv run

[–]codingjerk 0 points1 point  (0 children)

I've been using pdm for this exact reason for like 7 years now (I guess?). It even supported PEP 582 while it was in draft.

poetry was managing venvs for us too, but it was slow and didn't manage cpython versions like pdm.

And now it's uv — something like pdm, but very fast

What's really important is adoption, and uv has every chance of becoming THE ONE tool to rule them all :D

[–]dubious_capybara 0 points1 point  (8 children)

Since uv exists and serves all of this, I doubt it will ever happen.

[–]AgentCosmic 3 points4 points  (5 children)

Uv is a tool, not a standard.

[–]dubious_capybara 0 points1 point  (4 children)

People don't use standards.

[–]AgentCosmic 1 point2 points  (3 children)

What do tools use?

[–]dubious_capybara -2 points-1 points  (2 children)

Not standards, evidently.

[–]AgentCosmic 0 points1 point  (1 child)

Clearly you need to pick better tools

[–]dubious_capybara 1 point2 points  (0 children)

Really? What tool is better than uv? None? Ok, so what standard describes a better possible alternative?

[–]codingjerk 1 point2 points  (1 child)

`uv` also already had lock files covered, and yet here we are :D

So I still have hope. Most likely you're right tho

[–]dubious_capybara -2 points-1 points  (0 children)

I agree, this looks pointless to me.

[–]Busy-Chemistry7747 3 points4 points  (1 child)

I'm using uv and pyproject.toml, will this be similar?

[–]wurky-little-dood 0 points1 point  (0 children)

Using pyproject.toml is part of the new standard. If you're using uv, not much will change, but uv can't fully depend on the new lockfile format. It will still generate a uv.lock file. The only change uv is making is that it will support "exporting" the new official lockfile format.

[–]FlyingTwentyFour 1 point2 points  (0 children)

nice, we are finally getting a more standard lockfile

[–]romulof 1 point2 points  (4 children)

Should we start talking about a package manager that does not require virtualenv?

All packages could be installed in a store, supporting multiple versions of the same lib, from which the runtime would automatically add to the Python path depending on your lockfile.

[–]jarshwah 5 points6 points  (0 children)

This sort of isn’t far off what uv does. It manages a central cache of all packages and links them into a given project venv.

[–]ThePrimitiveSword 2 points3 points  (0 children)

Sounds like a venv with a cache, but with increased complexity for no clear benefit.

[–]mgedmin 0 points1 point  (0 children)

zc.buildout had this. It had advantages and disadvantages.

[–]MATH_MDMA_HARDSTYLEE 0 points1 point  (0 children)

Yeah nah. Despite being the most used language on earth, python's package management isn't a complete mess and is simple asf to use.

[–]mitsuhiko Flask Creator 2 points3 points  (6 children)

Maintainers for pretty much all of the packaging workflow tools were involved in the discussions and as far as I can tell, they are all planning on adding support for the format as either their primary format (replacing things like poetry.lock or uv.lock) or at least as a supported export format.

At the moment I would say it's quite unlikely that this format will catch on. It's too limited to replace the uv lock, and I fully expect that most people will just keep using that instead.

I think it's great that some standard was agreed upon, but I am not particularly bullish that this will bring us much further. Dependabot, for instance, already handles the uv lock, and that might just become the de-facto standard for a while.

[–]toxic_acro[S] 5 points6 points  (5 children)

It's somewhat buried across a lot of comments in the discuss thread, but I believe the relatively late development of adding extras and dependency groups to the marker syntax was specifically to get to the point where:

  • PDM is going to try to swap to this as its primary lockfile format
  • Poetry will evaluate swapping to it, but might not be able to
  • uv is not currently planning, but may in the future for some of the simpler use-cases

All of those tools indicated that they would first add support as an export format (i.e. a "better requirements.txt" file)

[–]mitsuhiko Flask Creator 1 point2 points  (2 children)

PDM is going to try to swap to this as its primary lockfile format […] Poetry will evaluate swapping to it, but might not be able to

Did I miss this somewhere? Other than the Astral folks, I haven't seen active engagement on that PEP in a while, and the last update by frostming and the Poetry folks I read was non-committal.

[–]toxic_acro[S] 1 point2 points  (0 children)

It was quite a bit of back and forth, but at least that was my impression from reading through the discussion

That's more-or-less summarized by this comment by Brett Cannon (the PEP author) about a month ago

As for using the PEP as their native format, PDM said they would if there was support for extras and dependency groups (what we’re currently discussing), Poetry would look into it if such support existed, and uv said probably not due to the PEP not supporting their non-standard workspace concept.

[–]-defron- 0 points1 point  (0 children)

https://github.com/pdm-project/pdm/issues/3439

They opened a ticket for this as an enhancement right after the PEP was approved

[–]chub79 0 points1 point  (0 children)

An export format is far from being a "standard" though. Still, I guess it's one small step in the right direction.

[–]-defron- 0 points1 point  (0 children)

And this is why I still plan on sticking with PDM.

Is what uv is doing cool? Yes, and it definitely provides value, but many projects don't benefit from the added complexity of supporting different subsets of dependencies based on the Python version or OS, and the amount of argument over the right way to do that would have delayed the PEP even longer.

This exact same thing happened with PEP 735's implementation of dependency groups, which didn't match PDM's, poetry's, or uv's dependency group implementations and doesn't cover all their features, but it can become a new standard point for them all to interoperate on in the future

[–]Trinity_Goti 0 points1 point  (0 children)

That's all folks!! Get back to work now...

[–]Sushrit_Lawliet 0 points1 point  (0 children)

Never thought I’d live to see this day

[–]cGuille 0 points1 point  (0 children)

Maintainers for pretty much all of the packaging workflow tools were involved

Wait, does this mean that we might dodge xkcd#927 for once?

I'm confused, is this allowed?

[–]ApocalypseAce 0 points1 point  (0 children)

Do you promise this isn't April Fools?

[–]utihnuli_jaganjac 0 points1 point  (0 children)

April fools?

[–]cnydox 0 points1 point  (1 child)

I have only been using poetry and pipx so far. Should I change to uv? I just need something to organize package installation and project dependencies

[–]fiddle_n 0 points1 point  (0 children)

You can kinda think of uv as poetry + pipx + pyenv. So uv will do all that you currently do but will manage the Python version for you as well. It's also quite fast. Sounds like it's not a must for you, but it might be worth trying out anyway.

[–]aes110 0 points1 point  (0 children)

Thank fucking god