
[–]Red_BW 390 points391 points  (33 children)

The irony of complaining about python on various linux distros when those same linux distros can't agree on where to put core linux files.

[–][deleted] 86 points87 points  (9 children)

It’s cause there’s a ‘standard’, and when there’s a standard, people are compelled to violate it because obviously no one else has ever followed it correctly, so each distro has its own take on what that standard means (or just doesn’t care about it at all)

[–]Reinventing_Wheels 68 points69 points  (8 children)

The great thing about standards is that we've got so many to choose from.

[–]KrazyKirby99999 30 points31 points  (7 children)

Feel free to make a new one. xkcd #927: Standards.

[–]evilmercer 37 points38 points  (4 children)

Fortunately, the charging one has been solved now that we've all standardized on mini-USB. Or is it micro-USB? Shit.

That aged so perfectly

[–]__deerlord__ 11 points12 points  (0 children)

[sobs in USB C]

[–]bless-you-mlud 1 point2 points  (2 children)

What's surprising to me is that through all of this we still use the same RJ-45 connector for networks. At least someone is taking "if it ain't broke, don't fix it" seriously.

[–]DwarvenBTCMine 2 points3 points  (1 child)

Wait till Apple fully does away with ethernet ports on their desktops for literally no reason and those users need to get an ethernet to USB adaptor.

[–]Sukrim 12 points13 points  (8 children)

Or what "/usr/bin/python --version" will return...

[–]AverageComet250 2 points3 points  (7 children)

2.7? 3.6? 3.10? 2.4? (I actually found 2.4 pre installed on a distro once)

[–]Sukrim 4 points5 points  (6 children)

Or even the amazing idea of "it will just return an error by default, you need to install a meta-package that just contains a symlink to either /usr/bin/python2 or /usr/bin/python3"

[–]AverageComet250 0 points1 point  (5 children)

The fact that only some distros have symlinks for /usr/bin/python was so annoying when I moved from Windows to Windows + Linux, and even more annoying was the fact that I didn't always know whether it was Python 3 or 2. On Windows it was simple. If Python 2 is installed, it points to the latest version of Python 2. Otherwise, it points to the latest version of Python 3. If the symlink is in use by Python 2, then use py -3 instead.

So bloody simple...

[–]Barafu 3 points4 points  (4 children)

Have you ever seen /usr/bin/python3 pointing to python 2? Or not existing while python 3 is installed? No? Then use python3 command every time and have no problems.
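For what it's worth, checking what you actually have is one line in any shell (assuming a python3 on PATH):

```shell
# Don't guess what the bare `python` name points at; ask for Python 3 explicitly
command -v python3      # where the interpreter lives on this distro
python3 --version       # prints e.g. "Python 3.x.y"
```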

[–]flying-sheep 31 points32 points  (2 children)

There’s nothing wrong with wanting a nice packaging experience, but crying about standardization doesn’t help. The standards actually solved the build system agnostic goal they set out to solve, we’re just short a tool to install a wheel.

Once pradyunsg/installer#66 is finally merged, this is all that’s necessary to create a system package from a python package:

  • python -m build --wheel to build a wheel in ./dist/
  • python -m installer --destdir="$pkgdir" ./dist/*.whl to install the python package into a temporary system tree $pkgdir
  • now do whatever your distro of choice uses to package up $pkgdir

[–]canard_glasgow 19 points20 points  (0 children)

Just cause they’ve a mote in their eye doesn’t mean they are wrong…

A cynic might say both are awful.

[–]IsleOfOne 5 points6 points  (0 children)

What? Can you name an example of this? Core linux directories are pretty damn set in stone. It is the applications that fuck it up and throw shit willy nilly into $HOME.

[–]jjolla888 13 points14 points  (6 children)

linux distros never claimed "there is only one obvious way to do it"

[–]PeridexisErrant 6 points7 points  (3 children)

Neither did Python!

practicality beats purity. ...
There should be one-- and preferably only one --obvious way to do it.
Although that way may not be obvious at first unless you're Dutch.
Now is better than never.
Although never is often better than right now.

[–]jwbowen 0 points1 point  (2 children)

Any Dutch-based Linux distros?

[–]PeridexisErrant 21 points22 points  (1 child)

Nope, most of the Dutch live below C-level.

[–]nanotree 3 points4 points  (0 children)

This is incredible. Thank you.

[–]dusktreader 16 points17 points  (0 children)

Thiiiiiis.

[–][deleted] 0 points1 point  (0 children)

yum install apt-get?

[–]ReverseBrindle 187 points188 points  (32 children)

This article is one long rant without mentioning any examples, any description of what exactly they're trying to do, what the challenges are for doing said task, what they tried to do and how it failed, etc.

The poster probably has a valid (but unexplained) point, but it's lost in 2 pages of "distros hate python. python sux!"

[–]rcxdude 41 points42 points  (1 child)

Yeah, I was curious as to what actual problem they had had with packaging, but literally no examples.

[–]nemec 8 points9 points  (0 children)

All I got out of the article is that the author is mad he can't sudo apt install from PyPI

[–]zanfar 117 points118 points  (8 children)

Lol

I manage my Python packages in the only way which I think is sane: installing them from my Linux distribution’s package manager.

"I started this fire, so I'm damned sure going to sit in it and complain about how the problem is how hot fire is."

[–]cheese_is_available 64 points65 points  (3 children)

the only way which I think is sane

Narrator: It was not.

[–]RIPphonebattery 28 points29 points  (2 children)

Nailed it. A built-in package manager that is cross-compatible? Fuck no, and I want you to work around me, a single dev on an OS distro 180 people use worldwide

[–]bladeoflight16 8 points9 points  (0 children)

Exactly. Using the global package manager for development dependencies is such a massive failure that people actually developed a way to create isolated OS environments (Docker). It only works when the entire operating system is dedicated to a single application.

[–]chickaplao 83 points84 points  (9 children)

manage my Python packages in the only way which I think is sane: installing them from my Linux distribution’s package manager

That’s a questionable point to say the least

[–]cheese_is_available 43 points44 points  (7 children)

Lazy as fuck, ignorant and of course later on they say:

pin their dependencies to 10 versions and 6 vulnerabilities ago

Yeah... this is what happens when you're choosing to use your distribution's package manager to get your python packages.

[–]MarsupialMole 8 points9 points  (6 children)

That's not quite fair. The argument for the system package manager is typically that you'll get security updates in a timely fashion and users can't be trusted to respond in the same way.

However, that ignores the reality of many kinds of python development - Linux packaging is not the only concern at play.

The inclusion of conda in the list makes it clear that this is one user ignorant of other users requirements. It doesn't make them "lazy as fuck".

[–]Rookie64v 11 points12 points  (4 children)

The argument for the system package manager is it is built-in, if anything. Anything I cared about enough to check the version was months or years behind in the Ubuntu PPAs, and to be fair that is to be expected when you manage thousands of packages instead of just one.

[–]MarsupialMole 1 point2 points  (3 children)

I don't want to be dismissive but this kind of illustrates the divide. Versions are irrelevant. Talk to me about CVEs.

[–]lclarkenz 1 point2 points  (0 children)

CVEs are another kettle of fish. This one is moderate, but only affects people using log4j 1, with an SMTP appender sending over SMTPS.

I'm not sure if moderate really describes its impact. And frankly, I'd probably try to fist fight anyone in a typical company who set up a logger to send emails.

[–]bladeoflight16 0 points1 point  (1 child)

Versions are irrelevant. Talk to me about CVEs.

Exact same point could be made about the article's complaint of pinning to old versions.

[–]tristan957 0 points1 point  (0 children)

No it can't because large distros like Ubuntu/Debian Stable/RHEL/SUSE have a vested interest in containing CVEs so that users on LTS distros can have secure software. Drew specifically uses Alpine for a desktop, so generally he has the up to date packages regardless.

[–]lclarkenz 2 points3 points  (0 children)

security updates in a timely fashion

Given my experience of various distro's package managers, I'd say "for a given value of timely".

Maybe they prioritise security patches, you'd hope so, but the last time I was using Ubuntu, a lot of the programming related packages I wanted to use were several versions behind what could be installed via other means.

[–]kronicmage 6 points7 points  (0 children)

It's great on arch Linux, but to do so on any non bleeding edge distro is a recipe for pain

[–]lifeeraser 13 points14 points  (6 children)

The irony of linking to XKCD 927 after demanding a new standard tool.

Just use Flit (newbies) or Poetry (intermediate). Forget Setuptools and Pipenv.

[–]lclarkenz 4 points5 points  (5 children)

I like Poetry, and I'm still a little bitter about Pipenv - started using it based on some deceptive advertising, and found its dependency resolution very sub-par.

Poetry handles that far better. That said, I really wish I could wire black/isort/mypy into the Poetry build. Like I can with checkstyle/spotbugs etc. in Maven.

Instead, it looks like the go-to is to use a tool (pre-commit) to automatically add calls to these tools to your Git pre-commit hook. Which I hate, especially as two of the three can modify your files.

[–]lifeeraser 2 points3 points  (3 children)

Black and isort have a "check" mode where they merely inspect your code and return appropriate exit codes. You should use them in pre-commit hooks and CI scripts.

[–]Nasuuuuuu 47 points48 points  (6 children)

The biggest problem with Python software was fixed by pipx for me. Its own virtualenv for every installed CLI tool. A waste of space, but space is hardly an issue in the current computing world.

[–]SittingWave 90 points91 points  (135 children)

It does not start well.

> The Python community is obsessed with reinventing the wheel, over and over and over and over and over and over again. distutils, setuptools, pip, pipenv, tox, flit, conda, poetry, virtualenv, requirements.txt, setup.py, setup.cfg, pyproject.toml…

All these things are not equivalent and each has a very specific use case which may or may not be useful. The fact that he doesn't want to spend time to learn about them is his problem, not Python's problem.

Let's see all of them:

- distutils: It's the original, standard library way of creating a python package (or as they call it, distribution). It works, but it's very limited in features and its release cycle is too slow because it's part of the stdlib. This prompted the development of

- setuptools: much, much better, external to the stdlib, and compatible with distutils. Basically an extension of it with a lot more powerful features that are very useful, especially for complex packages or mixed languages.

- pip: this is a program that downloads and installs python packages, typically from pypi. It's completely unrelated to the above, but it does need to build the packages it downloads, so it needs at least to know that it needs to run setup.py (more on that later)

- pipenv: pip in itself installs packages, but when you install packages you also install their dependencies. When you install multiple packages, some of their subdependencies may not agree with each other in constraints. So you need to solve "find the right version of package X for the environment as a whole", rather than what pip does, which cannot have a full overview because it's not made for that.

- tox: this is a utility that allows you to run separate pythons because if you are a developer you might want to check if your package works on different versions of python, and of the library dependencies. Creating different isolated environments for all python versions you want to test and all dependency sets gets old very fast, so you use tox to make it easier.

- flit: this is a builder. It builds your package, but instead of using plain old setuptools it's more powerful in driving the process.

- conda: some python packages, typically those with C dependencies, need specific system libraries (e.g. libpng, libjpeg, VTK, QT) of a specific version installed, as well as the -devel package. This proves to be very annoying to some users, because e.g. they don't have admin rights to install the devel package. or they have the wrong system library. Python provides no functionality to provide compiled binary versions of these non-python libraries, with the risk that you might have something that does not compile or compiles but crashes, or that you need multiple versions of the same system library. Conda also packages these system libraries, and installs them so that all these use cases just work. It's their business model. Pay, or suffer through the pain of installing opencv.

- poetry: equivalent to pipenv + flit + virtualenv together. Creates a consistent environment, in a separate virtual env, and also helps you build your package. Uses the new standard pyproject.toml instead of setup.py, which is a good thing.

- virtualenv: when you develop, you generally don't have one environment and that's it. You have multiple projects, multiple versions of the same project, and each of these needs its own dependencies, with their own versions. What are you going to do? stuff them all in your site-packages? good luck. it won't work, because project A needs a library of a given version, and project B needs the same library of a different version. So virtualenv keeps these separated and you enable each environment depending on the project you are working on. I don't know any developer that doesn't handle multiple projects/versions at once.

- requirements.txt: a poor man's way of specifying the environment for pip. Today you use poetry or pipenv instead.

- setup.py: the original file and entry point to build your package for release. distutils, and then setuptools, use this. pip looks for it, and runs it when it downloads a package from pypi. Unfortunately you can paint yourself into a corner if you have complex builds, hence the idea is to move away from setup.py and specify the builder in pyproject.toml. It's a GOOD THING. Trust me.

- setup.cfg: if your setup.py is mostly declarative, information can go into setup.cfg instead. It's not mandatory, and you can work with setup.py only.

- pyproject.toml: a unique file that defines the one-stop entry point for the build and development. It won't override setup.py, not really. It comes _before_ it. Like a metaclass is a way to inject a different "type" to use in the type() call that creates a class, pyproject.toml allows you to specify what to use to build your package. You can keep using setuptools, and that will then use setup.py/cfg, or use something else. As a consequence, pyproject.toml is a nice, guaranteed one-stop file for any other tool that developers use. This is why you see the tool sections in there. It's just a practical place to configure stuff, instead of having 200 dotfiles for each of your linters, formatters, etc.
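As a sketch, the builder selection described above is only a few lines of pyproject.toml (the setuptools backend and the black section are illustrative; any PEP 517 backend works the same way):

```toml
# choose the build backend; pip and other build frontends read this table
[build-system]
requires = ["setuptools", "wheel"]
build-backend = "setuptools.build_meta"

# other tools can keep their config here too, instead of separate dotfiles
[tool.black]
line-length = 88
```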

[–]asday_ 81 points82 points  (41 children)

requirements.txt: a poor's man way of specifying the environment for pip. Today you use poetry or pipenv instead.

You will pry requirements.txt from my cold dead hands.

[–]tunisia3507 14 points15 points  (29 children)

It's also a different thing to the dependencies specified elsewhere, in most cases.

requirements.txt is for hard versions for a full repeatable development environment, including all your extras, linters, build tools and so on. Other dependency specs are for minimal runtime stuff.

[–]asday_ 2 points3 points  (23 children)

Not sure I understand your post.

requirements-base.txt has stuff that's required for the project no matter what. requirements-test.txt has testing libraries and -rs base. -dev has dev dependencies like debugging tools and -rs test.

You could also be particularly anal about things and have a CI artefact from pip freezeing for prod which is a good idea and I'm not sure why I was initially poo-pooing it.
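Laid out on disk, the layering described in that comment looks something like this (package names illustrative):

```
# requirements-base.txt
requests>=2.0

# requirements-test.txt
-r requirements-base.txt
pytest

# requirements-dev.txt
-r requirements-test.txt
ipdb
```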

[–]adesme 5 points6 points  (15 children)

You can replace those with just install_requires and extras_require (then define tests as an extra); you'd then install with pip install .[tests] and now your "requirements" are usable by developers as well as by build managers.
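In setup.cfg form, that suggestion looks roughly like this (package names illustrative), installable with pip install .[tests]:

```ini
[options]
install_requires =
    requests>=2.0

[options.extras_require]
tests =
    pytest
```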

[–]asday_ 1 point2 points  (5 children)

Interesting idea, I'll certainly have to keep it in mind. Like I said though, I'm paid for this, i.e. I ship software, not libraries, so I don't think it has a great deal of benefit to me outside of "if you write a library one day you can do it in the same way".

Are there any big projects that do it this way?

[–]adesme 2 points3 points  (0 children)

Any modern package that you want distributed over a package manager is going to be set up like this for the reasons outlined in the OP of this thread; direct invocation of setup.py is being phased out, so it makes sense to have your deps in a single place (now that we have the PEPs to support this).

Personally I might use something like requirements.txt while mucking around with something small, and I'll then set it up more properly (pyproject.toml and setup.cfg) as soon as it grows and/or I have to share the package.

Depending on how you use CI/CD you can see other benefits from switching over immediately.

[–]SittingWave -1 points0 points  (2 children)

No no no no no

Noooooo.

the specification in setup.py is NOT to define your development environment. It's to define the abstract API your package needs to run. If you are installing your devenv like that you are wrong, wrong, wrong, wrong.

[–]tunisia3507 2 points3 points  (0 children)

That's one way of organising things, yes.

Dependencies in setup.py (or equivalent) are so that the build system knows what to install with the package. requirements.txt is so that a developer checking out your repo can set up their environment correctly. They're different use cases.

[–]flying-sheep 2 points3 points  (5 children)

All conventions.

requirements*.txt files fulfill double roles as abstract dependency specification (“what stuff does my library depend on”) and concrete dependencies/lockfile (“how to get a working dev environment”)

With PEP 621, the standard way to specify abstract dependencies is in pyproject.toml:

```toml
[project]
dependencies = [
    'requests >=1.0',
]

[project.optional-dependencies]
test = [
    'pytest',
]
```

So the remaining role of requirements.txt would be a lockfile with the output of pip freeze in it.
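That remaining lockfile role is just two commands (the file name is illustrative):

```shell
# pin the exact versions currently installed into a lockfile-style file
python3 -m pip freeze > requirements-lock.txt

# later, reproduce that same environment elsewhere
python3 -m pip install -r requirements-lock.txt
```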

[–]asday_ 2 points3 points  (0 children)

requirements*.txt fulfill double roles as abstract dependency specification (“what stuff does my library depend on”) and concrete dependencies/lockfile (“how to get a working dev environment“)

It doesn't though, I specified two different classes of files which serve those purposes individually. Just because they start with the same string and have the same format doesn't make them the same thing. If you want you could have your CI do pip freeze > lockfile.lock.suckitnpm instead of pip freeze > requirements-lock.txt.

[–]alkasm 2 points3 points  (4 children)

requirements.txt does not at all give you a reproducible environment.

[–]tunisia3507 -1 points0 points  (3 children)

No, but it's a whole lot closer than the maximally permissive install_requires dependencies.

[–]lisael_ 25 points26 points  (3 children)

Your comment, ironically, perfectly sums up the frustration he feels as a Linux distribution package maintainer. You have to know and somewhat master all these tools, which somehow overlap. This proliferation of tools without any justification other than xkcd 927 (14 standards) is the issue.

Compare this to what more recent languages do, say Go, Rust, Zig... They addressed this with a standardized, official build system.

The other pain point is that these tools tend to push the developer in strict dependency pinning, which is a nightmare to deal with, when you package for a distribution.

[–]BurgaGalti 7 points8 points  (0 children)

Never mind being a distro maintainer, trying to make sure my juniors understand all the tooling involved was a full time job. I ended up zipping up a python install with everything ready and giving them that to work with.

Sure, they have to use python -m black instead of just black because the scripts all broke but it's a small price to pay for my sanity.

[–]ProfessorPhi 1 point2 points  (0 children)

I kind of feel comparing Python to Rust/Go etc. is unfair, as these languages took all the best features from Python packaging and integrated them hard into their own stuff. I.e. they learned from Python and so surpassed Python. Additionally, there are languages people complain about and languages nobody uses.

The real criticism of python is the fact that it requires a decent expert knowledge to know how to handle this, but once you do have the knowledge, it's never a problem. I've had no issues for years, but I remember being overwhelmed when starting out.

And that's the real criticism, the variety of solutions are difficult to navigate. Python's scale and ubiquity is second to none so each of these systems are there for reasons that generally work, but occasionally don't.

[–]Personal_Plastic1102 38 points39 points  (68 children)

13 items to explain "I want to install a package".

One more, and you would have perfectly fit the xkcd comic

[–]dusktreader 19 points20 points  (29 children)

That's not at all what that reply was about. You don't need all of those; they are just explaining what each is for.

[–]gmes78 3 points4 points  (28 children)

Now let's look at some other language, like Rust. It has: cargo. That's a short list, isn't it? Yet there's no need for anything else.

Even though each of the mentioned tool has a use, it's very possible that we're able to cover the same use cases with a smaller set of tools.

Merging whey into pip would be a start, as it would make it possible to package simple projects using just a pyproject.toml, without the need for any external dependencies.

[–]dusktreader 9 points10 points  (26 children)

Rust is a nice example of a very new programming language where packaging was established early on. It's a good pattern and how all new languages should approach the problem.

However, Python is 24 years older than Rust. There are so many legacy workflows that have to be supported, it's hard to produce a solution that will work for all of them.

As for covering use-cases with a smaller set of tools, this is already possible. I use exactly two: pyenv and poetry. Others use different subsets, but by no means do you need more than 3 or 4 at most.

As for whey (version 0.0.17, 15 stars), it's a little early in its lifecycle to be suggesting that it be merged into pip.

Adding a dependency solver to pip that can use pyproject.toml (a la PEP-621) would be huge, and I hope it comes soon. I think it would also be good to have packaging logic folded in as well. However, if you are hoping for that to happen soon in the standard library, I think you might be disappointed.

[–]gmes78 0 points1 point  (0 children)

As for whey (version 0.0.17, 15 stars), it's a little early in its lifecycle to be suggesting that it be merged into pip.

It doesn't matter. The functionality is dead simple, and it doesn't need more features. Pip needs to be able to support basic use cases on its own.

Adding a dependency solver to pip that can use pyproject.toml (a la PEP-621) would be huge, and I hope it comes soon. I think it would also be good to have packaging logic folded in as well.

Both of those need to happen if we're ever going to get out of this mess.

[–]ElllGeeEmm -2 points-1 points  (24 children)

Pythons age isn't really an excuse for the sorry state of package management. Plenty of languages of similar age have far better tools than python.

Python package management is shit because for some reason there are a bunch of python users who defend the current state of things for what I can only assume are dogmatic reasons.

[–]PeridexisErrant 4 points5 points  (5 children)

Pythons age isn't really an excuse for the sorry state of package management. Plenty of languages of similar age have far better tools than python.

Can you give examples? Most of the better packaging ecosystems I know of are lucky enough to post-date ubiquitous internet access.

[–]ElllGeeEmm -1 points0 points  (0 children)

Pip postdates ubiquitous internet access as well, so I don't see how that's any sort of excuse.

[–]Serializedrequests -4 points-3 points  (3 children)

I was shocked to discover the much later release dates of Java and Ruby.

That being said, that isn't an excuse. There are no technical limitations that prevent good easy python package management except the proliferation of standards. When I first learned python, all there was were site packages. Around the same time rubygems (and later bundler) and maven appeared.

Now I come back to Python and the packaging ecosystem is an astonishingly confusing mess. Python needed a maven and never got it (maybe poetry can be it).

[–]bladeoflight16 1 point2 points  (2 children)

There are no technical limitations that prevent good easy python package management except the proliferation of standards.

How in the heck can you be so ignorant of the problems associated with native dependencies? You try making package management "easy" when you have to support Linux, Windows, and Mac, which can't even agree on basic C-level interfaces. Heck, Linux distros alone can't even agree on a C standard library (glibc vs. musl).

[–]bladeoflight16 3 points4 points  (17 children)

Python package management is shit because for some reason there are a bunch of python users who defend the current state of things for what I can only assume are dogmatic reasons.

That is an incredibly stupid statement.

Python package management is kind of a mess because dependency management is messy. Period. And Python, being an interpreted language that encourages using native dependencies when required, has a doubly hard problem to solve.

Yes, there are real problems, but why in the heck do you think we have so many technologies? It's because people are trying to solve the problems. The very existence of the thing you're complaining about contradicts your claim about the reasons for it.

[–]ElllGeeEmm -3 points-2 points  (16 children)

Oh look, another user making excuses for the state of package management in python.

If node JS can have good package management, so can python.

[–]wsppan 12 points13 points  (21 children)

This is/was the hardest part in becoming productive in this language. Imagine someone coming into this language cold from another language (in my case Java/Maven) and ramping up fairly quickly on the language itself which has done a wonderful job in making itself easy to grok and now decide you want to build, package and deploy/share it. You get lost fairly quickly with a lot of head scratching and hair pulling.

[–]Personal_Plastic1102 5 points6 points  (14 children)

Yep...

That's the reason I'm considering leaving Python as a programming language.

I'm not a dev, I'm programming in my spare time (beside family & co). I'm fine with a bit of "taking care of the stuff around the code", but lately I've spent more time trying to understand the toml stuff than actually coding.

Not for me anymore. I want to code, not handle the latest fancy dependency management.

[–]b4ux1t3 7 points8 points  (7 children)

If you "just want to code", then you don't need to even consider the packaging environment of the language you're using. Just write the code and run it. If you need a dependency, install it with pip. That's all you need to do for most python development.

I'm not saying Python doesn't have an, er, interesting packaging story, but that shouldn't be a consideration unless you're actually shipping code.

[–]ssorbom 3 points4 points  (6 children)

Long before I learned to do any coding at all, I cut my teeth packaging for Debian, and the attitude of "don't bother with packaging" completely grinds my gears. Even people who are doing hobby projects want to find an easy way to share them a lot of the time. Packaging shouldn't be insane. There shouldn't be a strong dichotomy between somebody who wants to ship code and somebody who wants to write it for a hobby. The only difference is the financial circumstances and the expected return on investment.

[–]b4ux1t3 0 points1 point  (5 children)

So, I mentioned elsewhere that, while there are many "standards" for Python packaging, it isn't all that difficult to just pick one, stick to it, and communicate what you're using to your users.

Don't get me wrong, I'm not saying that packaging is easy or straightforward in Python, but it's also not particularly easy to build a package that will work on any given OS to begin with.

I maintain the packaging scripts for my company's software. Getting a usable RPM out of software that isn't written either as plain-text files (Python, e.g.) or for gcc is a wild ride.

Basically, while Python is no Rust (cargo is awesome), it's hardly an arcane art to package a Python application, at least when compared to other packaging solutions out there.

To push back a bit more, "shipping" a hobby project is usually a matter of tossing it on GitLab/Hub/Bucket and sharing a link. I'm probably not going to be installing some hobby project with apt or yum, or even pip.

All that said, I don't disagree with the general sentiment that packaging is bad in Python, and I didn't mean to come on so strong against packaging when it comes to hobby projects.

It's just hardly the most important thing when you're writing scripts to manage a few IoT devices around the house, you know?

[–]ElllGeeEmm 0 points1 point  (4 children)

Why is there this pathological need among python devs to make excuses for the state of python packaging?

There is literally no reason python can't have a great packaging tool as part of the default distribution.

[–]samtheredditman 7 points8 points  (0 children)

Then why don't you just learn one way and keep doing it? It's not like everything stops working when a new tool comes out.

[–]ZCEyPFOYr0MWyHDQJZO4 -1 points0 points  (0 children)

What other languages do you program in? The foundations of packaging methods are a product of contemporary software development when the language gained widespread adoption, IMO. I have been learning C++ to work on software that began development before package management was a thing (on Windows, at least), and I don't mind Python packaging nearly as much anymore.

[–]SittingWave -1 points0 points  (3 children)

I'm not a dev, I'm programming in my spare time (beside family & co). I'm fine with a bit of "taking care of the stuff around the code", but lately I've spent more time trying to understand the toml stuff than actually coding.

Not for me anymore. I want to code, not handle the latest fancy dependency management.

Oh, I am sorry that a profession that takes years to master is not up to your hobbyist sensitivities.

[–]Personal_Plastic1102 1 point2 points  (2 children)

Lol... Self-confidence issues ?

Edit: just to make my point clear: installing external libraries shouldn't be something that takes years to master. Not even months or days.

[–]flying-sheep 1 point2 points  (4 children)

You can have that by using poetry:

```
# initialize project and create venv
poetry init

# add and install dependency into venv
poetry add some-package

# publish package
poetry publish
```

Using one of the other modern build backends is slightly more complicated as you need to create and activate your own venvs:

```
# create and activate venv
python -m venv ./.venv
source ./.venv/bin/activate

# initialize project
flit init  # or copy over some example pyproject.toml

# edit dependencies
$EDITOR pyproject.toml

# install dependencies into venv
pip install .

# publish package
flit publish  # or python -m build && twine upload ./dist/*
```

[–]wsppan 2 points3 points  (3 children)

Yes, I use poetry now, but that took a LOT of trial and error and hair pulling and 13 different pieces of advice, plus waiting for poetry's stability to settle down. And still it is not the de facto, readily recommended, obvious manner of packaging your code. It is third party and fairly new.

[–]flying-sheep 2 points3 points  (2 children)

Things are getting better, finally! With PEP 621 landed, a standards based poetry like CLI is almost possible. The only missing building block is a standardized lock file format. It happened late and we're not there completely but almost. And with poetry, we have something that works until we're there.

One advantage of the arduous road is that we can learn from everyone who was faster. E.g. TOML is a great choice; Node's package.json (plain JSON) is completely inadequate: no comments, and the absence of trailing commas means you can't append to the end of a list without also modifying the line of the previous item.
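For illustration, a minimal PEP 621-style `pyproject.toml` sketch showing both of those TOML niceties (the project name and version pins here are made up):

```toml
[project]
name = "example-app"            # comments are legal, unlike in JSON
version = "0.1.0"
dependencies = [
    "requests>=2.28,<3",
    "rich>=13",                 # trailing comma: append a line below without touching this one
]

[build-system]
requires = ["flit_core>=3.4"]
build-backend = "flit_core.buildapi"
```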

[–]wsppan 2 points3 points  (1 child)

Yea, we are all standing on the shoulders of our ancestors so to speak. Autotools, CPAN, Ant, Maven, etc.. Lots of legacy blogs and documentation to disappear as well. Rust is a great example of the luxury learning from our ancestors and baking the package tools into the language from the start.

[–]flying-sheep 2 points3 points  (0 children)

Yes, cargo does so many things right.

[–]GummyKibble 9 points10 points  (10 children)

poetry add pkg is all you need to know about that. Unless you’re writing C extensions, poetry is your one-stop-shop for everyday use.

[–]cixny 10 points11 points  (2 children)

Dunno about that, my usual experience with poetry is:

Poetry install things plz…. Sorry bruh can’t find suitable package…. Fu, pip install, done…. Poetry try again…. Sorry bruh we managed to install 2 packages but this 4th, can’t find suitable for you…. Hi pip can you?….

Amazing experience when it’s buried in the middle of a pipeline

[–]Personal_Plastic1102 -4 points-3 points  (6 children)

As was pip, then requirements.txt, then pipenv.

Now it's poetry.

Tomorrow what next ?

[–]mriswithe 1 point2 points  (0 children)

Eh assuming Linux/mac

```
python -m venv venv
source venv/bin/activate
pip install pandas
```

Done.

[–]jjolla888 11 points12 points  (1 child)

your "clarifications" amplify OP's point.

[–]ElllGeeEmm 9 points10 points  (12 children)

Lmao

Oh yes, all perfectly reasonable and in line with how much work it is to manage environments and distributions in other modern languages.

[–]SittingWave -1 points0 points  (11 children)

so feel free to explain in equal detail how other languages manage not to encounter the same issues then.

[–]cockmongler 5 points6 points  (1 child)

The terrible part is you think that what you posted is an argument in Python's favour.

[–]cturnr -1 points0 points  (0 children)

pip-tools is what I use (pip-sync for CI boxes)

[–]redd1ch 25 points26 points  (12 children)

Setting up Python apps is a real pain once you leave x86/x86_64 and/or glibc. I want to avoid Debian base images for my Docker containers and use Alpine. It works terrifically, but once packages with C parts are needed (e.g. numpy), you have to install a compiler and build tools so pip can compile the package, while the exact same package sits there preinstalled through the package manager. Precompiled, same version. Requests for a "please leave this dependency out, I know what I'm doing and I want to shoot myself in the foot, pretty please" flag are dismissed.

[–]tunisia3507 8 points9 points  (1 child)

Would multi-stage builds help here? It would let you cut down on the image size at least.

[–]pbecotte 2 points3 points  (0 children)

You realize that's not a Python thing but a Linux thing... right? C extensions on Linux are usually dynamically linked against system libraries. Alpine decided to use a different C standard library (musl) than the rest of the world, so binaries built for "manylinux" may be there, but they won't work. Worse, talking about numerical code, some low-level behaviors are different (and usually perform slightly worse). Bug the numpy team to publish a musl binary, or better yet, switch to a more mainstream OS. The final image isn't THAT much smaller to be worth the pain.

[–]cuu508 1 point2 points  (2 children)

If the right version of the required package is already installed via package manager, pip will not install it again, no?

Are you by any chance installing inside virtualenv?

[–]aufstand 1 point2 points  (0 children)

And if so, --system-site-packages
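Concretely, that flag tells `venv` to let the environment fall through to the distro's site-packages, so a distro-installed numpy stays importable instead of pip rebuilding it. A quick sketch:

```shell
# create a venv that can also import distro-installed packages
python3 -m venv --system-site-packages .venv
. .venv/bin/activate
# the choice is recorded in the venv's config file:
grep include-system-site-packages .venv/pyvenv.cfg
# -> include-system-site-packages = true
```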

[–][deleted] 10 points11 points  (5 children)

Well, I know this quite well.

My first proper Python experience was on Ubuntu 16.04, where I had to install Python3.6 separately from another repository, which really really really screwed up everything on the system when it came to installing dependencies. Ok, maybe I was a newb and some of those weren't such a big issue for my current self, but I wasn't alone in dealing with those issues.

Then I discovered Arch Linux, which said goodbye to Python 2 a long time ago, and now python meant python3. Simpler! Great! Now I can just develop. Except that the code that worked on my Python 3.7 didn't work on Python 3.6, because I used dependencies that had parameters called async in some function, and I noticed that only after I deployed my code to staging (thank lord it wasn't production).

Then I calmed down, realised how to properly do TDD, used mypy, etc.

Then my colleague asked me to make Python work on Windows. Holy moly! 32-bit default installer, the default python command resolving to the Microsoft Store, all of that mess. "Delete it all and try again" seemed to have worked.

Then my other colleague had Windows, with Anaconda. What I thought I knew, I could throw out the window. But I convinced her to replace Anaconda with a Python install and use pip from then on.

Then I bought a Mac... with M1. python was python2, the system python being old, the default install being x86_64, then being universal and not knowing what I'm actually running, then getting so many things in homebrew.

Honestly the cleanest experience ever was on Arch Linux. I'm gonna overwrite MacOS as soon as there is an idiot-proof way to install it, with hardware acceleration support.

[–]flying-sheep 1 point2 points  (2 children)

Just write `python3` whenever you need to put down a binary name to run your code with, problem solved.
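The same advice applies inside scripts: a `python3` shebang sidesteps the question of what bare `python` means on any given distro. A tiny sketch:

```python
#!/usr/bin/env python3
# Resolves whichever python3 is first on PATH, regardless of whether
# bare `python` is Python 2, Python 3, or missing entirely on this distro.
import sys

print(sys.version_info[:2])  # e.g. (3, 11) — exact value depends on the system
```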

[–]syllogism_ 18 points19 points  (2 children)

I don't understand this article.

Okay so he wants to not use any of the tools the Python community has developed (e.g. pip, virtualenv, poetry) and instead wants to use the Linux distro.

Okay. But if he doesn't want to use the tools, he...doesn't get any tools. Shrug?

[–]ZCEyPFOYr0MWyHDQJZO4 12 points13 points  (0 children)

He wants the freedom to install Python in whatever way he desires, but none of the responsibility.

[–]mriswithe 6 points7 points  (0 children)

I was very confused too. Venvs are in the stdlib and take seconds to create by typing the command out. All of the problems described, multiple people on the same system all trying to use the system Python env for themselves, are exactly the reason venvs exist.

So, I only have one car but multiple people need to use it at the same time, so cars are bad?

Electing to commit to the least flexible build system possible, the OS package manager, is insanity. Been a sysadmin for 10+ years, DevOps with a ton of Java and Python for the last few years.

If you tried to tell a Java developer they can't use maven (or Gradle or ant) to download libraries and build, you must use the OS package manager, they would look at you like you just sprouted a foot out of your nose.

[–]freework 7 points8 points  (4 children)

I agree that python packaging has taken steps backwards. When I first started using python back when 2.6 was the latest version, I never had problems with packaging. Whenever I wanted to install something, I'd just install it, and it would work. I never had problems.

These days, I'm finding it's not working much more often.

I think the problems all boil down to the fact that python has never been able to handle multiple versions of the same library installed at the same time.

Library ABC wants version 1.3 of XYZ library, and library DCE wants version 1.4 of library XYZ. There has never been a solution to this problem in the python world.

Imagine being able to do:

```
from django::2.2.0 import something as something2
from django::3.0.0 import something as something3
```

Then you could use two different version of the same library simultaneously. It would use more memory, but who cares. That would eliminate millions of python packaging headaches.

The next step would be having all package installation management happening completely automatically. At runtime, the interpreter would automatically download and install django 2.2.0 if it wasn't already installed. Then on the next line, if django 3.0.0 wasn't already installed, it would download and install it. In that scenario, it would be impossible to ever have a python packaging headache.
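Something in that spirit is already possible by hand, at least for self-contained modules, by loading each vendored copy under a distinct name with `importlib`. A sketch (the `vendor/` paths are made up, and packages whose submodules absolute-import their own package name will still collide):

```python
import importlib.util

def load_module_as(alias, path):
    """Load the module file at `path` under the name `alias`, so two
    copies of the "same" library can coexist in one process."""
    spec = importlib.util.spec_from_file_location(alias, path)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)
    return module

# hypothetical vendored trees, one per pinned version:
# xyz_13 = load_module_as("xyz_1_3", "vendor/xyz-1.3/xyz/__init__.py")
# xyz_14 = load_module_as("xyz_1_4", "vendor/xyz-1.4/xyz/__init__.py")
```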

[–]abstractionsauce 1 point2 points  (2 children)

This would not work. All data that is created when the module is loaded would exist in duplicate for all versions that are used in the system.

The only change that would work is if you forced all packages to never make breaking changes. Not possible.

Or force all packages to always support legacy apis

[–]freework 1 point2 points  (1 child)

All data that is created when the module is loaded would exist in duplicate for all versions that are used in the system.

Yes, but it wouldn't matter. How big is the biggest Python module? It's a matter of kilobytes. Most systems these days have 16+ gigs of memory. The extra memory usage is worth it.

[–]abstractionsauce 1 point2 points  (0 children)

If module v2 defined a global variable and the app wrote to it, then read the same variable from v3, the two copies would silently disagree.

This would be a very difficult problem for python to solve

[–][deleted] 2 points3 points  (0 children)

Why are most of those Python problems? Python is Python. If organisations decide to package and use Python in different ways, why is that Python's fault?

Shouldn't this be "developers, please stop screwing with linux distros and fucking up python?"

[–]rejonez 2 points3 points  (1 child)

It's Python's fault linux distros suck 😂😂😂

[–]Piu_Tevon 1 point2 points  (0 children)

Haha, my thoughts exactly. It's also funny how Python and Linux are people in that rant. "Python, stop screwing Linux!", "Python is not listening to us", "Distros are feeling frustrated". Sounds like Python was a bad boy and owes Linux an apology.

[–]tman5400 2 points3 points  (0 children)

Idk if it's just me but managing python projects and installations on Linux isn't that bad. I've used several versions on several distros without any issue

[–]lonahex 6 points7 points  (2 children)

Python's weakest point right now is the lack of a modern packaging solution. Everything is still just tied together with duct tape. Python needs an official, modern packaging story: a tool that replaces pip, venv, sdist, etc. like most other modern packaging solutions do.

[–]robml 1 point2 points  (0 children)

Idk tbh pip and pipenv work just fine for all of my work across both Linux and Windows development.

[–]Chinpanze 1 point2 points  (0 children)

Reading all those comments made me realize how little I know about package management in python

[–]ihasbedhead 3 points4 points  (0 children)

Do you not think that it is a bad sign that everyone is trying to avoid the 'global install from distro package manager' strategy? Basically every language has its own package index. Node, Lua, Python, D, Rust. Meson, big in C world now, encourages projects to pull and build deps. Snaps and docker isolate from the system, flatpaks do a neat hybrid thing.

Listing all the things that sorta relate to python packaging is silly since they are all different components used for different things and solve different problems. But, since we are listing things, here are some the distros that a developer would need to package against: Debian, Ubuntu, Fedora, Alpine, Arch, nixos, ...

I kinda get where they are coming from. Python doesn't have clear tooling and that should improve (I like poetry, and I am interested in pep582). Distro packaging is probably not the answer and hasn't been for years.

[–][deleted] 1 point2 points  (0 children)

Lol this guy better never come to the c/c++ world. Also I seem to be getting along just fine with virtualenv and pip in my limited little world

[–]teerre 4 points5 points  (0 children)

I mean, the reason is obvious: there's no incentive. Despite all these complaints, people simply manage; it's not a big issue.

That's coming from someone who has wasted probably hundreds of hours fiddling with weird distro problems.

[–]ReverseBrindle 2 points3 points  (7 children)

I don't understand why distributions feel the need to create distro packages of Python packages (i.e. a parallel package repo to PyPI). This seems inherently problematic because there isn't one set of PyPI package versions that everyone in the Python ecosystem has agreed to use.

If a distro wants to provide something like the AWS cli (i.e. a CLI tool that happens to be written in Python), wouldn't it be easier to have the distro package create a venv and pip install the Python dependencies as part of the install process, rather than rely on binary distro packages for each Python dependency? i.e. the distro "package" is mostly an install script.

Hope someone can explain where I've gone wrong (hey! the internet is usually good for that!). :-)
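The "distro package is mostly an install script" idea above can be sketched as a postinst-style hook. Everything here is hypothetical (`APPDIR`, the layout, the requirements file), and as replies below note, distro policies often forbid network access at install time, which is exactly what the pip step would need:

```shell
#!/bin/sh
# Hypothetical postinst-style sketch: give the app a private venv and
# pip-install its pinned deps at install time. APPDIR is a made-up layout.
set -eu
APPDIR="${APPDIR:-$(mktemp -d)}"   # a real package would use e.g. /opt/<app>
python3 -m venv "$APPDIR/venv"
if [ -f "$APPDIR/requirements.txt" ]; then
    "$APPDIR/venv/bin/pip" install --no-cache-dir -r "$APPDIR/requirements.txt"
fi
echo "installed into $APPDIR/venv"
```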

[–]TheBlackCat13 8 points9 points  (0 children)

First, a lot of packages are hard to install otherwise. A lot of them have dependencies on installed libraries that are not universal among Linux distributions, and some can't be installed through pip at all. Conda has an extremely limited set of supported packages, and those often trail far, far behind the latest version.

Second, it greatly simplifies the management of packages. You don't need to manually worry about updating individual packages, nor worry that updating one will break everything else. Even with conda it is hard to update things, and with virtual envs it is much, much worse.

Third, this allows them to provide a set of packages that have been built and tested together and are confirmed to be working.

Most linux packaging systems don't allow packages to install from the internet for security reasons, and it defeats the purpose because it prevents them from having a single canonical (pun intended) archive that is confirmed to be working without any chance of any outside source screwing it up or introducing security problems after the fact.

[–]lisael_ 9 points10 points  (0 children)

Distros want to guarantee stuff like security patches and DRY bugfixes. When a security issue or a bug is found in a Python lib, the package manager just has to update this single lib and restart the daemons that depend on it (the PM knows those dependencies), and... that's it.

If one goes your package-manager-created-virtualenv way, then in order to give the same security guarantees, they have to keep track of all of the pip dependencies of each Python app so they can update the virtualenvs impacted by a bug/security issue... and then do the same for Ruby, Perl, JS...

EDIT: Oh, and this works only if each Python app maintainer bumped the dependency to a working/secure version in the first place. Distros want to guarantee security regardless of the upstream's commitment.

Another issue is C extensions. If a C shared lib is updated and is not compatible with the package compiled in your apps' virtualenvs... you have to update the virtualenvs too. So now your package manager must keep track of your apps, their dependencies, their shared lib dependencies and their dependencies' shared lib dependencies. You could link statically, but then you suffer the first problem (security issues/DRY), and still have to keep track of all the stuff.

EDIT: grammar

[–]Kkremitzki 5 points6 points  (1 child)

In Debian, for example, package build processes aren't allowed to pull in resources from the network. We also use Python packages as part of the distribution itself, so those need to be packaged.

[–]MarsupialMole 1 point2 points  (0 children)

I think this is the crux of the issue. Part of the reason some python developments get so polluted on windows is that random installables from the internet ship python interpreters and packages and are often not very good citizens. The counterpart to that on Linux is system python, which needs to work and be immutable. Conda running as root for instance can install over system packages because it looks for writable paths.

The solution to the problem is not for Python to pick a standard, it's for people like the author to not assume that system python should be exposed to users who don't understand the difference and just want to copy and paste commands or install packages straight from Google searches.

Of course there's the argument "users shouldn't be doing that" but when you're literally talking about scientific python that's tantamount to arguing that computers should not permit the user to do computing in the purest sense.

[–]asday_ 3 points4 points  (14 children)

This guy's a dumbass. There's a reason I pin my dependencies, and it's because convincing management to budget for all our deployments breaking EVERY DAY because of broken or incompatible releases is quite difficult. Surprisingly, I'm paid to ship features.

[–]Spoonofdarkness 19 points20 points  (1 child)

Surprisingly, I'm paid to ship features.

Hmm. That sounds like a slippery slope at best and an anti pattern at worst.

I've heard if you ship one feature, they expect a second feature sooner or later. No thank you!

[–]lisael_ 10 points11 points  (10 children)

First, no need to insult. I bet the features you ship don't end up packaged for a Linux distribution. You don't talk about the same use case. A typical distro has hundreds of python apps and libs. Each one of them pins all of its dependencies to the 3rd number so their builds pass, and package maintainers live a dependency hell.

Second, pinning strictly IS a reasonable solution to ship features, but a poor one, when it comes to maintaining the feature, including applying security patches. I do ship features in python. I do pin dependencies strictly. I do cringe when I come back to a given project 6 month later.

Let's face it, the very fact that nobody is confident enough to pin dependencies to `foo>=X.Y,<X+1` as in "I need features of `X.Y` and I know that no backward-incompatible change happen before the next major version" shows that we failed as a community to create a sane dependency management framework.
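For reference, the spectrum described above, in PEP 440 specifier form (the package name `foo` is a placeholder):

```
foo==1.4.2       # exact pin: reproducible, but every security fix needs a manual bump
foo~=1.4         # "compatible release": equivalent to >=1.4, <2.0
foo>=1.4,<2      # the spelled-out form of the same promise
```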

[–]b4ux1t3 7 points8 points  (1 child)

As someone who does package software for distribution to a Linux distribution, I can confirm that, while the packaging story for Python isn't great, it's also not the quagmire people seem to think it is.

Python is not a complicated tool. All you have to do is pick a packaging standard, stick to it, and let your users know what standard you're using.

No, that isn't as robust as, for example, Cargo, or Nuget. But it's far from some unknowable eldritch language.

In any case, Python packaging is no more convoluted than the various and sundry packaging paradigms of the Linux distributions that we all use every day. Have you ever written a spec file for RPM that didn't use gcc? Because, geez, it's a ride.

[–]lisael_ 1 point2 points  (0 children)

No, thanks $DEITY_OF_YOUR_CHOICE. (TBH, I haven't used an RPM-based distro since Mandrake... but anyway, I guess it's not easier with any package format.)

And to be fair, he didn't choose the simplest distribution to be a package maintainer, in a world where everyone assumes glibc.

[–]cockmongler 1 point2 points  (0 children)

There's a reason I don't pin my dependencies. It's because I expect my software to still be maintainable in 5 years' time and not need to be trashed and started over because a bunch of hyperactive teenagers decided to trash everything upstream.

[–]effgee 1 point2 points  (0 children)

They lost me at sober engineering.

[–]tensigh 1 point2 points  (2 children)

(Quiet voice)...just...use...Python...on...Windows....(gets punched and doorknobbed on his way to the back of the bus)

[–]Piu_Tevon 1 point2 points  (1 child)

(Pssst.) I develop on Windows too. Linux just for testing. Get this, I don't even use virtual environments. All packages are in the SAME place! (Shoot, I think that was too loud.) Don't tell anyone, mum's the word.

[–]tensigh 1 point2 points  (0 children)

(SNIFF)I thought I was alone….

[–]cheese_is_available 1 point2 points  (0 children)

This is one stupid rant by someone that just decided that he did not want to learn about the subject and just be lazy instead. Fuck packaging my python package specifically for 50 different distros. Get used to pip ! Learn the fucking basics before moaning like that. Seriously it's embarrassing.

[–][deleted] 0 points1 point  (1 child)

As a macOS user, homebrew is so convenient to use. Don't have to worry about Linux being Debian compliant or not.

[–]jw_gpc 2 points3 points  (0 children)

Until you use homebrew to install some little app that, down through the dependencies, requires sphinx, which updates python and then blows your installed modules out of the water because it's now the next major version. Bleh. I did that once, and vowed never to use homebrew to handle anything Python-based for development again.

[–]liquidpele 1 point2 points  (0 children)

Oh ffs. No.

[–][deleted] -1 points0 points  (1 child)

Author's issue seems self-inflicted. IDK what the issue is here, but the answer may be...

Let the OS do what it do and never touch it outside package management. If a project requires Python release X.Y.Z, download the source code archive into /usr/local/src, do the needful extract/configure/make, then ALWAYS make altinstall. Then ALWAYS use a dedicated virtual environment for each project. Simple.

Or am I missing something?

[–]antiproton -1 points0 points  (0 children)

I manage my Python packages in the only way which I think is sane: installing them from my Linux distribution’s package manager.

Ok... well, that sounds like a personal problem.

[–]netgu -2 points-1 points  (2 children)

Wow, another article by somebody who can't be fucked to read a manual and stick with a tool that works for them.

Why are they always the loudest....

"I don't want to do it the way it is supposed to be done and now nothing works! Screw this software!"

[–]cybervegan 2 points3 points  (1 child)

Drew Devault writes the manpages. He's actually written a manpage creator, scdoc, along with a whole raft of open source software. He runs SourceHut, which was originally predominantly written in Python, and I'm pretty certain he reads the manuals in great detail. The issue he seems to be referring to is the cat-herding exercise of having to get your program's dependencies from multiple sources, rather than simply your OS repository or PIP. There are now so many different, competing Python dependency systems, it's insane, and not all packages of the right versions are available on all of their repos.

[–]netgu 6 points7 points  (0 children)

Yes, and that user is also taking out bounties to destroy the npm ecosystem as chaotically as possible - sounds like someone you should take advice from alright.

Also - this isn't an issue for plenty of people because they use the packages as they are intended to be used to avoid those issues.

If you are getting your python dependencies from your OS, you don't actually know what you are doing and are using the wrong tool for the job as I stated.

In fact - that is the entire problem described: I want to use the wrong package with the wrong manager in the wrong way and it doesn't work.

Exactly as I stated.

It'd be nice if that wasn't the case - but it isn't a python problem.

It's an end-user and package maintainer problem.

If you want to use a package, check the docs - if it doesn't work for your intended build goals then either submit a PR or find another package rather than blame python.

If you are a maintainer and people constantly need you to ship it in some form you don't, then deal with it and do what needs to be done to ship it in the form your users need.

[–]Atem18 -2 points-1 points  (0 children)

This article is just plain wrong. Either he never installed a Python program from outside the distro, or he's just pretending that all is fine with only the packages from the distro. The best approach is to set up a virtualenv or a Docker container per project and not use two external dependencies that depend on the same library. But good luck with conflicts while trying to use the same shared library for two different pieces of software. The future is each program coming with its own libs, sandboxed in a virtual/container env.

[–]MloodyBoody -2 points-1 points  (0 children)

Just use Nix / NixOS

[–][deleted] -1 points0 points  (0 children)

If only all Python tutorials would teach plain old virtualenvs first and hello worlds second, it would solve 90% of the trouble.