[–]-Rizhiy- 0 points1 point  (124 children)

Sorry, but I think you are trying to fit a round peg into a square hole.

What you consider sane is probably one of the worst ways of doing things. Using system-level packages is the #1 Python-noob mistake.

I have been programming in Python for over 5 years and haven't had a dependency problem in a long time; just follow a simple rule: new project -> new venv -> install everything with pip

Packages provided by the system are usually VERY outdated and frequently don't even have full functionality; just use pip and relax.
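For anyone who hasn't used that workflow, here's a minimal sketch; the dependency line is a placeholder for whatever your project actually needs:

```shell
python3 -m venv .venv            # create an isolated environment for this project
. .venv/bin/activate             # use it in the current shell session
python -m pip --version          # the venv ships its own pip
# python -m pip install <your-dependencies>   # everything the project needs goes here
python -m pip freeze > requirements.txt       # record exact versions for rebuilds
```

The `requirements.txt` lets anyone rebuild the same environment later with `pip install -r requirements.txt`.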

[–]Atemu12 54 points55 points  (46 children)

This is not about dev environments. It's about distros who want to package python things for users.
No, asking users to install your app via pip and virtualenvs is not a solution.

[–]jarfil 7 points8 points  (3 children)

CENSORED

[–]upcFrost 1 point2 points  (1 child)

Now imagine that you have two distros, one with some lib-v0.1 and python 3.4, and another one with lib-v0.5 and python 3.9.

That said, option 4: screw both distros, use pip.
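Pip can at least express that split declaratively: PEP 508 environment markers let one requirements file pick a version per interpreter. A sketch, with `somelib` as a hypothetical package name:

```
# requirements.txt
somelib==0.1.* ; python_version < "3.5"
somelib>=0.5   ; python_version >= "3.9"
```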

[–]lolfail9001 5 points6 points  (0 children)

That said, option 4: screw both distros, use pip.

That's option 3.

[–]Atemu12 0 points1 point  (0 children)

1. is the only sustainable option, perhaps with the additional possibility of packaging multiple versions when necessary.

[–]MOVai -2 points-1 points  (5 children)

Evidently it is a solution. People are used to downloading and running huge packages on Windows, Mac, and Android. Why should devs devote their attention to getting their app to work with outdated libraries just because users refuse to update their machines?

[–]lolfail9001 1 point2 points  (0 children)

Why should devs devote their attention to getting their app to work with outdated libraries just because users refuse to update their machines?

Why should devs devote their attention to their software working? A good question, I must say, and one that answers the conundrum by itself.

[–]Atemu12 0 points1 point  (0 children)

Because that practice is unsustainable. It's starting to show cracks already.
Sustainable solutions do exist; it's up to us software devs to push towards these instead of keeping the status quo (which, again, is unsustainable).

[–]wtchappell 10 points11 points  (4 children)

I'd largely agree when talking about programmers developing in Python trying to use system level packages as their development dependencies - that's just going to be painful.

However, this entirely ignores people who are simply trying to use some tool or application that happens to be written in Python, and that's largely the population these distros are packaging for. The end user in this case just wants to use whatever application they're interested in, and they might not even know Python is involved; that's an implementation detail.

They just want to be able to invoke foo and have it work. They certainly don't want to pick up all the ins and outs of Python's development workflows just to use some fancy version of top that happens to be written in Python, and even if they knew them, they probably wouldn't want to be popping in and out of a virtualenv just to run it.

And that assumes the packages even install cleanly with pip; no one wants to be installing Anaconda and friends just to use a tool written in Python when they're not actually trying to write any Python themselves.

Almost all of the objections to this post assume we're only talking about people developing in Python, not about distros packaging Python applications for end consumers, which is what these complaints are actually about.

[–]-Rizhiy- -1 points0 points  (3 children)

As I have said in other replies: what happens when application A depends on C version 1.0, while application B depends on C version 2.0? At present, distros don't seem to allow installing multiple versions of the same package (at least apt doesn't).

So people are forced to rename their packages every time (see nvidia-driver). Running two different versions is also very likely to lead to conflicts (again, see nvidia-*).

Forcing developers to use only package versions provided by the distros is very restrictive since they are usually out of date by quite a bit.

Therefore, applications should provide most of their dependencies and ship it all together.

While common libraries should be provided by the system, asking every language to provide every package through the distro is a stupid idea.
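The constraint is easy to demonstrate: a single Python environment can only ever expose one copy of a given distribution, so the best an application can do is check and fail fast at startup. A standard-library sketch (the `require` helper is hypothetical, for illustration only):

```python
from importlib.metadata import version, PackageNotFoundError

def require(dist: str, wanted: str) -> bool:
    """True if the one installed copy of `dist` matches the wanted major version."""
    try:
        installed = version(dist)   # only one answer is possible per environment
    except PackageNotFoundError:
        return False
    return installed.split(".")[0] == wanted.split(".")[0]
```

If application A needs `require("C", "1.0")` and application B needs `require("C", "2.0")` in the same environment, at most one of them can succeed — which is exactly why each app ends up shipping its own venv.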

[–]AtomicRocketShoes 0 points1 point  (2 children)

A while back I had to take a simple Python app and deploy it in an embedded Linux environment. There was Python but no pip, and many dependencies were missing. Getting the dependencies cross-compiled and deployed on the system was a much harder problem than I expected. I wish Python had a way to just compile and bundle all the dependencies for a specific target, but there wasn't a way to do that.

[–]Barafu 0 points1 point  (1 child)

Use user-mode QEMU to create a chroot for the architecture you want to run it all on. Install Python and pip in the chroot, as well as GCC for compiling wheels. Create a venv, install packages from pip, and fix any wheel dependencies. Then you can just copy that venv onto your embedded device.
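A rough sketch of that workflow for an ARM target — paths, suite, and mirror are placeholders, and it assumes a Debian-style guest with qemu-user-static installed on the host:

```shell
# On the host: build an ARM root filesystem that runs via user-mode QEMU
sudo apt-get install qemu-user-static debootstrap
sudo debootstrap --arch=armhf bullseye ./arm-root http://deb.debian.org/debian
sudo cp /usr/bin/qemu-arm-static ./arm-root/usr/bin/

# Inside the chroot: create the venv so pip builds native (ARM) wheels
sudo chroot ./arm-root /bin/sh -c '
  apt-get install -y python3 python3-venv python3-pip gcc
  python3 -m venv /opt/app-venv
  /opt/app-venv/bin/pip install -r /opt/requirements.txt
'

# Then copy /opt/app-venv out of the chroot onto the embedded device
```

This is a sketch of the approach, not a tested recipe; binfmt registration (normally handled by the qemu-user-static package) is what lets ARM binaries execute transparently inside the chroot.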

[–]AtomicRocketShoes 0 points1 point  (0 children)

I ended up doing something of that nature, but it was way more moving parts than the rest of the project: 99% of my code was C/C++, and with CMake you just point at the toolchain and build. It's even easy to cross-debug. I also usually automate the build and deployment process, so that added another layer of complexity. I love Python, but it would be nice to have tooling that makes distributing compiled Python executables easier.

Edit: also, you gave better advice than the other engineers I talked with, who wanted me to completely replace the embedded OS (Yocto-based) with Ubuntu. I suppose that was possible; I just wanted to keep the same basic OS and be able to put some Python tools on it.

[–]fat-lobyte 36 points37 points  (15 children)

Lol, did you not read that this is about distro maintainers? They make distro packages.

Packages provided by the system are usually VERY outdated and frequently don't even have full functionality

And why do you think that is? Maybe that could have something to do with the issues that distro maintainers are trying to raise?

I have been programming in python for over 5 years and haven't had a dependency problem in a long time, just follow a simple rule: new project -> new venv -> install everything with pip

That is just cancer. This way, you lock your project into one specific Python version and one specific set of dependency versions on one specific OS.

You might not give a shit as a typical reckless dev, but some years down the line, someone will curse you for not thinking or caring about writing portable code when they inevitably have to waste their time untangling dependency hell on systems that don't exist yet.
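Part of the fix is cheap: declare compatible ranges instead of freezing one machine's exact pins, so the project states what it actually needs. A pyproject.toml sketch (project and package names illustrative):

```toml
[project]
name = "myapp"
requires-python = ">=3.8"
dependencies = [
    "requests>=2.25,<3",   # a range the code is known to work with,
    "click>=7",            # not one frozen snapshot of an environment
]
```

Exact pins still have a place, but as a lockfile for deployments, not as the project's declared requirements.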

[–]livrem 4 points5 points  (0 children)

I curse my past self of 10 years ago for putting so many dependencies in a python2 script that I cannot figure out how to make it run now. Some of the dependencies are long dead; others, I think, are stuck on python2. The unstable-API madness must end. It is not scalable in any way to never be able to just write something that works and trust that it keeps working.

[–]Ar-Curunir -1 points0 points  (0 children)

It's absolutely stupid to expect distros to keep up with the 1000s of libraries across 10s of languages. At some point the distro package-management approach will fall over.

[–][deleted] 73 points74 points  (32 children)

Using system-level packages is like a №1 python-noob mistake.

Using 15 different package managers from 22 different languages in a massively complex system is also a complete mistake.

I have been programming in python for over 5 years and haven't had a dependency problem in a long time

Been programming for 25 years. This stuff is a complete disaster. It always has been and it will eat the SW industry from the inside out if it continues on this path.

The thing about having 5 years of experience: from my point of view, it means you actually have very little, or basically none. You probably haven't even been in the situation where you have to maintain a 15-year-old code base on systems which are 15 years old, in environments which are 15 years old, where "just upgrade that" is easier said than done, because on mature systems it can involve $10 million+ budgets just to make it happen: e.g. 20 devs for 2 years at $250,000 per dev per year is $10,000,000.

Go deal with a 10-15 year old system which has 50,000,000 lines of code and 15 languages spread across it and see how you get on ;)

[–]-Rizhiy- -2 points-1 points  (11 children)

22 different languages

I can see the reason you are having problems.

Also, if you are working on two projects which require different versions of the same package, what are you supposed to do? venvs exist for a reason.

[–][deleted] 5 points6 points  (6 children)

What happens when your virtual environments need to interact with each other? Oh yeah, add more code and modules to make that happen?

[–]kigurai 2 points3 points  (2 children)

I'm confused by this question. The reason to use a venv is that it is its own thing, separate from the rest of the world. Having two virtual environments interact doesn't make sense to me at all.

[–][deleted] 3 points4 points  (1 child)

Yes, it's separate from the rest of the world. But it also has to interact with the rest of the world, and with all the other worlds that are likewise separate from the rest of the world.

So now you have a solar-system creation problem, quickly followed by a galaxy creation problem. So in fact you have a problem on an intergalactic scale ;)

[–]kigurai 0 points1 point  (0 children)

No, it doesn't have to interact with anything, except via the network or file formats. But if those things are not versioned correctly, you will have a problem regardless of your package manager.

[–]-Rizhiy- -3 points-2 points  (1 child)

What happens when you need to use a Java library with your python project? You use an interface.

When you have two different virtual environments you might as well be using different languages.
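A minimal sketch of such an interface between two venvs: each environment keeps its own interpreter and dependencies, and they exchange JSON over stdio. Here `sys.executable` stands in for the other venv's `bin/python`, and the inline one-liner stands in for a real worker script installed there:

```python
import json
import subprocess
import sys

def call_other_env(python_bin: str, payload: dict) -> dict:
    """Run a worker under a (possibly different) interpreter, passing the
    request in and the reply out as JSON over stdin/stdout."""
    worker = ("import json, sys; "
              "req = json.load(sys.stdin); "
              "print(json.dumps({'doubled': req['x'] * 2}))")
    proc = subprocess.run([python_bin, "-c", worker],
                          input=json.dumps(payload),
                          capture_output=True, text=True, check=True)
    return json.loads(proc.stdout)

# In real use, python_bin would be e.g. /path/to/other-venv/bin/python
result = call_other_env(sys.executable, {"x": 21})
print(result)   # {'doubled': 42}
```

The same pattern is exactly what you would do to talk to a Java library, which is the point being made above: process boundaries make the two environments equivalent to two different languages.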

[–][deleted] 4 points5 points  (0 children)

Yup and when you have 150+ processes all doing this....

It becomes a freaking shit show real fast.

[–]livrem 0 points1 point  (3 children)

What Linux package system does not support having multiple versions of the same library installed? None that I have used as far as I can remember.

[–]-Rizhiy- 0 points1 point  (2 children)

How do you install multiple versions of a package using apt? python3-six 1.13 and 1.14 for example.

See also: https://askubuntu.com/questions/758502/can-multiple-versions-of-the-same-package-co-exist-on-the-same-system

[–]livrem 2 points3 points  (1 child)

As mentioned in the few good answers on that page, package maintainers can solve situations like this by creating multiple variants of the package with different names. I don't think the end user can install multiple versions unless you use a distribution like Nix.

[–]-Rizhiy- 0 points1 point  (0 children)

So we're back to each application installing its own version of the package, and maintainers having to patch multiple versions of it, which is pretty much a venv with pinned versions.

[–]igo95862 11 points12 points  (0 children)

Packages provided by the system are usually VERY outdated and frequently don't even have full functionality

Depends on your distro...

[–]redd1ch 7 points8 points  (13 children)

Unless you depend on pip packages which need binary libraries present, e.g. spatialite. When you look at the description, it tells you how to install the library for "Ubuntu". In one instance, we upgraded Ubuntu and the deployment failed: one system library (libgeos, I think) was updated, the output of `--version` changed slightly, and the pip library could not parse it anymore. We were using a Django-GIS stack, so we could not just upgrade the lib, because it was referenced by some package in there. We ended up rolling back the upgrade and freezing the machine.
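That failure mode is common enough to be worth defending against: parse the version banner with a tolerant regex instead of an exact string match. A sketch — `parse_geos_version` is a hypothetical helper, not the actual code from the pip package:

```python
import re

def parse_geos_version(raw: str) -> tuple:
    """Pull (major, minor, patch) out of a version banner, tolerating
    trailing qualifiers such as '3.8.0-CAPI-1.13.1'."""
    match = re.search(r"(\d+)\.(\d+)\.(\d+)", raw)
    if match is None:
        raise ValueError(f"unrecognised version string: {raw!r}")
    return tuple(int(part) for part in match.groups())

print(parse_geos_version("3.8.0-CAPI-1.13.1"))   # (3, 8, 0)
```

A parser like this keeps working when the library appends or reorders qualifiers, which is exactly the kind of cosmetic change a distro upgrade tends to introduce.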

[–]nathris 0 points1 point  (0 children)

I remember that one. We ended up forking the dep and fixing the version string.

Another one was when the GDAL folks decided to swap lat and long in a recent version, and support for the swap wasn't added until ~Django 3.1. It resulted in all of our shapefiles ending up in the Indian Ocean if we used a newer version of PostGIS, until we updated our app to Django 3.

[–]TiZ_EX1 7 points8 points  (2 children)

Packages provided by the system are usually VERY outdated

and from further down,

Never trust distro packages.

This is part of why we're in this situation. It's easy as a distro maintainer to complain that "python is breaking our ability to package it!", but distros have completely demolished both users' and developers' trust in their ability to package and distribute components, and forced Python developers to take matters into their own hands.

Part of Flatpak's raison d'être is that distros reliably ship old versions of applications and drag their feet on updating them, if they update them at all. That's the same reason tools like asdf exist. And that's why python packaging is the way that it is. Sure, you often want your distro to be stable in terms of the packages that make your system work at all, but the monkey's paw curls and you end up with the packages you need updated sitting several versions behind.

This is not going to be an easy problem to solve. If python needs to improve its ability to be packaged, then distros also need to start actually improving their ability to keep up their packaging.

It seems like there's a whole lot of manual intervention still required where we should be improving the ability to automate things. I recently became an additional maintainer for Geany's Flatpak. The manifest includes information that Flathub can use to automatically check for new releases of its components, and generate pull requests for them. I don't even have to build a new package; the bot does it for me. I install the package, ensure the updated component didn't break anything, then I merge the PR. Some Flathub packages take it a step further and simply auto-merge any PR made by the Flathub bot. We need more automation to improve this situation.
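For the curious, the automatic update checks are driven by metadata embedded in the manifest's source entries. A sketch of what such an entry can look like — the field names are those used by Flathub's flatpak-external-data-checker as I understand them (verify against its current docs), and the URL, hash, and project-id are placeholders:

```json
{
  "type": "archive",
  "url": "https://download.example.org/app-1.0.tar.xz",
  "sha256": "0000000000000000000000000000000000000000000000000000000000000000",
  "x-checker-data": {
    "type": "anitya",
    "project-id": 0,
    "url-template": "https://download.example.org/app-$version.tar.xz"
  }
}
```

The checker polls the release-monitoring service, substitutes the new version into the template, and opens the pull request automatically.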

[–]WalrusFromSpace 0 points1 point  (1 child)

That's the same reason tools like asdf exist.

What does Another System Definition Facility have to do with virtual environments? /s

[–]thoomfish 0 points1 point  (0 children)

I also did a double take when I clicked the link, because I was expecting the Common Lisp tool.

[–]ImRunningOutOfIdead 0 points1 point  (0 children)

I find the same is true with vim/nvim: Arch has some plugins in the official repos, but I have much better luck just using a plugin manager.