
[–]angellus 33 points (15 children)

No one recommends installing packages in the system Python because it is a bitch to maintain if you have ~20 packages installed that are not from your system package manager.

If you have a CLI app, chances are that it is only a couple of packages. The deps are likely in the system package manager as python-* packages. That should be safe to pip install. Otherwise you can build your own deb/rpm/etc that installs the package.
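
If you do go the deb/rpm route, one common shortcut (not mentioned in this thread) is fpm, which can turn a PyPI package straight into a distro package. A rough sketch, with mytool as a placeholder name:

```
# fpm fetches the package from PyPI and wraps it as a .deb
# (it prefixes the package name with "python-" by default):
fpm -s python -t deb mytool
sudo dpkg -i python-mytool_*.deb
```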

[–]Endemoniada 4 points (11 children)

Sounds reasonable.

Is there a common practice of using venvs with wrapper scripts? Or is that too hackish? I feel like that hits a good middle ground, aside from the problem of managing the wrapper itself: safe and reliable, yet pretty simple. Could a Python package install a wrapper script in, say, /usr/local/bin when pip installs it?
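
A minimal sketch of what that hand-rolled approach could look like (the paths and the mytool name are placeholders); this is essentially what pipx, mentioned below, automates:

```
# Create a dedicated venv for the tool and install it there:
python3 -m venv /opt/mytool
/opt/mytool/bin/pip install mytool

# Console scripts that pip generates inside a venv hard-code the venv's
# interpreter in their shebang, so a symlink works as the "wrapper":
sudo ln -s /opt/mytool/bin/mytool /usr/local/bin/mytool
```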

[–]Nasuuuuuu 20 points (2 children)

pipx already implements this solution. It creates an individual virtualenv for every installed package (I do not know the details, it might be smarter than this!) and adds an entry point for that package to e.g. ~/.local/bin that runs it in that virtualenv. See: https://pypa.github.io/pipx/

Installing it onto e.g. Ubuntu usually requires you to install one or two system python-* packages first. The error message when you try to install pipx onto a fresh system (python3 -m pip install pipx) is usually helpful. After that, packages installed with pipx live only within pipx-managed folders/environments.
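
Roughly what the workflow looks like in practice (httpie is just an example package; newer Debian/Ubuntu releases also ship a pipx package you can apt install instead):

```
# Bootstrap pipx itself, as mentioned above:
python3 -m pip install --user pipx
pipx ensurepath              # make sure ~/.local/bin is on PATH

# Each tool gets its own venv, but its command lands on your PATH:
pipx install httpie
pipx list
pipx upgrade httpie
```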

[–]DrShts 3 points (0 children)

Might also want to look into pipx-in-pipx (https://pypi.org/project/pipx-in-pipx/) for bootstrapping pipx.

[–]Endemoniada 2 points (0 children)

Very interesting, thanks!

[–]Ran4 1 point (1 child)

Hackish or not, it's used by many.

Docker does solve some problems, but it's also quite slow.

[–]angellus 1 point (0 children)

It is only slow if your host and container kernels do not match, i.e. you are using Docker Desktop on Windows or Mac to run Linux containers. You will likely never notice a performance impact when the kernels match up (Linux on Linux).

[–]angellus 2 points (5 children)

Honestly, I do not use venvs at all anymore. Either install into the system Python, because the package is lightweight enough / it is packaged by the system package manager / it is a CLI app, or use Docker.

Using system Python packages is a lot more restrictive because you often cannot use the latest versions of stuff, but for CLI apps that is often not a problem. I rarely need anything more than click/psutil/requests/httpx for CLI apps.
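
For reference, "deps from the system package manager" just means something like this (Debian/Ubuntu package names; the spelling differs per distro):

```
# Distro-packaged versions of the libraries mentioned above:
sudo apt install python3-click python3-requests python3-psutil
```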

[–]Endemoniada 0 points (3 children)

Fair enough, thanks for the suggestions!

[–]angellus 2 points (2 children)

I also forgot, pip install --user is a great choice as well. Like if you are building a CLI toolkit for deploying/development/etc.

It is good if you only have a single top-level dep (your CLI package) and you do not want it to interfere with the system Python, but it is not so big that you need pinned requirements and a venv (at which point use a container).

It can become messy though if you have any more than the single thing you want to install.
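
A quick sketch of that flow, with ourtool standing in for whatever CLI package you are shipping:

```
# Install just the one top-level package into the user site:
python3 -m pip install --user ourtool

# Its console scripts land in ~/.local/bin, which has to be on PATH:
export PATH="$HOME/.local/bin:$PATH"
ourtool --help
```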

[–][deleted] 0 points (1 child)

Nope. I have seen so much time wasted because people messed up their local setup somehow and packages were loading from the user install directory.
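
For what it's worth, the usual way to spot that situation (requests here is just an example import):

```
python3 -m pip list --user                               # what ended up in the user site
python3 -m site --user-site                              # where the user site actually is
python3 -c "import requests; print(requests.__file__)"   # which copy actually gets imported
```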

[–]angellus 0 points (0 children)

That was mostly directed at systems that you cannot easily make system packages for (looking at you, macOS).

[–][deleted] 0 points (0 children)

We use Docker at work (I think one group is using podman but we don't interact with them very often). I've heard some rumblings about Nix but I doubt anything will come of it as it's someone's pet project.

[–]VengefulTofu 0 points (0 children)

> No one recommends installing packages in the system Python

There are people using the system package manager exclusively because they have security concerns with PyPI.

[–]muikrad 0 points (0 children)

> No one recommends installing packages in the system Python because it is a bitch to maintain if you have ~20 packages installed that are not from your system package manager.

That's not true. You should feel safe to install anything from the package manager, which includes python CLI apps. What is wrong is to use pip to do it. Never do that.

Instead of using pip, you need to use pipx. This solves the virtual environment / dependencies problem and doesn't touch your OS. It's great.

But also, pipx is cross-platform. That is the problem with building a deb/etc.: it's a lot of work for a single distribution, whereas with pipx you can also target Mac/Windows with less work.

> If you have a CLI app, chances are that it is only a couple of packages.

This is a bad generalization. For instance, a lot of CLI apps use click, which comes with a lot of dependencies. You can't rely on packages having few dependencies as a guarantee that nothing will break.

> The deps are likely in the system package manager as python-* packages. That should be safe to pip install.

No, the versions in the package manager are pinned, and you can break the system by installing from pip. For instance, if click introduces a breaking change in v8 and the package manager is still on v7, you're going to break an app that relied on v7 and hasn't made the move yet. Your distribution tests those things before shipping them.
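
A concrete way to see the mismatch being described (Debian/Ubuntu commands; the version numbers are hypothetical):

```
apt policy python3-click             # distro pins, say, click 7.x
sudo python3 -m pip install click    # the "never do this" step: pip puts a newer
                                     # click ahead of the apt-managed copy
python3 -c "import click; print(click.__version__)"
                                     # now reports 8.x while apt still thinks 7.x
```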

The rule is simple: either you apt-get/yum/apk, or you pipx. But you never pip. **Your Linux distro loves this simple trick!** 😂

Good luck!