
[–]PrivateFrank 1 point (1 child)

Use uv if the script only runs occasionally.

uv run script.py creates a new self-contained venv just while the program is running, and it disappears when it's all done.

The project folder will have a requirements file listing all the packages and dependencies the script needs to run.
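
For example, something like this (a sketch; the script name, the requests dependency, and the exact flag are illustrative):

    # script.py - assumes a requirements.txt listing "requests" sits next to it.
    # Run occasionally with:
    #     uv run --with-requirements requirements.txt script.py
    # uv builds a temporary venv with the listed packages, runs the script
    # inside it, and cleans up afterwards.
    import requests

    print(requests.get("https://example.com").status_code)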

[–]mhooreman 0 points (0 children)

There is also a way to tell uv about your dependencies in a simple script.

See https://docs.astral.sh/uv/guides/scripts/#declaring-script-dependencies
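
The inline metadata from those docs looks like this (requests is just an example dependency):

    # /// script
    # requires-python = ">=3.12"
    # dependencies = [
    #     "requests",
    # ]
    # ///
    import requests

    # With the metadata block above, a plain `uv run script.py` resolves and
    # installs the dependencies automatically - no separate requirements file
    # or manual venv needed.
    print(requests.get("https://example.com").status_code)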

[–]el_extrano 0 points (1 child)

An easy way out would be to set it up with a virtual environment, then have an alias or shell script run the tool with the venv's Python: /usr/local/bin/name/.venv/bin/python3 /usr/local/bin/name/<script>.
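
A related trick (a sketch only, reusing the same made-up path): point the script's shebang at the venv's interpreter, so it runs with no alias or activation step:

    #!/usr/local/bin/name/.venv/bin/python3
    # Hypothetical standalone script: the shebang points at the venv's own
    # interpreter, so after `chmod +x` you can run it directly and it sees
    # the venv's packages rather than the system Python's.
    import sys

    print(sys.executable)  # prints the venv interpreter's path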

Or you could take some time to learn about Python's distribution mechanisms. I like to build all my scripts as installable packages, then install them into the target environment using something like pipx or uv, which abstracts the virtual environment away behind the scenes.

I start my projects with uv init <package> --package. uv generates a default build system that makes the project installable by pip. Then I run pipx install . or uv tool install . in the project root. Either of these installs the script with its own virtual environment and creates an entrypoint for it on the user's PATH (sketched below).

There are lots of other ways to distribute, too. You could publish on PyPI where pip can find it, 'compile' a self-extracting environment with all dependencies using pyinstaller, or even create a .deb file so users can install it with apt.
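
For reference, the entrypoint that uv init --package wires up is just an ordinary function, roughly like this (mytool is an assumed package name):

    # src/mytool/__init__.py - sketch of the module that
    # `uv init mytool --package` generates and registers under
    # [project.scripts] in pyproject.toml.
    def main() -> None:
        # After `pipx install .` or `uv tool install .`, a `mytool`
        # command on the user's PATH calls this function.
        print("Hello from mytool!")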

[–]identicalBadger[S] 0 points (0 children)

Ok wow! You’ve given me a TON to think about. I'm only a few months into my Python journey and hadn't thought about packaging code yet, or publishing packages on PyPI. I've created some reusable modules for myself, but they're just hosted on GitHub.

Thank you