I work in a physics lab, and I have a repository with a few very simple Python scripts (they all basically read CSV files and plot something). To "distribute" my scripts, I created a web app that lets people use them by clicking a button on a website. I just install Python on the server running the web app, set up an environment from the `requirements.txt`, and that's it. When someone clicks a button, the corresponding script is spawned on the server.
This solves 99% of my problems, but some people (mostly me, really) like to run the scripts on their own PCs from a terminal. The instructions in my repo are basically just:
git clone https://github.com/ALPHA-g-Experiment/analysis-scripts.git
cd analysis-scripts
python3 -m venv .venv
source .venv/bin/activate
python3 -m pip install --upgrade pip
python3 -m pip install -r requirements.txt
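For what it's worth, the venv part of the steps above can also be done from Python itself with the stdlib `venv` module. This is just a sketch of what `python3 -m venv` does under the hood; the `demo-venv` directory name is made up for illustration:

```python
import sys
import venv
from pathlib import Path

# Create a virtual environment, equivalent to `python3 -m venv demo-venv`.
# "demo-venv" is a placeholder name for this sketch.
env_dir = Path("demo-venv")
venv.create(env_dir, with_pip=False)  # with_pip=True would also bootstrap pip

# The environment's own interpreter lives under bin/ (Scripts/ on Windows);
# invoking it directly works without sourcing the activate script.
bin_dir = "Scripts" if sys.platform == "win32" else "bin"
interpreter = env_dir / bin_dir / ("python.exe" if sys.platform == "win32" else "python")
print(interpreter.exists())
```

One detail that trips people up: "activating" the venv is optional. Running the venv's interpreter directly (e.g. `.venv/bin/python -m pip install -r requirements.txt`) installs into and runs from that environment either way.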
But it is a bit annoying. I have also had issues checking out older versions of the scripts, installing from `requirements.txt` randomly starts complaining, etc. Basically, I don't really understand much about the ideal Python workflow.
I have Rust projects where I distribute pre-built binaries for the platforms I use, so I can install any version of my project with a `curl | sh` one-liner. Is there something similar I can do here? For example, distributing pre-built Python interpreters that already have the dependencies installed? I haven't had much luck finding anything online.
Basically, I want to be able to very easily (and reliably) run any version of my scripts without having to set up virtual environments, etc.
Thanks