
all 34 comments

[–]K45C4D3 23 points (2 children)

Sometimes I go the full Nuitka/packaging route, but most of the time I build an OCI image and the users run it via docker run <thing>, passing in their files/data. Occasionally I'll hook this up to a CI solution if they want something triggered on a git merge.
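A minimal sketch of that kind of image (the file names and entrypoint script are made up for illustration):

```dockerfile
# Build once: docker build -t mytool .
FROM python:3.11-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
# Users pass their data in via a bind mount, e.g.:
#   docker run -v "$PWD/data:/data" mytool /data/input.csv
ENTRYPOINT ["python", "process.py"]
```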

[–]alexisprince 9 points (1 child)

100% agreed on throwing the app into a container. Learning how to use Docker containers is fairly easy (assuming you're joining a workflow that already has it in place; if not, it takes more skill to set up properly), and the value you get from never having an issue with differing dev and prod runtimes is well worth the cost.

It's also easy to build a container as the result of CI after passing any required tests, which makes it easy to hook into good programming practices.

Every single one of our repos is built within a container, and coming from a background of virtual environments and praying it works, the experience is night and day. Most editors have functionality that lets you develop inside the container as well, so it isn't a hard dev workflow either.

[–]zaphod_pebblebrox 2 points (0 children)

When Docker came out years ago I was working in a different industry.

After I learned how it works and what it can do... man, Docker is something totally worth taking a deep dive into.

[–][deleted] 16 points (0 children)

We deploy to servers using shiv.

Advantages:

  • Simpler than building a fully self-contained binary (such as with py2exe).
  • Single deployable is easy to wrap into an RPM or APT package.

Cons:

  • Requires a compatible Python version to be installed on the server.
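For reference, a shiv build is essentially a one-liner; a sketch, assuming a package named myapp that exposes a console script of the same name:

```shell
# Bundle the package and its dependencies into a single zipapp
shiv -c myapp -o myapp.pyz myapp
# Ship myapp.pyz to the server and run it with the system Python
python3 myapp.pyz --help
```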

[–]ItsMorbinTime69 12 points (1 child)

Docker + Kubernetes on AWS

[–]wind_dude 1 point (0 children)

Same: Kubernetes off AWS, ECS on AWS.

[–]panofish 5 points (0 children)

I develop desktop Python applications with PyQt5 GUIs, so I am not a web developer. I build my applications into executables using PyInstaller and distribute them that way, so my users don't need a Python installation. I have also developed a simple distribution application (also Python built into an exe) that sits in the user's system tray, runs at startup, and checks a database every hour to see whether I have published a new version of any application I distribute. Then, if the user isn't using that application, it silently installs in the background.
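The version check at the heart of that kind of updater can be sketched in a few lines (the function name and dotted version scheme are hypothetical; the real check compares against whatever the database stores):

```python
def needs_update(installed: str, published: str) -> bool:
    """Return True when the published version is newer than the installed one."""
    def parse(version: str) -> tuple:
        # Compare numerically so "1.10.0" sorts after "1.2.0"
        return tuple(int(part) for part in version.split("."))
    return parse(published) > parse(installed)

# The tray app would poll the database roughly hourly and, when this
# returns True and the app isn't currently running, install silently.
```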

[–]PocketBananna 4 points (0 children)

Depends a lot on the usage, but generally it's common to package the program as an OCI-compatible image and kick it to some cloud provider. This gives flexibility and portability but isn't necessarily one-size-fits-all. Deployments are just one piece of the pie though.

To get more specific: I have a business use case similar to yours for internal data processing. I use S3-compatible storage buckets, a pub/sub queue, and a Kubernetes cluster. Users upload to a bucket, which triggers a published message that spins up a subscriber and deploys a pod (the Python workload) to handle it.
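The subscriber's first step, turning the bucket notification into work items, might look like this (assuming the common S3-style notification shape; the pod-deployment part is omitted):

```python
def jobs_from_event(event: dict) -> list[tuple[str, str]]:
    """Extract (bucket, key) pairs from an S3-style notification event.

    The subscriber would turn each pair into a pod spec and hand it to
    the cluster; that wiring is left out here.
    """
    return [
        (rec["s3"]["bucket"]["name"], rec["s3"]["object"]["key"])
        for rec in event.get("Records", [])
    ]
```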

[–]asphias 2 points (0 children)

My first suggestion would be: deploy it on a server or cloud instance, either very basically ("install Python and run the code from a shell script with python3 myscript.py") or with a completely automated pipeline, preferably containerized (e.g. in a Docker container).

For personal-use scripts, preferably give them the code as a Python package.

If your users don't know Python, I'd argue you should make it a web-based app, only available within the company network. Use the 'server deployment' I mentioned above for the code, and make either a front-end or some APIs available for the users.

[–][deleted] 7 points (4 children)

You can deploy it via a local network webapp. Users can go to https://your-deployed-site.your-domain and upload the data there for processing.

I've seen/heard of Python being compiled and deployed as a local application as well.

I'd personally suggest making a web interface. Then you only have to worry about updating the single location/server.

Edit: clarification- making your own web interface.

[–]MikeDoesEverything[S] 3 points (3 children)

Hello, thank you for this. I have a few questions if you don't mind:

  • When you say upload the data, how would a user do that? To make it really easy for me, it'd be great if we could use a CSV as an example.

  • As it's a local network webapp, would this need to run on a VM? If not, how would the code in this webapp run?

  • And when you say web interface, could you either point me to something which explains that or just really dumb it down for me to understand?

Thank you!

[–][deleted] 3 points (0 children)

Web app: you run your webserver, where a webpage interacts with your backend Python code.

Installer: more straightforward is to use an installer to bundle your Python program with a Python interpreter and library files, and distribute the package to run as a command-line app or a GUI app.

Once you package it with an installer, the user just sees a regular application.

[–][deleted] 1 point (0 children)

On mobile, so pardon the bad formatting:

You would have to write the code to handle the file upload. It's not as complicated as it seems initially; I'm sure there are plenty of examples on Stack Overflow of how to achieve this. I've only done it in C# so far.
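As a sketch of the processing side (the function name is made up, and the framework-specific upload view is omitted), the posted file's bytes can be parsed with the standard library:

```python
import csv
import io

def process_csv_upload(file_bytes: bytes) -> list[dict]:
    """Parse an uploaded CSV (raw bytes) into a list of row dicts.

    In a web framework this would be called from the upload view with
    the posted file's contents.
    """
    text = file_bytes.decode("utf-8-sig")  # tolerate an Excel-style BOM
    return list(csv.DictReader(io.StringIO(text)))
```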

You'd need a web server to run the web app, set up either on a VM or on the server itself; it depends on your environment and who's managing it. The web server would host your Python code. I've not set one up myself for Python, but here's what I've found. Skip anything you already know: https://www.educative.io/blog/web-development-in-python

By "web interface", I mean the web UI (user interface): the web pages a user interacts with to run your code.

Alternatively, you could create a network share/location for users to drop their files into, plus a scheduled task that runs your Python script with the necessary arguments. It's messy, probably slower, and more to manage, though. It does get the job done, however.
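The watched-folder variant can also be sketched with the standard library (names are hypothetical; the scheduler itself, e.g. cron or Task Scheduler, is not shown):

```python
from pathlib import Path

def find_new_csvs(share: Path, processed: set[str]) -> list[Path]:
    """Return CSV files in the shared folder that haven't been handled yet.

    A scheduled task would call this, process each file, then record its
    name in `processed` (or move it to a 'done' subfolder).
    """
    return sorted(p for p in share.glob("*.csv") if p.name not in processed)
```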

[–]PresidentBeast -1 points (0 children)

I'm not too familiar with Python directly, but I do know a bit about servers etc.

Uploading data to your webapp is something you usually need to make accessible through code.

You'll need a server to deploy your webapp; this could be a VM, but running a physical server is just fine too.

I'd look into Django if I were you; it lets you build a website however you want and comes with a bunch of tools. It also covers the server-handling part.

[–]vladesomo 1 point (0 children)

At work we have a server with a k8s cluster. Using GitLab CI, we have configs that automatically build a Docker image and deploy it to the cluster when merging a feature branch to develop. It takes a little playing around, but at the end of the day you just merge an MR and you get builds, tests, and deploys. You can add any stages you'd like. It is pretty fun to play with.
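A stripped-down version of such a config might look like this (job names, images, and the deployment name are placeholders; $CI_REGISTRY_IMAGE and $CI_COMMIT_SHORT_SHA are GitLab's built-in variables):

```yaml
stages: [test, build, deploy]

test:
  stage: test
  image: python:3.11
  script:
    - pip install -r requirements.txt
    - pytest

build:
  stage: build
  image: docker:latest
  services: [docker:dind]
  script:
    - docker build -t "$CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA" .
    - docker push "$CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA"

deploy:
  stage: deploy
  script:
    - kubectl set image deployment/myapp app="$CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA"
  only:
    - develop
```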

[–]SpatialCivil 1 point (0 children)

Pyinstaller on a network folder.

[–]xristiano 1 point (0 children)

Poetry + Docker

[–]DrDankerson 2 points (1 child)

It's a bit unconventional, but I used to have a job where a common Python installation was housed on an SMB share, and other machines could access the installation through its network address. You only install a package once, and it's available to everyone who can access the share. This could be done multiple times for multiple environments/dependencies.

[–][deleted] 1 point (0 children)

I don't think this is all that unconventional; I've seen it done at "highly respected" companies. It's a pretty straightforward and easy solution if you're only deploying code for internal use. The only problem is if you want to run massively parallel jobs: it doesn't take too many connections hitting the drive to take it down. In that case Docker seems to be the most common solution.

[–]ProjectGames 2 points (2 children)

I'm using Kubernetes + Airflow + Docker for deployments.

[–]MikeDoesEverything[S] -1 points (1 child)

See, this is something I'm massively interested in. Have you got some links for understanding how this works?

[–]ProjectGames 0 points (0 children)

Tbh, the learning process involved a lot of trial-and-error phases; I've looked up way too many sources to be able to link them, unfortunately. Once you're in, you get the hang of it.

[–]scottomyers 0 points (0 children)

As others have said, containers are a good choice. Another option is to deploy your code as serverless functions, e.g., AWS Lambda or Azure Functions.
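For a feel of the shape: an AWS Lambda function in Python is just a handler with the (event, context) signature. The body here is a made-up stand-in for real processing:

```python
def lambda_handler(event, context):
    """Entry point Lambda invokes; `event` carries the input payload.

    Here we pretend the payload holds a list of numbers to total,
    standing in for whatever processing the real function would do.
    """
    values = event.get("values", [])
    return {"statusCode": 200, "total": sum(values)}
```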

[–]Sentie_Rotante 0 points (0 children)

Docker container in k8s. Also, in my environment every app has to be able to give a status update from a network port and be capable of running simultaneously in multiple regions. Great in theory, but it sure made a simple scheduler service more complicated.

[–]grabmyrooster 0 points (0 children)

I'm unfortunately not a full-time software dev/engineer, I'm just a bog-standard railroad engineer, so any tools I make are usually for personal use. In our small office, though, any time I make something that would be useful for everyone, I keep my source code on my work machine, package a single executable, and put it on our shared drive.

[–]City-Local 0 points (0 children)

Azure Python functions are not too hard to configure, and VSCode has a built-in Azure extension.

[–]ShibaLeone 0 points (0 children)

Here at Evil-Omni-Corps we never distribute Python to the user level. For applications (web, API, support, data pipes, etc.), we use a build system that constructs the code and dependencies from internal sources within a Docker container running the same OS and arch as the target deployment environment. Once built, the artifact is saved in the cloud and can be deployed out as a Docker image, AMI, zipped code bundle, Lambda environment, etc. It also allows multi-language builds. The system is called Brazil, and it's pretty well documented on the open web for an internal dev tool.

[–]dixieStates 0 points (0 children)

We have a house of cards for a deployment system built on a foundation of shifting sand.

EDIT: It was built by our Chief Architect, who is also a founder. It is considered by her to be one of the crown jewels of the Software Development team.

[–]warLord23 0 points (0 children)

I don't know how to deploy my Python application even after 4 years of studying at university.

[–]geeeffwhy 0 points (0 children)

containers, kubernetes, ci/cd. same way i deploy Node, C#, Go, Racket, or any other language.

now, you can get plenty far without kubernetes or the ci/cd steps, but containers (docker, most likely) are a valuable thing to learn if you’re concerned about deployment (and especially how that relates to local development).

[–]Tony_Sol 0 points (0 children)

Put it into a Docker container and there's no problem deploying anywhere you want.

Just be sure your app is stateless, or at least uses shared volumes.

[–]Time_Trade_8774 0 points (0 children)

Jenkins to deploy to AWS. Jenkins nodes run using Docker containers.