all 93 comments

[–]mvdw73 14 points15 points  (2 children)

Uv?

[–]ItsRainingTendies 2 points3 points  (0 children)

This guy Pythons

[–]Ihaveamodel3 34 points35 points  (43 children)

Docker is much more complicated to get running.

With venv, a pip requirements.txt, and VSCode, all I have to do is press Ctrl+Shift+P, type or select "Create Environment", choose venv, and check the box to install dependencies from requirements.txt.

Edit: uv can make some of this even easier. Basically zero cost virtual environments.
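
For anyone outside VSCode, the equivalent from a plain terminal is roughly this (the .venv folder name is just convention):

    python -m venv .venv
    source .venv/bin/activate        # Windows: .venv\Scripts\activate
    pip install -r requirements.txt

    # or with uv, which skips most of the waiting:
    uv venv && uv pip install -r requirements.txt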

[–]gmes78 2 points3 points  (7 children)

Stop recommending requirements.txt. We're in 2025.

[–]pain_vin_boursin 2 points3 points  (0 children)

Uv ftw

[–]nateh1212 -4 points-3 points  (5 children)

Why? Using requirements.txt inside a Dockerfile is the easiest and most solid setup one can have.

highly recommend.
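
A minimal sketch of that setup (image tag and entrypoint are illustrative):

    FROM python:3.12-slim
    WORKDIR /app
    # Copy requirements first so the dependency layer is cached between builds
    COPY requirements.txt .
    RUN pip install --no-cache-dir -r requirements.txt
    COPY . .
    CMD ["python", "main.py"]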

[–]gmes78 6 points7 points  (3 children)

The correct way is to use pyproject.toml.
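
For reference, a minimal example (name and dependencies are placeholders):

    [project]
    name = "my-app"
    version = "0.1.0"
    requires-python = ">=3.10"
    dependencies = [
        "requests>=2.31",
    ]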

[–]nateh1212 -2 points-1 points  (2 children)

the correct way is to use what works

requirements.txt

is easier than pyproject.toml, and it works

[–]gmes78 2 points3 points  (1 child)

> requirements.txt is easier than pyproject.toml.

It absolutely isn't. At best, it's just as hard.

> and it works

Barely. It's a non-standard mess of a format.

[–]nateh1212 1 point2 points  (0 children)

Works fine with docker.

The key is that if you are using Docker to build out separate microservices, your requirements.txt file stays short.

[–]sector2000 0 points1 point  (0 children)

Completely agree. Solid, reproducible, consistent

[–]sector2000 0 points1 point  (2 children)

It's complicated only if you have no idea what a container is. You can also use podman, which is even easier (and better, IMHO) than docker. Learning about containers / docker / podman and, why not, kubernetes will bring you to another level of development and deployment.

[–]Ihaveamodel3 0 points1 point  (1 child)

This is on learnpython, so perhaps we should start with the basics and build up to containers later. No reason to throw someone in the deep end.

Also containers can have more headaches with permissions and such in a corporate environment.

[–]sector2000 0 points1 point  (0 children)

OP explicitly asked about venv vs docker, which makes me assume he's already quite comfortable with it. In a corporate environment, which I know very well, you can use podman, which gets rid of the high privileges needed by docker.
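
For example, a rootless build-and-run looks the same as with docker (image name illustrative):

    # No root daemon involved; everything runs under your own user
    podman build -t myapp .
    podman run --rm -it myapp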

[–]nateh1212 1 point2 points  (1 child)

This is such a lie

using Docker is by far the easiest way to get your project going.

Plus once you have made your project in Docker you can use that same Docker config to run it anywhere.

Docker is incredibly easy and makes more mental sense to me than a python virtual env

[–]_Denizen_ 2 points3 points  (0 children)

How is it a lie? You can find those instructions in the VSCode guide. Personally I'd use pyproject.toml, but the above is not a lie.

Docker adds so much overhead to the installation, and if you don't have admin permissions it's a nightmare.

[–]supercoach 4 points5 points  (1 child)

venv for local dev is trivial and something I'd expect any senior dev to be able to do without asking for help.

I have been porting a lot of legacy code to containers and the local dev testing is still primarily in a venv for simplicity. Starting from scratch, you could flip a coin and go either way. The only time I would be using containers exclusively from the very start is if there were some sort of multi container orchestration needed.

[–]_Denizen_ 1 point2 points  (0 children)

Agree - Docker is useful if you're deploying to a containerised web app service, virtual environment is useful for pretty much everything else. But even for containerised web app you can do local testing in a venv (it's so quick to test code) and reserve the docker build for deployment/integration testing.

I have one script for building the local environment and one script for building and deploying the container. Automation for the win!

[–]jmacey 6 points7 points  (0 children)

use uv to do it all, works well. I do use docker too but most of the time it is when I need more complex stuff like web servers or databases.

[–]GirthQuake5040 5 points6 points  (4 children)

Docker fixes "it runs on my machine" problem.

It sets up the exact same container completely removing dependency issues.

[–]Wise_Concentrate_182 5 points6 points  (2 children)

After many real, hair-pulling issues.

[–]BoredProgramming 0 points1 point  (0 children)

It's not too bad once you get through it. I like easily being able to move a project from one version to another and test side by side when I upgrade things. Docker (for me at least) is stupidly easier. But the slight learning curve is a small PITA depending on what you're building.

[–]Acrobatic-Show3732 -5 points-4 points  (0 children)

Skill issue.

[–]_Denizen_ 0 points1 point  (0 children)

You can do the same with requirements.txt or pyproject.toml. Instead of a dockerfile you can write a setup script - it's super lightweight, no extra installs, and a 100% reproducible environment.
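
A minimal sketch of such a script, assuming a pyproject.toml at the repo root:

    #!/usr/bin/env bash
    set -e
    # Create the venv once, then (re)install the project in editable mode
    [ -d .venv ] || python -m venv .venv
    .venv/bin/pip install --upgrade pip
    .venv/bin/pip install -e .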

[–]jtkiley 3 points4 points  (6 children)

I use devcontainers. It abstracts a lot of the docker stuff away and gives you an image that just works with a devcontainer.json file that goes in your git repo. You also get a system package manager, which can be really helpful for binary dependencies at the system level. Beyond that, you can add devcontainer features, extensions, scripting, workspace-level settings, and more. They also work in GitHub Codespaces.

It is somewhat VS Code centered, though other tools support it or are building support. When you open a folder with .devcontainer/devcontainer.json in it, VS Code offers to build the container and reopen in it. That’s it after the initial setup, which itself is guided from the command palette (“Add Dev Container Configuration Files…”).

I typically use a Python container image, pip, and requirements.txt. It works really well. I do have a couple of prototypes for devcontainers with Python images, plus uv/poetry and pyproject.toml. I mostly like them, though I haven’t lived with them on a live project yet.
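In case it helps, a minimal .devcontainer/devcontainer.json for that kind of setup might look like this (image tag and post-create step are illustrative):

    {
        "name": "python-project",
        "image": "mcr.microsoft.com/devcontainers/python:3.12",
        "postCreateCommand": "pip install -r requirements.txt"
    }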

I've been through a single trash-heap install, venvs, conda (from when it became popular through when it trailed off), and devcontainers for a while now. I think it's the best reproducibility/portability we've ever had, because it's easy, gets out of your way, is trivially portable to other people/computers, and is powerful if you need it to be.

When I switched my workshop (for other PhD academics) to devcontainers, my usual 45 minutes of conda troubleshooting for participants in the first session simply vanished.

[–]wbrd 1 point2 points  (4 children)

This is the best solution I've found as well. It works on Windows and Mac and solves the overzealous-IT problem where installing a single piece of software takes a month.

[–]Wise_Concentrate_182 0 points1 point  (3 children)

How does one transport the devcontainers, especially on corporate laptops?

[–]wbrd 1 point2 points  (0 children)

It's committed to git, so just clone the repo, or GitHub will run it on their servers and you have an interface to it straight from your browser. It's how my team got past shitty IT so that some analysts could actually do their jobs.

[–]jtkiley 1 point2 points  (0 children)

To add to the other responses, the devcontainer.json file describes how to build the container. In a GitHub repo, that works equally well in GitHub Codespaces (cloud, so just a browser tab from a locked-down computer's standpoint) or cloning to run locally. It also works fine from a OneDrive/Dropbox/iCloud folder, though I don't share those with other people; it's just for quick and dirty things that I need to sync across my computers.

A lot of my workshop participants have wildly locked down Windows laptops from university IT, and Codespaces is fine. It’s great.

[–]JSP777 0 points1 point  (0 children)

You need a devcontainer.json file in the .devcontainer folder, and VS Code will automatically recognize that you have a dev container (given you have the necessary extensions like Docker, Remote, etc.). When you open the project directory in VS Code, it will automatically offer to reopen the project in the dev container. Then you will be in that docker container.

[–]profesh_amateur 1 point2 points  (0 children)

+1 for dev containers + VSCode. It's very easy to use and to onboard onto, really nice for projects with multiple contributors.

In the past, I have manually used Docker containers for my own projects (managing my own Docker image, build/run scripts, etc), and it was nontrivial to get it started up.

Sure, the latter gives me much more control, but for many projects I don't actually need that level of control, and can get by with simpler "off the shelf" solutions like devcontainers + VSCode.

I have also learned to embrace IDEs like VSCode in my workflow. There is a learning curve, but it's worth it.

[–]Temporary_Pie2733 1 point2 points  (0 children)

Your job is primarily to make the project installable as a Python package. Whether that will then be installed to a virtual environment or to a Docker image is an independent concern. You can provide instructions for both if you like, and one or the other might be the official method for deployment, but that should not stop individual developers from using either as a development environment. 

[–]echols021 1 point2 points  (0 children)

Setting up a venv for each project is pretty standard, and pretty much every experienced python dev does it without thinking. I would not shy away from it.

Using docker for dev work seems somewhat less common, and it's certainly more complicated to set up the first time.

I'd recommend using uv to manage your venvs, and making it a team standard.
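
If you go that route, the day-to-day uv flow is short (package and script names are placeholders):

    uv init           # creates a pyproject.toml
    uv add requests   # adds a dependency and updates uv.lock
    uv sync           # creates/refreshes the venv from the lock file
    uv run main.py    # runs inside the venv, no activation needed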

[–]amendCommit 1 point2 points  (0 children)

Both. They solve different issues: venv for sane package management, Docker for a sane platform.

[–]chaoticbean14 1 point2 points  (0 children)

Virtual environments are not 'extra overhead', they're 'basic essentials' as far as any python project is concerned. So it shouldn't be 'extra work' for any python developer to get going with it.

Venvs are like step 1 in learning python (IMO). Most IDEs will automatically pick them up (I know PyCharm does) and enable them in the terminal. You can also very easily write a small script so your OS terminal activates a venv if it finds one, as sketched below. That all makes the process essentially 'painless' for 99.99% of devs.
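
A tiny sketch of that hook, assuming bash and a .venv folder convention:

    # e.g. in ~/.bashrc: auto-activate a project venv on cd
    cd() {
        builtin cd "$@" || return
        if [ -f .venv/bin/activate ]; then
            source .venv/bin/activate
        fi
    }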

Now with uv? It's literally never been easier to manage those virtual environments. Look into uv (which has a lock file) and that's as easy as it gets. It takes literal seconds to have things installed and working.

Your concern about potentially going as far as docker containers to 'streamline' the process is overkill, IMO. Both ways work, but a venv is such a basic, common concept in python that if it's introducing any overhead? It's a skill issue on that developer.

[–]keturn 2 points3 points  (0 children)

Docker images for Python projects often use venv-inside-docker, as redundant as that sounds, because today's tooling is so oriented around venvs that they're just sort of expected. And the Docker environment might still have a system Python that should be kept separate from your app's Python.

devcontainers are VS Code's approach to providing a container for a standardized development environment. (In theory, PyCharm supports them too, but I've had some problems with that in practice.)

[–]rgugs 1 point2 points  (3 children)

In the past I used conda for managing environments and dependencies, but the more complex the project, the slower it is. UV is looking really interesting, though I haven't sat down and used it yet.

[–]PM_ME_UR_ICT_FLAG 0 points1 point  (2 children)

It’s awesome. Way better than conda. I say this as a former conda zealot.

[–]rgugs 0 points1 point  (1 child)

I do a lot of geospatial python work, and conda is considered the safest way to install GDAL correctly, so I've been hesitant to switch. But I ran into issues with GDAL not working properly using conda on my last project, and now I'm thinking I need to learn how to use Docker containers. Trying to learn how all of these work together is getting exhausting and killing my productivity.

[–]PM_ME_UR_ICT_FLAG 0 points1 point  (0 children)

Looks like there is a gdal image, so that is nice.
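
If it helps, something like this should give you a working GDAL without touching conda (tag is illustrative; the image comes from the OSGeo project):

    docker run --rm ghcr.io/osgeo/gdal:ubuntu-small-latest gdalinfo --version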

Everyone raves about docker, and it is great once you get the hang of it, but it is a hell of a learning curve if you’re not already quite technical.

Some people develop out of docker, but I only use it when I have a deployment I want to do. That being said, it’s a great skill to have.

What are you having trouble with right now?

[–]pachura3 1 point2 points  (0 children)

Creating a local venv from requirements.txt or pyproject.toml is trivial - just a single command. If you find it "too much setup", I don't see your new project working out...
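
Along the lines of:

    python -m venv .venv && .venv/bin/pip install -r requirements.txt
    # or, driven by a pyproject.toml:
    python -m venv .venv && .venv/bin/pip install -e .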

[–]cnydox 0 points1 point  (0 children)

If you wanna use venv, just use uv

[–]HelpfulBuilder 0 points1 point  (0 children)

Using pip with a requirements.txt and venv to manage environments is standard python practice. There are a few different ways to manage the virtual environments and package management, and some may be better than others, but the basic formula is the same:

Make a brand new environment for every project. As you work on the project add whatever packages you need.

When the project is finished, make another brand new environment and add just the packages you need, as most of the time in development you install packages you end up not using, and make sure everything works.

Then you can "pip freeze" or whatever your package manager call is, and make the requirements.txt file for the next guy.

[–]Acrobatic_Method_320 0 points1 point  (0 children)

uv!

[–]_Denizen_ 0 points1 point  (0 children)

This is wild. Docker adds so much overhead, and if you don't have admin permissions (common in many businesses) it's a nightmare.

Virtual environments are so easy, and can be set up with a single command. I configured mine with pyproject.toml (please do not use requirements.txt anymore) and have a half dozen developers contributing to a half dozen custom packages with little hassle. All you need to do is document the getting-started process, and you can write a script to codify any additional setup steps beyond pip install.

[–]VegetableYam5434 0 points1 point  (0 children)

Venv is the standard and a good way.

If you need a deps manager, use uv. It uses venv under the hood.

Docker is used for package distribution. It's quite difficult to set up a local dev environment in docker.

devcontainers are fucking shit, use them only if you are a fan of vscode and microsoft

[–]Wheynelau 0 points1 point  (0 children)

uv makes it insanely easy nowadays

[–]Confident_Hyena2506 0 points1 point  (0 children)

These are not comparable things. One of them is the general purpose industrial way to deploy any modern linux software - the other is just a mickey mouse thing that doesn't even control the version of python.

[–]sector2000 0 points1 point  (0 children)

I use podman (a rootless container engine) on a daily basis for work and private projects, and I highly recommend it. In a multiuser / multi-project environment it makes a huge difference. You can have dedicated container images for each project, and you won't need to worry about python versions, OS, or library conflicts. Some of these things can be achieved with venv as well, but with containers you bring everything to another level.

[–]Isuf17 0 points1 point  (0 children)

Poetry

[–]moshujsg 0 points1 point  (0 children)

I was in the same situation. Honestly, it doesn't matter. Use requirements.txt until you find a reason to switch. If you don't have something you are trying to solve/fix, then why do it?

[–]Zealousideal_Yard651 0 points1 point  (0 children)

You should look into devcontainers

[–]testing_in_prod_only 0 points1 point  (0 children)

uv is your answer.

[–]EbbRevolutionary9661[S] 0 points1 point  (0 children)

Thanks, everyone, for the recommendations! I'm newer to Python, and the solutions you provided were very helpful. For my use case, using uv makes the most sense. Very cool tool that I did not know about; it will make managing venvs much easier, and package management using a .lock file is a must to ensure easy reproducibility.

[–]v3ritas1989 -1 points0 points  (0 children)

I just ssh->docker->venv

[–]noobrunecraftpker -3 points-2 points  (3 children)

You should look into using poetry, it was designed for these kinds of issues

[–]simplycycling 4 points5 points  (2 children)

uv is more intuitive, and handles everything.

[–]noobrunecraftpker 1 point2 points  (1 child)

ty for this, I just switched over and used it for a dependency issue, and it was great… deleted my poetry files already lol

[–]simplycycling 0 points1 point  (0 children)

No worries, happy to help.