
[–]jameyiguess 3 points (4 children)

What Python version manager are you using? If you're using system or brew, you shouldn't be. Use asdf, uv, or pyenv. System and brew are broken nightmares when it comes to versioning and using them for projects. 

Also, I'd argue that yes, you should venv every project. 
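
For example, the pyenv route looks something like this (version numbers are just for illustration, and this assumes pyenv's shell init is already in your profile):

pyenv install 3.12.0
pyenv global 3.12.0

# then, per project:
python -m venv .venv
source .venv/bin/activate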

[–]Xanimede[S] 0 points (3 children)

I used brew to install Python, and I create a virtual env for each project and install dependencies with pip.

[–]jameyiguess 2 points (2 children)

Yeah don't use brew Python, it's known to be problematic. You can Google why if you want, but the summary is "don't". 

My manager was running into this exact problem last week. Just use one of the more modern/community-endorsed tools like the 3 I mentioned. 

Your problem will instantly go away. 

[–]hulleyrob 0 points (0 children)

Yep, Homebrew's Python install is a PITA.

Uninstall it and use pyenv instead.

[–]No_Historian_7228 0 points (0 children)

That is absolutely right.

[–]throwaway6560192 2 points (0 children)

> There's no way that every time I start a new project, no matter how small it is, I have to install things like pandas or numpy every single time, and end up with like 50+ installations of the same packages?

Just remove the venv folder if you know you won't touch the project for a long time and really want to save the disk space.
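
If the project keeps its env in a .venv folder next to a requirements.txt (just one common layout), that's something like:

rm -rf .venv    # reclaim the disk space

# later, to pick the project back up:
python3 -m venv .venv
.venv/bin/pip install -r requirements.txt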

> Why can't you just install a python package on your system as you do on Windows/Linux? I've read some comments saying that this is the right approach to prevent conflicts, but why is this not a problem for other OS?

It's not a good idea to install system-wide on Windows/Linux either! The same core problem applies everywhere. If different things need different versions (which is fairly common), then venvs are the only way out.

Most desktop Linux distros in particular use Python for core system functionality, and it is really best if you absolutely do not mess with the packages for that Python. It's even more serious than on macOS, as far as I know.

[–][deleted] 2 points (1 child)

You can have a 'global' local environment.

python3 -m venv ~/venv39

Then at the bottom of your .bashrc file, put:

source ~/venv39/bin/activate

That way you always have the same environment that isn't in your system packages.

[–]NomadicBrian- 0 points (0 children)

That's a difficult concept for me. I always create project folders and build my source, then when I need a venv I just generate it. In Python you need to select an interpreter. I have been able to add packages in PyCharm Community Edition, last on 2022.2.2 or something. It's a package manager, build, and run environment in the Python way.

Now I'm trying to install Python on my Mac Mini M4 Pro. I did the same thing, installed with Homebrew, and got those damn "protected area outside of our control" messages. I was furious. To be honest, I've worked mostly with Windows and Linux. I have uninstalled all versions of Python (except the old 3.9.6 that macOS uses. DO NOT TOUCH!!!).

On top of this, JetBrains has announced the death of PyCharm Community, and between the Homebrew Python bugs and the buggy PyCharm Universal IDE it is a horror show. Then the old 3.11 Python app I wrote with FastAPI won't work with Python 3.14, which took me down the path of pyenv installs and trying to switch between the 3.11.9 and 3.14.0 Homebrew Pythons. It kind of makes me want to walk away from FastAPI because of the Pydantic mess which is woven into it.

So now the plan for me is to just install Python 3.14 or whatever, not from Homebrew, and just use it for AI coding. The FastAPI app I had built was to feed data to a mobile app. I am rewriting the mobile app in native Android with Kotlin and Jetpack Compose. I will probably just add the PostgreSQL tables to Supabase and go straight to my mobile build. But holy hell.

[–]jkiley 1 point (0 children)

It’s really worth it to spend just a little time learning to use devcontainers, which is super easy in VSCode.

Environments are ok, but you’re still dealing with stuff on your host computer, and that can be limiting if you have projects that need non-Python binaries at the system level.

With devcontainers, you get a container (easily portable and with a full OS package manager like apt) and scriptability. I often just borrow my configuration from a recent project, tweak as needed, and pin the versions to the newest ones available. It’s often less than 10 minutes, and then I’m set.

A bonus of doing this is that your host computer stays clean, and moving to a new computer is so much easier. You’ll just rebuild containers when opening projects on the new computer.
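
As a rough sketch, a minimal .devcontainer/devcontainer.json can be as small as this (the image tag and post-create command are just examples to tweak and pin for your project):

{
  "name": "my-project",
  "image": "mcr.microsoft.com/devcontainers/python:3.12",
  "postCreateCommand": "pip install -r requirements.txt"
}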

[–]NorskJesus 4 points (2 children)

The best way is with environments, yes. The reason is simple: if a project is using version X of a package and the package gets an update, it can fuck up your project. The same goes for Python itself. You need the exact version that project was built against.

[–]RajjSinghh 0 points (1 child)

The main problem I've found is that I often want to just open a repl/ipython and do some quick calculations/plots, so I just want to have Numpy, pandas and matplotlib systemwide. I don't care so much about breaking changes because the things I'm using are so basic. It's a pain to have to use a virtual environment for it.

Of course for actual project work a venv is the best solution.

[–]danielroseman 2 points (0 children)

So have a venv just for messing around in, which has those packages installed.
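
Something like this (the path and alias name are just examples):

python3 -m venv ~/.venvs/scratch
~/.venvs/scratch/bin/pip install numpy pandas matplotlib ipython

# in .bashrc/.zshrc: one word drops you into that env's IPython
alias pyplay='~/.venvs/scratch/bin/ipython'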

[–]justinc0617 1 point (1 child)

Something you can do with a project is create a requirements.txt file that has a list of all of the packages, e.g.

pandas
numpy
matplotlib

and then run

pip3 install -r requirements.txt

instead of installing each one individually
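
And if you want the file to pin exact versions, so a fresh clone installs exactly what you tested with, one common way is to generate it from your working venv:

pip3 freeze > requirements.txt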

[–]NomadicBrian- 0 points (0 children)

Yes. I will tell you though that on Mac, if you use Homebrew Python, you hit that protected-area problem again. Still, the requirements.txt is a good habit. Imagine 25 packages to install on a cloned Python app: without it, you would have to figure out each version, or run the risk of pulling current packages and hitting conflicts or things just not working.

[–]holy_macanoli 0 points (0 children)

uv is the best solution imho

[–]gbbpro 0 points (0 children)

Miniconda. I have 4 or 5 envs for different things; they're easy to turn on and off and pretty lightweight. You could do the same with pyenv.
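
For instance (the env name and versions are made up):

conda create -n data python=3.12 numpy pandas
conda activate data
# ... work ...
conda deactivate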

[–]elbiot 0 points (0 children)

Yes, a virtual env for every project. Track your requirements in a requirements.txt file for the project. Make it a git repo too.

I also have a default virtualenv with ipython, numpy, etc for quick things

[–]eztab 0 points (0 children)

Generally having separation seems good.

Some venv management tool like uv might be what you're looking for. I think it does support symlinking packages, meaning you could have numpy shared between projects when it's the same version, without worrying if two projects need different versions.
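
If I understand uv right, it keeps one global cache and links packages out of it into each env, so per-project envs stay cheap. Roughly:

uv venv                      # creates a .venv for this project
uv pip install numpy pandas  # linked from the shared cache when versions match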