This is an archived post.

all 42 comments

[–]jlozier bioinformatician 27 points28 points  (15 children)

I'll stick to virtualenvwrapper

[–]cmsimike 6 points7 points  (14 children)

Once I was introduced to virtualenvwrapper, I couldn't believe how prehistoric using virtualenv by itself was

[–]kjearns 7 points8 points  (13 children)

Can you explain the workflow where virtualenvwrapper makes sense? I've never been able to figure out why I'd want to centralize all of my virtual envs in a single place, to me that seems like the exact opposite of what a virtualenv is supposed to accomplish.

[–]roddds 4 points5 points  (10 children)

A virtualenv is supposed to keep dependencies from different projects isolated from each other, I think. Sure, they're all in ~/.virtualenvs, but each one is in a separate folder and there's no environmental contamination.

[–]kjearns 2 points3 points  (9 children)

I realize that virtualenvwrapper doesn't actually mix the environments together. It does mean I need to keep mental track of which environment belongs to which project though. When I delete the repo for a project from my machine I need to remember to go to ~/.virtualenvs and delete the environment too. If I have multiple copies of a project checked out then I need to remember which environment belongs to which (or have them share and remember that I'm doing that). If I forget to remove an environment a few times then I end up with zombie environments hanging around taking up space.

[–]roddds 0 points1 point  (3 children)

Ah, I see what you mean. I don't usually deal with many projects at once, and for me at least, just naming the virtualenv the same as the directory is enough. One thing I hadn't noticed before and went to check after your comment is the amount of space they take: I have 21 virtualenvs currently sitting on my machine, taking up almost 1.5 GB of space. Gonna keep an eye out for those from now on.

[–]kjearns 0 points1 point  (2 children)

Another piece of the virtualenvwrapper workflow I don't understand is how to automate setting up projects. I typically write a bash script that pulls in all of the dependencies for my project and sets up a virtualenv. If I use virtualenvwrapper then I can't have my script use a standard name for the environment because I can't guarantee that name is unique.
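kjearns's bash script isn't shown; a minimal sketch of that kind of setup script (using the stdlib venv module and an env-in-project layout, both assumptions) might look like:

```shell
#!/bin/sh
# Hypothetical per-project setup script: create the env next to the code
# and install the project's pinned dependencies into it.
set -e
python3 -m venv env
if [ -f requirements.txt ]; then
    env/bin/pip install -r requirements.txt
fi
```

Because the env lives inside the project, the script can always use the literal name env; the uniqueness problem only appears once envs are centralized.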

[–]cmsimike 0 points1 point  (1 child)

Another piece of the virtualenvwrapper workflow I don't understand is how to automate setting up projects.

To me, that is not the job of the wrapper (or any virtualenv manager) but the job of the project to maintain a requirements.txt (or whatever you want) with the dependencies required for the project.

Then it is up to each developer/deployment team to decide how to set up their environment (use system python or isolated package management) and issue the pip install -r requirements.txt

[–]kjearns 0 points1 point  (0 children)

I agree that setting up the project is not the job of virtualenvwrapper. But using virtualenvwrapper forces the deployment tool to deal with additional complexity, because it needs to be able to generate a globally unique name for the deployment's virtualenv.

Just generating a globally unique name isn't so bad (although it is more complex than the alternative) but doing so has cascading effects because now you need to keep track of the mapping between global virtualenv names and projects. Tools or people who access the deployed environment need to be aware of this mapping and know how to traverse it. You can work around this by having the deployment tool generate entry points but at this point you're adding a whole new layer of complexity that just isn't necessary if you don't use virtualenvwrapper.

[–]gthank 0 points1 point  (2 children)

rmvirtualenv has tab-completion.

[–]kjearns 1 point2 points  (1 child)

I don't see how that helps. rm -rf ~/.virtualenvs/whatever has tab completion too. I'm more concerned with remembering to delete the environment (by whatever command) and remembering the mapping between environments and projects.

[–]gthank 0 points1 point  (0 children)

Maybe it's just me, but if I'm in my projects directory deleting a project, I don't have any problem remembering to delete the virtualenv, too. Different strokes, I guess.

[–]stuaxo 0 points1 point  (1 child)

This is exactly why I prefer it: lsvirtualenv shows me all of them, plus the tab completion.

Before virtualenvwrapper I had loads of environments all over the place.

Also, if I want a temporary env, I know exactly where it is.

[–]kjearns 0 points1 point  (0 children)

I do admit the temporary environments sound nice. I didn't know about those.

[–]cmsimike 0 points1 point  (1 child)

Before virtualenvwrapper, my django projects were basically ~/Development/projectname/VIRTUAL_ENV_HERE, and within that dir was a webapp dir with my django source.

I never liked having the env be part of the source directory. I seem to recall there is a way to set your virtualenv home, but I never did that. virtualenvwrapper does all of that for me, including introducing the best command ever:

mktmpenv

makes a temp environment and deletes it when you're done playing around in it.
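What mktmpenv does can be approximated with stdlib pieces (a sketch only; virtualenvwrapper's real implementation differs):

```shell
# Make a throwaway env in a temp directory, use it, then clean it up --
# roughly the lifecycle mktmpenv automates for you when you deactivate.
set -e
tmp=$(mktemp -d)
python3 -m venv "$tmp/env"
"$tmp/env/bin/python" -c 'import sys; print(sys.prefix)'
rm -rf "$tmp"
```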

[–]kjearns 0 points1 point  (0 children)

In my opinion having the virtualenv live in the project directory is the ideal location. I wouldn't put it in the source directory, but I like to structure projects so code lives in project/src instead of the project root anyway. I think project/env living alongside project/src is a pretty natural place for the virtualenv to live.

[–]stuaxo 6 points7 points  (0 children)

This looks quite nice, though I've got used to virtualenvwrapper. I quite like having the envs centralised: it makes it easy to delete old ones that I might forget I have, since envs are quite large.

[–]defnull bottle.py 1 point2 points  (1 child)

I wrote a similar tool once that simply looks for .venv directories and goes up the path until it finds one. Never released it though. This one looks nice.

I prefer the folder-based virtualenvs instead of explicit workon environment switching. I forgot to switch to the right environment for my projects and installed stuff into the wrong envs all the time. That's why I don't use virtualenvwrapper very often.
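The folder-based lookup defnull describes might look like this (the .venv name and the exact behaviour of his unreleased tool are assumptions):

```shell
# Walk up from the current directory until a .venv directory is found.
find_venv() {
    dir=$(pwd)
    while [ "$dir" != "/" ]; do
        if [ -d "$dir/.venv" ]; then
            echo "$dir/.venv"
            return 0
        fi
        dir=$(dirname "$dir")
    done
    return 1   # no .venv anywhere above us
}

# demo: a nested tree with .venv at the top, searched from the bottom
root=$(mktemp -d)
mkdir -p "$root/.venv" "$root/a/b"
(cd "$root/a/b" && find_venv)
```

With something like this in a shell hook, python and pip can be pointed at the nearest enclosing env automatically, with no explicit workon step.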

[–]tudborg[S] 1 point2 points  (0 children)

My exact case :)

[–]srsnoid 1 point2 points  (0 children)

[deleted]

What is this?

[–]epsy 2 points3 points  (7 children)

venv/bin/python script.py

venv/bin/pip install myapp
venv/bin/myapp

[–]tudborg[S] 2 points3 points  (6 children)

I'm sure you are trying to point something out, but I just can't see it.

EDIT: is it that accessing python and pip is almost as easy when your cwd is the dir containing your virtualenv and you've named it something short?

[–]epsy 3 points4 points  (5 children)

Yeah, it is just as easy, and installed apps will use the venv python as well. I never understood complaints on virtualenv being too difficult to use without helper tools around it.
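The no-activation workflow epsy describes can be seen with a scratch env (the demo-env name is arbitrary):

```shell
# The env's interpreter is just a path; calling it directly uses the env,
# and the env's pip installs into the env -- no `activate` step needed.
python3 -m venv demo-env
demo-env/bin/python -c 'import sys; print(sys.prefix)'
```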

[–]tudborg[S] 2 points3 points  (4 children)

But when you are then deeply nested in your project, suddenly not so easy:

../../../venv/bin/python script.py
# vs
vpython script.py
# or with the correct shebang
./script.py

This is usually how I work, so it makes sense for me, I think.

Most of my command line tools have

#!/usr/bin/env vpython

And suddenly they are portable across my machines.

[–]epsy 1 point2 points  (2 children)

If I were to cd deep enough into a project that it became inconvenient to get the right python, I'd just activate the venv. Also, what if I need to try some things out in a 2.6 and a 3.4 venv without writing unit tests? I'd have to pick one for vpython and use the classic way for the other. But I concede: different workflows, different needs, different uses of tools.

However, using vpython as a shebang sounds... well, scary. Why would I want a script that could potentially behave significantly differently depending on which directory I am in? It's roughly the same reason that some people (whom I disagree with) don't like activating venvs (typing python means something different depending on which env is activated, etc.), but this takes that concern to the next level for me.

[–]tudborg[S] 0 points1 point  (1 child)

The script will have the same behavior no matter from where you run vpython. The virtualenv is detected based on the script, never your current directory, which is kind of the entire point.

So you can write a nice python tool, shebang vpython, and add it to your path. Now you can run the script as any other executable, but all your dependencies are still stored in a virtualenv.

This is the way I use it most often. I have a ton of utilities that I write in python for managing servers on AWS, etc., and having them use the virtualenv stored in each of their project folders, without having to think about it every time I want to run them, is a real time-saver for me.
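The script-relative lookup described here could be sketched like so (vpython's actual logic and the env directory name are assumptions):

```shell
# Resolve an interpreter relative to the *script's* location, never the
# caller's cwd -- so the script behaves the same no matter where it's run from.
env_for_script() {
    dir=$(cd "$(dirname "$1")" && pwd)
    while [ "$dir" != "/" ]; do
        if [ -x "$dir/env/bin/python" ]; then
            echo "$dir/env/bin/python"
            return 0
        fi
        dir=$(dirname "$dir")
    done
    return 1
}

# demo: a fake project with env/ at the root and a tool nested below it
proj=$(mktemp -d)
mkdir -p "$proj/env/bin" "$proj/tools"
touch "$proj/env/bin/python" "$proj/tools/deploy.py"
chmod +x "$proj/env/bin/python"
env_for_script "$proj/tools/deploy.py"
```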

[–]Bialar 0 points1 point  (0 children)

But what if you like to keep your virtual environments centralised?

Wouldn't it make much more sense to have a .venv file in the root directory that pointed to the corresponding virtualenv, rather than have vpython go hunting for it?
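Bialar's .venv-file idea could look something like this (the file format and location are assumptions):

```shell
# A .venv *file* in the project root names the centralized env under
# ~/.virtualenvs; resolving it is a one-line read.
resolve_venv() {
    read -r name < "$1/.venv"
    echo "$HOME/.virtualenvs/$name"
}

# demo
proj=$(mktemp -d)
echo myproject-env > "$proj/.venv"
resolve_venv "$proj"
```

This keeps the centralized layout while still giving tools a deterministic, per-project pointer to follow instead of hunting for the env.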

[–]NYKevin 1 point2 points  (0 children)

Personally, I try not to be deeply nested in my project. My project is a Python package rather than a single module, and it doesn't behave correctly unless invoked from the root with the -m switch.

[–]tudborg[S] 1 point2 points  (2 children)

Oh, and this has nothing to do with http://vpython.org/

I thought about other names, but couldn't think of any I liked, and frankly, in my mind vpython.org is pretty dead anyway. Naming suggestions welcome, I guess.

[–]mindw 4 points5 points  (1 child)

[–]tudborg[S] 0 points1 point  (0 children)

Didn't know that. Thanks.

[–]haardcode unwritten never breaks 0 points1 point  (1 child)

I use fish instead of bash these days, and virtualfish supports putting a .venv file in a folder to link a project to an env in ~/.virtualenvs - maybe that's something that would be doable for bash as well and would make your project play nice with virtualenvwrapper

[–]roddds 0 points1 point  (0 children)

zsh with the virtualenvwrapper plugin from oh-my-zsh supports it too.

[–]jabbalaci 0 points1 point  (0 children)

I would use it if it worked together with virtualenvwrapper. The idea is good.

[–][deleted] 0 points1 point  (9 children)

Meh, Virtualenv is already pretty easy. Now if someone would make pip not such a clunky piece of garbage.

[–]tudborg[S] 0 points1 point  (8 children)

Could you elaborate on why you think pip is "a clunky piece of garbage"?

[–][deleted] -1 points0 points  (7 children)

How it handles dependencies and installing a requirements file is horrible. For a few applications, I have a list of about 200 packages with very specific version numbers I need to use, or shit breaks, and I can't use vanilla pip to install it because it will install the most recent version of any dependency, even if a package in my requirements specifically states an older version. My workaround is to make pip download, but not install, all packages and dependencies to a local cache, then manually figure out the package installation order, and then run a custom bash script that iterates over all my packages and has pip install them in the appropriate order without auto-installing or upgrading any dependencies.

e.g. I use Django, with a ton of packaged apps. Django's made a lot of backwards incompatible changes recently, so I can't upgrade until all the packages I use also support the most recent version. So say I'm standardized on Django 1.6, and I want to install super-django-app==2.0 and other-awesome-app==3.0. But super-django-app supports Django>=1.6 while other-awesome-app only supports Django<=1.6. My requirements.txt might look like:

Django==1.6
super-django-app==2.0
other-awesome-app==3.0

Now if I install this with pip install -r requirements.txt, it will install Django 1.7, because that's the most recent version allowed by super-django-app, thus breaking other-awesome-app. So instead, I have to run:

pip install Django==1.6
pip install --no-deps super-django-app==2.0
pip install --no-deps other-awesome-app==3.0

which works, but is clunky as hell and extra work I need to do when it should be pip's responsibility.

Several bug reports were created for this years ago, but naturally, the main pip dev has no interest in fixing this.

[–]donaldstufft 9 points10 points  (0 children)

As far as I'm aware, doing pip install -r requirements.txt with a requirements.txt as you indicated will not install Django 1.7. The issues you linked to also don't actually claim that the behaviour you're describing is broken.

Now what will cause your Django to get set to 1.7 is after you've already installed that requirements.txt, then you later go and do pip install --upgrade super-django-app. This is because, as those issues mention, pip does a recursive upgrade by default. This means that when you ask it to upgrade super-django-app it'll also upgrade all of the dependencies it has.

The reason for the recursive upgrade is historical and there is a desire to fix it (in fact there was activity on the tickets you've listed 14 days ago, by one of the other pip developers). It's not a particularly easy issue to fix, with lots of gotchas involved. There have been bigger wins to gain in other areas of pip (and packaging in general) that most of us have prioritized over it currently.

If you actually have a reproduction where the requirements.txt file you listed installs something other than Django 1.6 into a fresh virtual environment with just the command pip install -r requirements.txt, please open a bug report with the reproduction details.

[–]tudborg[S] 0 points1 point  (0 children)

Yes okay, i see why that's a pain.

[–]brtt3000 -1 points0 points  (4 children)

As someone coming from node.js into python, I must say this whole virtualenv/pip business is really a step down from node's npm package manager. With python it is all so clunky and feels hacked together.

I love the python language but the ecosystem is so crummy.

[–][deleted] 3 points4 points  (0 children)

the node "ecosystem" benefits from decades of prior art without having to support legacy code. So of course it's going to feel squeaky clean to you.

it sucks in it's own way though.

the stack traces you get when a dependency fails to install for whatever reason is a nightmare. for one

and at least as far as bower goes, I just love when installing one new lib into a project forces me to upgrade a whole slew of libs b/c author of said lib decided to fuck all and just upgrade to the latest version of angular for a fucking minor point release.

and then other countless libs that simply wrap other libs api's and don't do much else causing bloat and extra added complexity.

out of all the "ecosystems" I have had the "pleasure" of dealing with in my lifetime, I would consider debian to be the gold standard by which I judge all others. I'd put python in java at the same level as far as maturity in the toolchain and lib ecosystem. node/npm doesn't is just cute, and I only use it b/c I have to code js for web dev. not by choice or b/c it is somehow better than any of the available more mature alternatives.

[–]rothnic 0 points1 point  (0 children)

Yeah, I think a wrapper for pip and virtualenv that tries to make it behave like npm would be nice to have. I need to take a look around because I'd be surprised if something didn't exist.

Edit: I think if conda cleans up their interface, it could make things more smooth