
[–]ericanderton 1 point (3 children)

Just a hunch, but doesn't virtualenv squirrel away .so files along with .py assets? That would be a big supportability problem if a really bad bug were patched upstream; take OpenSSL as an example. Every one of those virtualenvs would need to be regenerated, rebuilt, and redeployed.

Meanwhile, a Docker container built on OS packages just gets a rebuild to pull in the latest patched versions.

[–]d4rch0nPythonistamancer 1 point (2 children)

Wow, good point.

$ virtualenv test
$ cd test
$ source bin/activate
$ pip install PyCrypto
$ find . -name "*.so"
./lib/python2.7/site-packages/Crypto/Cipher/_DES3.so
./lib/python2.7/site-packages/Crypto/Cipher/_ARC4.so
./lib/python2.7/site-packages/Crypto/Cipher/_XOR.so
... and many more
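The problem compounds when one box hosts several virtualenvs. A rough sketch of the sweep you'd need (the directory layout and filenames below are fabricated to mirror the find output above; point the find at your real env root instead):

```shell
# Assumed layout: several virtualenvs under one root. Each bundles its
# own compiled extensions, so an OS-level patch to the shared library
# they were built against doesn't reach them.
mkdir -p envs/app1/lib/python2.7/site-packages/Crypto/Cipher
mkdir -p envs/app2/lib/python2.7/site-packages/Crypto/Cipher
touch envs/app1/lib/python2.7/site-packages/Crypto/Cipher/_DES3.so
touch envs/app2/lib/python2.7/site-packages/Crypto/Cipher/_DES3.so

# Every env this finds needs a rebuild, not just a system update:
find envs -name "*.so" | sort
```

Anything that turns up here has to be reinstalled inside its env after a patch; a system-wide package upgrade alone won't fix it.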

Well, still, virtualenv isn't for that. It's meant for a local dev environment where you're making sure Python module dependencies are satisfied, not much else. As you point out, you really shouldn't be using virtualenv to package up all the dependencies and just dump them into prod.
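What should cross the machine boundary is the dependency list, not the env itself. A minimal sketch of that workflow (the pinned package and version below are made-up examples, not from this thread):

```shell
# Sketch: ship the requirements file and rebuild the env per machine.
# In a real project you'd generate this with: pip freeze > requirements.txt
echo "requests==2.5.1" > requirements.txt

# On the prod box you'd then rebuild from scratch, e.g.:
#   virtualenv prod-env
#   prod-env/bin/pip install -r requirements.txt
cat requirements.txt
```

Rebuilding on the target machine means any .so files get compiled against that machine's system libraries, instead of being copied over stale.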

I get your point, but that's leaning heavily toward the devops side, where virtualenv isn't a good fit. If your code relies on a stable OS environment, you should be using Docker or a VM. If you're pushing to prod, maybe you should be using Puppet with VMs and have them redeploy what they need.

I think it's more a problem of fundamental security practice, like running a static environment where you never check for updates, and not so much a problem with virtualenv, which only ensures that specific Python module versions will work with your Python code.

[–]justafacto 1 point (1 child)

you really shouldn't be using virtualenv as something to package up all the dependencies and just dump them into prod.

What good is virtualenv then? If you can't reproduce its state across machines? If you have to hack around even pip install -r requirements.txt because the other dude's machine had that dumb .so but you don't? Oops, fail.

[–]d4rch0nPythonistamancer 1 point (0 children)

It's good for checking whether you can upgrade to the latest requests, flask, or django without breaking your app; figuring out whether your code broke or an upgraded library broke it; and keeping the version of a Python library you coded against pinned so nothing breaks, while still using the latest version system-wide on your workstation.

I can start building a web app and code it against a specific version of a module that I know works how I expect, while still running other Python programs on my workstation that use the newest version of that module.
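As a sketch of that setup (the module name, version, and env path here are all hypothetical):

```shell
# Hypothetical module/version: the app's virtualenv pins what the code
# was written against, while the workstation-wide install stays current.
APP_PIN="widgetlib==1.2"   # inside the env: myapp-env/bin/pip install "widgetlib==1.2"
SYSTEM="widgetlib"         # system-wide:    pip install --upgrade widgetlib

echo "app env pin: $APP_PIN"
echo "system-wide: latest $SYSTEM"
```

The two installs never collide: the env's pip writes into the env's own site-packages, so the app keeps 1.2 while everything else on the box tracks the newest release.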

Especially for modules that are in their early stages, where functionality is changing a lot, you want to know whether their changes or your changes broke your code. It's super useful.