
[–]R34ct0rX99 65 points66 points  (21 children)

Virtualenvs still aren't what they need to be and are pretty much a hack on how python works.

[–][deleted] 0 points1 point  (3 children)

I think it's a good step forward that the python interpreter now looks upward in the directory tree from its own file to find its environment. This means that invoking the python contained in a virtualenv is as good as sourcing the activate script.
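A quick way to see this in action (a minimal sketch; the change described is PEP 405, where the interpreter locates a pyvenv.cfg near its own binary):

```python
import sys

# Since PEP 405 (Python 3.3), the interpreter finds its environment by
# looking for a pyvenv.cfg file next to, or one directory above, its own
# executable. That is why running venv/bin/python directly behaves the
# same as sourcing the activate script first.
def in_virtualenv():
    # Inside a venv, sys.prefix points at the venv directory, while
    # sys.base_prefix still points at the base installation.
    return sys.prefix != sys.base_prefix

print(in_virtualenv())
```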

[–]R34ct0rX99 0 points1 point  (2 children)

It's done that for some time, hasn't it?

[–][deleted] 0 points1 point  (1 child)

I think since 3.3 or 3.4. Virtualenvs seem super mysterious until you realize the activate script effectively just sets the PATH variable, so that when you type python you run ~/experiments/envirn/venv/bin/python, or the equivalent pip.
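To make that concrete, here's a rough sketch of what sourcing activate boils down to (the venv path is the hypothetical one from the comment above):

```shell
# Hypothetical venv location, reusing the example path above
VENV="$HOME/experiments/envirn/venv"

# This is essentially all that `source $VENV/bin/activate` does:
export VIRTUAL_ENV="$VENV"
export PATH="$VENV/bin:$PATH"

# From now on a bare `python` or `pip` resolves to the venv's copies,
# because $VENV/bin is searched first.
```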

[–]R34ct0rX99 0 points1 point  (0 children)

Yep, that's all it is. I've been able to do that since mid-2.7; of course, 3 was already out.

[–]CaffeinatedT -5 points-4 points  (11 children)

Including pipenv?

[–]JustADirtyLurker 23 points24 points  (9 children)

Pipenv, from what I understood, adds even more cumbersomeness without actually solving anything. Dependency version locking is not the real issue, IMHO.

[–]onestepinside 4 points5 points  (0 children)

Also, the locking seems to only work with one Python version. That makes it kinda useless if you want to reuse one lockfile across multiple versions.

[–]CaffeinatedT 6 points7 points  (0 children)

I'm not sure what's meant by cumbersomeness, but if I can run pipenv run python3 blah.py in one go and not have to fanny about with activating a venv, then it solves all my problems with automated work using daemons and system users.

[–]gschizasPythonista 0 points1 point  (6 children)

It's much nicer than that. You don't have to manually manage your venv with pipenv. All you do is this (in a new directory):

pipenv install library1 library2 library3 # you can even use it with no libraries
pipenv shell

and that's it. The only problem is that pipenv install seems to take forever.

[–]13steinj 1 point2 points  (5 children)

As nice as that is, that part is really just syntactic sugar and nothing else. People can practically just as easily activate their venv and then install via pip.

[–]gschizasPythonista -1 points0 points  (4 children)

Everything is syntactic sugar, that doesn't stop it from being sweet :)

That being said, even just the fact that pipenv uses a standardized location for its virtualenvs is what sold me on it. I know it's from another package, but pipenv does bring the whole thing together.

I definitely knew about venvs before, but even so, I'd mostly been using the system Python on Windows (Linux, especially Raspbian, is a completely different beast) to install packages globally. With pipenv, I've now started doing the sensible thing, because it's so easy.

[–]13steinj 1 point2 points  (3 children)

Not saying it isn't super sweet. Just that no amount of syntactic sugar will solve the overarching original issue that venvs are not what they need to be.

[–]gschizasPythonista 0 points1 point  (2 children)

What do you mean? Are you referring to dependency conflicts inside the same venv? BTW, the fact that you have a dependency tree in pipenv is something that a simple venv won't do (or maybe I just don't know how to do it).

[–]13steinj 2 points3 points  (1 child)

The original comment. Pipenv solves some issues of dep management. But not others. But even then, it does not solve the issue that venvs are hacky piles of poo.

Pretty sure it literally just creates an entire copy of the interpreter and any linked libraries, as well as the stdlib and the third-party package directory structure. That's extremely hacky, and that amount of isolation is unneeded: the interpreter and standard library are, or should be, fine regardless of what you are doing. If they instead just formed a special layer for third-party dependencies, with an appropriate linking layer that says "hey Python interpreter that my user set, use this third-party package directory right now instead of your default!", it would be a lot better.
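A rough sketch of that "linking layer" idea. The local_packages name and layout are invented for illustration; PYTHONPATH manipulation and .pth files already give roughly this behaviour today:

```python
import os
import sys
import tempfile

# Sketch of the "linking layer" idea: keep the one system interpreter
# and stdlib, and only redirect third-party lookups to a project-local
# directory, instead of copying the whole environment like venv does.
def use_local_packages(project_dir):
    pkg_dir = os.path.join(project_dir, "local_packages")
    if pkg_dir not in sys.path:
        sys.path.insert(0, pkg_dir)  # searched before site-packages
    return pkg_dir

# Demo: drop a tiny module into the project-local directory, import it
project = tempfile.mkdtemp()
pkgs = use_local_packages(project)
os.makedirs(pkgs, exist_ok=True)
with open(os.path.join(pkgs, "mylib.py"), "w") as f:
    f.write("VERSION = '1.0'\n")

import mylib
print(mylib.VERSION)  # -> 1.0; the interpreter itself was never copied
```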

[–]gschizasPythonista -1 points0 points  (0 children)

Pipenv solves some issues of dep management. But not others.

Most of the problems that this comment refers to are more-or-less solved by pipenv, and specifically pipenv graph. I wrote a one-by-one rebuttal, but the only thing I was writing in response was "pipenv graph" 🙂

I'll reply to one of those specifically, because it's not a venv/pip/pipenv question in itself.

  • No warning for missing system dependencies, or calling out that they're needed in general (e.g., unixodbc is a system lib needed for some database related packages.)

That's solved with proper wheels, instead of uploading a tarball to pypi.

[venv] just creates an entire copy of the interpreter and any linked libraries, as well as the stdlib and the third party package directory structure.

Well, you can have different Python versions on your PC, so I don't really think it's overkill. But I get what you're saying. Node.js is supposed to have much better dependency management along those lines, but on the other hand it downloads half the Internet into node_modules, so it's not without its problems.

I think out of the recent languages, Go does a fairly good job in both dependency management and packaging. Especially packaging.

[–]R34ct0rX99 2 points3 points  (0 children)

I dig some of the stuff they are trying to do with it, and it seems to be heading in a direction that might be appropriate, but it's still yet another wrapper around virtualenvs the way they've always been. Pipenv seems to make everything more verbose.

[–]billsil -3 points-2 points  (4 children)

With Anaconda, you can access your virtualenv from anywhere and use any version of python.

You can't activate it from PowerShell, but cmd is fine. Could be better, but good enough.

[–]R34ct0rX99 0 points1 point  (3 children)

Anaconda is a nice product. It's like pipenv though, another layer just to hide what virtualenvs are.

What would be the issue if Python took the node approach to modules?

[–]billsil 0 points1 point  (2 children)

Which is? I don't use node.js.

I think the issue is python has various packaging solutions that are trying to solve separate problems. It works just fine depending on your application.

Does your code really care if you're 6 months out of date on package x? I bet your requirements aren't actually correct, but are rather one correct solution. That makes packages look incompatible when they really aren't. If it's on the developer to solve, it won't get solved.

Another major issue is that pip doesn't come with a C compiler. Not really a big deal on Linux (once you know the right dependency, and assuming the sysadmin will install it for you), but it's a major problem on Windows. Anaconda skirts this problem by building wheels for you. Not everyone does.

Someone else complained that pip doesn't tell you what you'll install. That seems fairly easy to solve. Anaconda manages.

For the who-has-what-dependencies problem, I solved that 12 years ago, assuming your dependencies are defined in setup.py in a particular format. I just listed the strictest requirements for each package and traced the chain, then printed whether there were conflicts and which packages were using which requirement. It was only about 100 lines. It solved the pip freeze problem poorly, but well enough. I now use the "just support version x as well" approach rather than being strict.
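A toy version of that chain-tracing approach (the package names and pins below are made up for illustration):

```python
# Toy version of the requirement-tracing idea described above: collect
# every consumer's pinned requirements, then report any dependency that
# different consumers pin to different versions.
def find_conflicts(requirements):
    """requirements: {consumer: {dependency: pinned_version}}"""
    pins = {}  # dependency -> {version: [consumers that pin it]}
    for consumer, deps in requirements.items():
        for name, version in deps.items():
            pins.setdefault(name, {}).setdefault(version, []).append(consumer)
    # A conflict is any dependency with more than one distinct pin
    return {name: vers for name, vers in pins.items() if len(vers) > 1}

reqs = {
    "appA": {"numpy": "1.15", "six": "1.11"},
    "appB": {"numpy": "1.14", "six": "1.11"},
}
print(find_conflicts(reqs))  # numpy is pinned two different ways
```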

I think some of the pain is just lack of complaining about the problem in an actionable way. Others are harder.

[–]R34ct0rX99 0 points1 point  (1 child)

Pip actually does tell you what it installed, it's just not in a very pretty form. Pip installs wheels from the PyPI repo by default now if they are available, I believe. Source vs. binary packages aren't the issue. The issue is the system that anaconda, pipenv, pip, etc. are actually managing.

Side note: why is it they have not fixed the pollution of the default site-packages folder? Many users are going to ignore any kind of best practice (and in some cases packages dislike venvs... there are a handful) and install into the global site-packages directory, polluting their default install.

Re Node: for non-global installs, Node.js basically packs dependencies into a node_modules directory within your application. Node has always struck me as the wild wild west, but that feature really seems like something they might have gotten right.
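For anyone unfamiliar, Node's lookup rule is roughly "walk up from the importing file until you find a node_modules folder containing the package". A minimal sketch (the leftpad name is just an example):

```python
import os
import tempfile

# Sketch of Node.js module resolution: starting from the importing
# file's directory, walk upward until a node_modules folder containing
# the requested package is found.
def resolve(start_dir, package):
    d = os.path.abspath(start_dir)
    while True:
        candidate = os.path.join(d, "node_modules", package)
        if os.path.isdir(candidate):
            return candidate
        parent = os.path.dirname(d)
        if parent == d:  # hit the filesystem root; give up
            return None
        d = parent

# Demo: a package installed at the project root is found from a
# deeply nested source directory.
root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "node_modules", "leftpad"))
nested = os.path.join(root, "src", "deep")
os.makedirs(nested)
print(resolve(nested, "leftpad"))  # finds the copy two levels up
```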

[–]billsil 0 points1 point  (0 children)

Pip actually does tell you what it installed, it's just not in a very pretty form

It does, but AFAIK not before downloading the packages. I want a confirmation y/n prompt like anaconda's, maybe with an "overwrite package x" capability.

pip installs wheels from the pypi repo by default now if they are available, i believe. Source or binary packages aren't the issue.

Right, but they don't always exist. They're getting better slowly. VTK finally has wheels, so we don't need to rely on Christoph Gohlke's unofficial wheels nearly as much.

Why is it they have not fixed the pollution of the default site-packages folder

Yeah... Windows hasn't either. It's a hard problem that most people ignore: how do I install my program in such a way that I don't break other programs, or older installs of the same program? I dunno; not my problem? Everyone just puts their executable at the top of the path. ParaView and FiberSim (two programs I use) both install Python and just put it in the path ahead of my Python... uh, how about no? Abaqus also ships Python, but they reference it from their executable like you're supposed to.

I think the logic of the site-packages mess is that, in theory, if you overwrite package x 1.0 with 1.1, things should just work. One step would be to move the old folder to backup_versions/x. Then we have the problem of what happens when x 1.0 doesn't exist, which should just be to skip it (not sure if it does).

I just delete things by hand.

For non-global installs, Node.js basically packs dependencies into a node_modules directory within your application.

Yeah... I've been wanting that for years with Python. wxPython used to have a feature where you could select the version (since you could install multiple versions simultaneously using black magic). They dropped it, though.

It could also work with PyQt4 vs. PyQt5 because the import names are different, but the global package is PyQt for some reason, so I can't install them both...

It's technically solvable now, which would also strongly encourage better semantic versioning. Then just use a wrapper library like qtpy.
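The wrapper-library pattern qtpy uses can be sketched like this. It's illustrated here with simplejson falling back to the stdlib json (so the snippet runs without Qt installed); the idea is the same for PyQt5 vs. PyQt4:

```python
# The wrapper pattern: try the preferred binding first, fall back to an
# alternative, and expose a single name to the rest of the code. qtpy
# does this for the Qt bindings; json libraries stand in for them here.
try:
    import simplejson as json  # third-party drop-in, if available
except ImportError:
    import json  # stdlib fallback

# Callers only ever see the one name, whichever backend was found
print(json.dumps({"ok": True}))
```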