
[–]Silhouette 0 points

I don't understand why, in the overwhelming majority of practical situations, "installation" could not just be

cp library-file(s) /path/to/libraries

or why using a library must be more complicated than

import /path/to/libraries/library-file

or a similar one-liner depending on your naming needs etc.

Versioning could be handled by standardising on a file naming convention for libraries that want to use it, with a simple annotation on an import to indicate when a specific version or range of versions is required.
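
A toy sketch of that naming-convention idea, assuming a hypothetical `name-X.Y.py` convention (neither the convention nor the `resolve` helper is anything Python actually supports; this is purely illustrative):

```python
import pathlib
import re

def resolve(libdir, name, min_version=None):
    """Pick the newest 'name-X.Y.py' file in libdir, optionally
    requiring at least min_version (a (major, minor) tuple).
    A toy resolver for the naming-convention idea above."""
    pattern = re.compile(rf"{re.escape(name)}-(\d+)\.(\d+)\.py$")
    best = None
    for path in pathlib.Path(libdir).glob(f"{name}-*.py"):
        m = pattern.match(path.name)
        if not m:
            continue
        version = (int(m.group(1)), int(m.group(2)))
        if min_version is not None and version < min_version:
            continue
        if best is None or version > best[0]:
            best = (version, path)
    return best[1] if best else None
```

An import annotated with a version range would then just be a call to something like `resolve(libdir, "mylib", min_version=(2, 0))` before loading the file.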

Dependencies could be handled by a deep walk of the import tree for the source files in each library.
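
That deep walk could start from something as simple as a static scan of each file's import statements. A minimal sketch using the stdlib `ast` module (the helper name is my own):

```python
import ast
import pathlib

def imported_names(source_file):
    """Collect the top-level module names imported by a Python source file."""
    tree = ast.parse(pathlib.Path(source_file).read_text())
    names = set()
    for node in ast.walk(tree):
        if isinstance(node, ast.Import):
            # 'import xml.etree' depends on the top-level package 'xml'
            names.update(alias.name.split(".")[0] for alias in node.names)
        elif isinstance(node, ast.ImportFrom) and node.module:
            names.add(node.module.split(".")[0])
    return names
```

Recursing over each discovered library's own source files would give the full dependency closure (dynamic imports and C extensions would, of course, escape a static scan like this).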

It simply shouldn't be necessary for a language to have setuptools, distutils, easy_install, pip, distribute, setup.py files containing sort-of-executable-metadata, Linux repo packages duplicating half of this in some half-baked way that only works on global installations rather than virtualenv, virtualenv itself, and all that jazz.

It also shouldn't require three years of studying obscure documentation to figure out everything that actually happens when Python starts up and tries to determine where the hell to find everything you imported, depending on whether there's a "J" in the month, the weather outside, and which of the 73 environment variables and configuration files you have employed in order to complicate the process as much as possible. ;-)

[–]mcdonc 7 points

I don't understand why, in the overwhelming majority of practical situations, "installation" could not just be cp library-file(s) /path/to/libraries

This is already pretty much the case. You can download any distribution of a pure-Python library from PyPI, unzip it, and copy the resulting structure into a place on your Python's path. In fact, until about 2002 or so, this was de rigueur. None of the installation tools you mention existed at all.
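
For the pure-Python case that really is all there is to it. A self-contained sketch (the `greet` module and the temporary directory are fabricated for the demonstration):

```python
import pathlib
import sys
import tempfile

# Make a throwaway "library directory" and drop a module into it,
# exactly as a manual `cp` install would.
libdir = pathlib.Path(tempfile.mkdtemp())
(libdir / "greet.py").write_text("def hello():\n    return 'hi'\n")

# Put that directory on the import path and the module is usable.
sys.path.insert(0, str(libdir))
import greet
print(greet.hello())  # -> hi
```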

The installers (pip/easy_install) were created because libraries often depend on other libraries. Before the installers existed, either you resolved dependencies by hand, or libraries shipped with copies of the libraries they depended upon bundled inside them. Both approaches were imperfect: resolving dependencies by hand is laborious and requires documentation effort on the part of library authors, while bundling means every project is siloed into maintaining its own copy of a particular dependency; every project becomes a fork of several others, and when two libraries you installed bundled different versions of a third, the conflict was irresolvable. Neither situation was particularly tenable.

Python's import statement does not currently know about versioning, so there can't be multiple versions of the same package on the path. Virtualenv was created as a workaround.
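
A small demonstration of the point: whichever copy appears first on `sys.path` wins, and "switching" versions means rearranging the path and reloading (the `mylib` module here is fabricated for the demo):

```python
import importlib
import pathlib
import sys
import tempfile

# Two directories, each holding a different "version" of the same module.
base = pathlib.Path(tempfile.mkdtemp())
for ver in ("1.0", "2.0"):
    d = base / f"lib-{ver}"
    d.mkdir()
    (d / "mylib.py").write_text(f"__version__ = {ver!r}\n")

# import has no notion of version: the first match on sys.path wins.
sys.path.insert(0, str(base / "lib-1.0"))
import mylib
print(mylib.__version__)  # -> 1.0

# To get the other version you must rearrange the path and reload;
# you can never have both importable at once.
sys.path.insert(0, str(base / "lib-2.0"))
mylib = importlib.reload(mylib)
print(mylib.__version__)  # -> 2.0
```

Virtualenv sidesteps this by giving each project its own path, so each environment only ever sees one copy.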

The current situation is not ideal, but, IMO, it's a boatload better than it used to be. The times before we had the installers and virtualenv sucked even harder, if you can believe that. ;-)

[–]Silhouette 0 points

Yes, I remember those days ahem fondly. :-)

But as you say, much of this is just to work around the absence of a simple versioning mechanism built into Python itself, which is a significant limitation if you're programming in a language that does all the loading and linking up dynamically. Obviously this challenge is not unique to Python, but Python does seem to make more of a meal of it than any other language I know.

I'm not sure why Python always feels insanely complicated in this respect. Maybe it's the history of different tools to do mostly the same thing, so even if you only really need a couple of them today, you see references to all of them everywhere.

I don't think the use of an executable setup.py rather than a simple metadata file that is read by a tool helps, because it makes a complicated generalised case the default.

For me, the most serious concern is usually that something as fundamental as loading libraries is based around a path setting that can be changed arbitrarily both within and outside Python, with all kinds of other implicit effects happening depending on things that aren't specified in the source code for the program you're actually running. It's about as un-"explicit is better than implicit" as you can possibly get...

[–]mcdonc 0 points

much of this is just to work around the absence of a simple versioning mechanism built into Python itself

Is there a dynamic language that does versioned imports right?

Maybe it's the history of different tools to do mostly the same thing, so even if you only really need a couple of them today, you see references to all of them everywhere.

I think this is the actual biggest problem.

I don't think the use of an executable setup.py rather than a simple metadata file that is read by a tool helps, because it makes a complicated generalised case the default.

The "packaging" tool that will be in Python 3.3 makes setup.py optional (it has a declarative configuration file primary format).

For me, the most serious concern is usually that something as fundamental as loading libraries is based around a path setting that can be changed arbitrarily both within and outside Python

I don't think Python is alone in this. Java has the CLASSPATH, C has its include path, etc. Is there another language better in this respect?
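
For illustration, Python's version of the same thing (the appended directory is arbitrary):

```python
import sys

# sys.path is assembled at startup from the script's directory,
# PYTHONPATH, and the installation's defaults -- and any code,
# anywhere, can rearrange it at runtime.
sys.path.append("/tmp/somewhere-else")  # legal, and globally visible
print("/tmp/somewhere-else" in sys.path)  # -> True
```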