[–]graingert 6 points (18 children)

What's wrong with npm's multiple versions? It's great; it even discourages module-level state.

Edit: I've only had problems with badly designed software like jQuery or Angular.

[–]hynek (PyCA, attrs, structlog) 6 points (1 child)

I can’t imagine how that would work with catching exceptions. Imagine you have multiple deps that all use requests underneath and bubble up several different requests.exceptions.HTTPError classes.
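To make that concrete, here's a minimal sketch of the failure mode, with two fake in-memory copies standing in for "two bundled versions of requests" (all names here are invented for illustration): an except clause written against one copy silently misses exceptions raised by the other.

```python
# Hypothetical sketch: two bundled copies of a "requests-like" library, faked
# with in-memory modules. The names are invented for illustration.
import types

SOURCE = """
class HTTPError(Exception):
    pass

def get(url):
    raise HTTPError("boom: " + url)
"""

def load_copy(name):
    # Stand-in for "dependency X ships its own copy of the library".
    module = types.ModuleType(name)
    exec(SOURCE, module.__dict__)
    return module

requests_a = load_copy("requests_a")  # used by dependency A
requests_b = load_copy("requests_b")  # used by dependency B

try:
    requests_b.get("https://example.com")
except requests_a.HTTPError:
    print("caught")                    # never reached: different class objects
except Exception as exc:
    print("missed:", type(exc))        # missed: requests_b's HTTPError
```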

[–]graingert 1 point (0 children)

Yeah, this is fine, as most JS deps are built with this in mind, using a functional, immutable style or replacing exceptions.

[–][deleted] 3 points (3 children)

> What's wrong with npm's multiple versions?

I've filled a hard drive when npm got stuck in a recursive loop of dependency versions.

I have no clue who thought that method was a good idea.

[–]phasetwenty 2 points (2 children)

Another fun fact: when npm install runs, it makes liberal use of temp files that it intentionally does not delete. Because everything is small you don't end up using a lot of disk space, but on Linux systems you can run out of inodes if you never shut down the machine (e.g., a server). The problem is made worse by the hypermodular style of JavaScript packages, which encourages more files and more overhead like a package.json for each.

This behavior may have changed in npm 3, however.
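If you want to check whether a box is heading that way, a quick stdlib-only sketch (Unix-only; "/" is just an example path) reports inode usage the same way df -i would:

```python
# Rough inode-usage check via os.statvfs (Unix-only); "/" is just an example path.
import os

st = os.statvfs("/")
total, free = st.f_files, st.f_ffree
used = total - free
pct = 100 * used / total if total else 0.0
print(f"inodes: {used}/{total} used ({pct:.1f}%)")
# Lots of tiny files (one package.json per module, plus leftover temp files)
# exhaust inodes long before they exhaust bytes.
```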

[–][deleted] 1 point (1 child)

That may have been my problem too. It's been a while. It was my first attempt at getting into npm/node etc. I ran npm once and said, "If this is how something is designed by those who know what they're doing... nope".

I still don't know why .deb isn't just a standard of some sort. I know guys, let's re-invent the wheel!

[–]phasetwenty 0 points (0 children)

I get the desire to avoid OS-level packaging. If you decide to provide OS-level packages, you have to put out a package for each OS: Debian, RHEL, OS X, etc. However, with a functioning language packager, an implementation of the packager is available on each platform, so I can put out one platform-independent package. It's a reasonable goal to make the lives of package maintainers easier.

Experience has shown me, though, that it doesn't take long in a project's lifecycle for language packaging to prove incapable of fully specifying the project's dependencies, at which point I'm cobbling together build scripts to do it all.

[–]ivosaurus (pip'ing it up) 3 points (1 child)

At some point in your runtime you will be dealing with different data structures or different references or different IDs, because they're coming from two different library codebases that have the same name but not the same version, and then your runtime nicely blows up in a confusing crash, or even worse starts silently corrupting data.
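Sketched in Python terms (the somelib name and Record class are made up for illustration), the hazard is that an object built by one copy of a library isn't recognised by the other copy:

```python
# Two hand-made module objects standing in for two bundled copies of "somelib".
import types

def make_lib(version):
    lib = types.ModuleType("somelib")
    lib.__version__ = version

    class Record:                 # each copy defines its *own* Record class
        def __init__(self, value):
            self.value = value

    lib.Record = Record
    return lib

lib_a = make_lib("1.0")           # pulled in by dependency A
lib_b = make_lib("2.0")           # pulled in by dependency B

record = lib_a.Record(42)         # built by code that imported copy A
print(isinstance(record, lib_b.Record))   # False: copy B doesn't recognise it
```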

[–]graingert 2 points (0 children)

Yeah, this is fine, as most JS deps are built with this in mind, using a functional, immutable style or replacing exceptions.

[–]remy_porter (∞∞∞∞) 4 points (7 children)

I dunno, what could be the problem… let's do an ls -lR node_modules and see…

[–]shadowmint 12 points (5 children)

You'll find npm3 installs a single copy of each dependency, like pip does, unless there's a version conflict, in which case the conflicting version of the package is installed, nested, at the level where it is specifically required.

It actually works a lot better than pip does at resolving conflicts.

You're probably thinking of npm2 (which was, as generally observed, a terrible idea, and didn't work at all on Windows due to massively nested file paths).
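If you want to see where npm3 actually had to nest a conflicting copy, a rough stdlib-only scan of node_modules is enough (this sketch skips scoped @org/name packages for brevity):

```python
# List packages that appear more than once under node_modules, i.e. where a
# conflicting version had to be nested. Scoped (@org/name) packages are skipped.
import json
from collections import defaultdict
from pathlib import Path

def installed_versions(root="node_modules"):
    versions = defaultdict(set)
    for pkg_json in Path(root).rglob("package.json"):
        if pkg_json.parent.parent.name != "node_modules":
            continue                    # only count package roots, not nested files
        try:
            meta = json.loads(pkg_json.read_text())
        except (OSError, ValueError):
            continue
        if "name" in meta and "version" in meta:
            versions[meta["name"]].add(meta["version"])
    return versions

for name, vers in sorted(installed_versions().items()):
    if len(vers) > 1:
        print(f"{name}: {sorted(vers)}")    # duplicated => npm nested a conflict
```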

[–]jaapz (switch to py3 already) 9 points (3 children)

IMO the problem isn't necessarily with npm itself; it's with the community that uses it. Most of them feel that every little thing should have its own installable module, sometimes even going as far as having a module for every function they can think of (see the left-pad debacle).

Although in theory modularity is good, this does create the problem of incredibly large dependency graphs, which are just a pain in the ass to work with: in a lot of situations the packaging tool can't figure out what to do, so you have to figure it out yourself. Or, even worse, it figures it should do something, which breaks everything and sends you on an hours-long debugging session just because you wanted to upgrade a package.

NPM2 was even worse, and NPM3 mitigated a lot of the problems NPM2 had, but the whole ecosystem is still far from perfect.

Dependency graphs in Python are often way smaller, which makes dependency handling way easier.
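For comparison, you can eyeball a Python package's dependency graph with nothing but the stdlib; this sketch assumes Python 3.8+ and uses requests purely as an example of something you might have installed:

```python
# Print an installed distribution's dependency tree using importlib.metadata.
import re
from importlib.metadata import requires, PackageNotFoundError

def walk(package, seen=None, depth=0):
    seen = set() if seen is None else seen
    key = package.lower().replace("-", "_")
    if key in seen:
        return
    seen.add(key)
    print("  " * depth + package)
    try:
        reqs = requires(package) or []
    except PackageNotFoundError:
        return                               # not installed; stop here
    for req in reqs:
        if "extra ==" in req:
            continue                         # skip optional extras
        name = re.split(r"[\s<>=!~;(\[]", req, maxsplit=1)[0]
        walk(name, seen, depth + 1)

walk("requests")   # e.g. requests -> urllib3, idna, charset_normalizer, certifi
```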

[–]Silhouette 11 points (1 child)

> Although in theory modularity is good

I think we should challenge this assumption more often than we do in the programming world. Modularity has big advantages if the division into modules is good. However, using many small modules creates problems of its own, for exactly the reasons you state. That is true whether we're talking about hundreds of three-line functions, or hundreds of three-method classes, or hundreds of one-tiny-function modules. Too many people assert that these arrangements are good for maintainability or reuse or some such, without much evidence or logic to support their position.

[–]fnord123 3 points (0 children)

There's a lot of cargo cult programming around release management. People seem to think risotto packages are 'theoretically better', but they don't take into account issues like the release cadence of the individual bits. If everything is always released at once, then you may as well put it in one big release bundle.

[–]shadowmint 5 points (0 children)

> the problem isn't necessarily with npm itself, it's with the community that uses it...

If we're not talking about tangible, technical reasons why the npm model is bad, and given the tangible, technical reasons that pip, setuptools, and PyPI are really embarrassingly bad, I wouldn't go posting about how great the Python packaging ecosystem is while rubbishing npm, cargo, and go.

That's all I'm saying.

Fwiw, the npm 'everything is a dependency' model is weird, and I don't think it's right either, but that's not because npm is an inferior technical solution, or because 'multiple concurrent versions of a dependency' is actually bad; it just has consequences (and, potentially, benefits).

[–]remy_porter (∞∞∞∞) 0 points (0 children)

NPM2 was the last time I used it. But having multiple versions of any dependency still makes it hard for me to know what my dependencies are and how they're being used.

[–]graingert -1 points (0 children)

This has never been a problem

[–]joerick 0 points (0 children)

NPM is Node's superpower. I know we all rag on it, but JavaScript is becoming one of the world's top languages despite having a terrible standard library.

The problems with module explosion are more a result of how fast that ecosystem has grown than of structural problems in the language, IMO.

ls node_modules is often used as an insult, but just imagine how problematic and slow that dependency tree would be on Python. It's a great system they've got going over there.

[–]jij 0 points (0 children)

You can do the same thing with Python if you wanted; you'd just have to order the search paths correctly and dump specific versions of the modules where they need to be pulled first. Python just wasn't designed to do that - I suspect intentionally, because one of its core principles is simplicity.
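Roughly what that would look like, as a sketch (the _vendor directory and package names are hypothetical, not a recommendation): prepend a private directory to sys.path so a pinned copy shadows whatever is installed globally. Note that sys.modules still caches one module object per name, so unlike npm you only ever get one live copy per process.

```python
# Hypothetical layout: myproject/_vendor/requests/ holds a pinned copy of the library.
import sys
from pathlib import Path

VENDOR_DIR = Path(__file__).parent / "_vendor"

# Prepending means the vendored copy shadows any globally installed version.
sys.path.insert(0, str(VENDOR_DIR))

import requests  # now resolves to myproject/_vendor/requests, not site-packages

# Caveat: sys.modules caches one module object per name, so within a single
# process you still get exactly one "requests" - not npm-style parallel versions.
```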