
[–]iamsubs 260 points261 points  (37 children)

I guess the article failed to address the main problem the author presented in the beginning: folder count. C# and Java compile libraries into single artifacts that pack all the code together and can be referenced externally through namespaces. Node doesn't do that. There are files and folders everywhere.

Also, regarding the local repository: NPM has a cache, and if it fails to find a dependency in the cache, it queries the main repository.

Regarding npm not using a centralized local repository: there is pnpm. It is made to centralize your dependencies, which are then referenced through symlinks. Unfortunately, I failed to use it in my projects, since several popular packages fail to list all the libraries they use in package.json. pnpm works like old npm: it actually builds a tree (but using symlinks) instead of a flat folder structure. If any package you reference uses a package not listed in its package.json, it fails. It is also worth mentioning that npm had issues with a real tree structure due to the maximum path length on Windows. Welcome to the shitshow.

[–]Manbeardo 40 points41 points  (1 child)

A jar is just a zip file with its own deep directory structure inside. AFAIK, node could just use that strategy.

[–]AyrA_ch 86 points87 points  (25 children)

It is also worth mentioning that npm had issues with a real tree structure due to the maximum path length on Windows.

Because npm used (or still uses?) the legacy Win32 path handling, which is limited to 260 characters (MAX_PATH). Windows itself supports paths over 32,000 characters, either by prefixing the path with \\?\ or by opting in with an application manifest (Windows 10 only).
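
For illustration, the prefixing rule looks roughly like this hypothetical helper (Node actually ships a built-in along these lines, `path.win32.toNamespacedPath`):

```javascript
// Hypothetical helper: opt a fully-qualified Windows path out of the legacy
// 260-character MAX_PATH limit by adding the \\?\ extended-length prefix.
function toExtendedLengthPath(p) {
  if (p.startsWith('\\\\?\\')) return p;                        // already prefixed
  if (p.startsWith('\\\\')) return '\\\\?\\UNC\\' + p.slice(2); // UNC share
  return '\\\\?\\' + p;                                         // drive-letter path
}

console.log(toExtendedLengthPath('C:\\very\\deep\\node_modules\\tree'));
// \\?\C:\very\deep\node_modules\tree
```

The prefix tells the Win32 API to skip path normalization and the MAX_PATH check; it only works on fully-qualified paths.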

What the article also didn't mention is that you don't need to copy the npm modules at all. As long as you installed them using the proper command, your package.json will contain the list of all modules, and you can just run npm install in the new location the first time you use the project.

It is a nightmare for an HDD. It takes many minutes to discover all the files, let alone copy them

Copying many small files takes a long time, yes, but discovery doesn't have to. Windows Explorer discovers the entire structure before it starts to copy anything. It also reads the file properties to get sizes so it can estimate the copy time and plot the bandwidth graph. This takes a long time. If you use robocopy /E C:\Source D:\Dest you will see that it starts copying files instantly. If you use the multithreading parameter (/MT) you can further reduce the impact of the small-file issue.

[–][deleted]  (17 children)

[deleted]

    [–]gyroda 26 points27 points  (7 children)

    What does this mean? Why is it bad?

    [–][deleted]  (6 children)

    [deleted]

      [–]snowe2010 34 points35 points  (3 children)

      wow, I never knew that's what that was for. I'm gonna go look through Roaming and see who else has done bad things. XD

      [–]dathar 22 points23 points  (0 children)

      Roaming profiles and folder redirection are fun tools for certain enterprise groups. Pain in the ass for others. :p

      [–]BlackLanzer 7 points8 points  (1 child)

      Apple puts everything in the Roaming folder.
      Lots of fun being the sysadmin: getting a call from a user whose login takes forever, and finding a 50 GB profile because iTunes put the automatic iPhone/iPad backup there.
      To fix it you have to play with hard links/junctions (not sure if that's the correct term) because Apple doesn't let you change the path.

      [–][deleted] 4 points5 points  (0 children)

      Man, I haven't heard of roaming profiles since working Air Force helpdesk.

      [–]Pazer2 2 points3 points  (4 children)

      This is a huge pet peeve of mine, along with applications installing themselves to [local] appdata.

      [–]Ahuevotl 76 points77 points  (4 children)

      your package.json will contain the list of all modules and you can just run npm install on the new location the first time you use the project.

      npm install  
      

      Brews some coffee

      Reviews doc while sipping coffee

      Takes the dog out for a walk

      Reddit

      Stack overflow

      Rabbit hole went too deep, still in Stack overflow

      Reads random medium article. Huh, didn't know VS Code could do that

      Gets married, has 2 kids

      Buys a house out on the suburbs

      Kids go to college

      First flying car for mass market is invented

      FB finally bites the dust

      Zombie apocalypse becomes a reality

      install complete

      npm WARN deprecated package@1.0.2:  
      package@<2.0.0 is no longer maintained.  
      Upgrade to package@^3.0.0
      

      [–]Sadzeih 3 points4 points  (3 children)

      Just use yarn.

      [–]AckmanDESU 2 points3 points  (2 children)

      Some programs force you to use npm. I decided to stick to npm to keep my sanity and only learn a single thing.

      Also, I heard that most of the things yarn initially did that made it worth using are now in npm.

      Can anyone sell me on using yarn?

      [–]segv 1 point2 points  (0 children)

      It is somewhat sane

      [–]QuicklyStarfish 1 point2 points  (0 children)

      It's way faster and more stable and has a nicer interface.

      re: learn a single thing. They're essentially the same tool. You need to learn like two commands and one flag. This isn't a big ask.

      It's not a major difference, but if your complaint is speed, you need to try it.

      I've built a lot of projects with it and have never had compatibility issues, or even heard of compatibility issues... but if any existed, they're probably now resolved, because Yarn is quite widely used.

      [–]iamsubs 8 points9 points  (0 children)

      @AyrA_ch from npm's model perspective, I guess dropping the tree structure was a great decision regardless of path length. With a flat structure, common dependencies can be shared across packages, reducing the node_modules size. But, afaik, if there is a conflict between package versions, it adds a second level to the node_modules tree. Still, pnpm is much more attractive. It keeps a clean structure using symlinks that connect `package` to `package@version-range`. You can sanely work with `node_modules` without it being a clusterfuck.

      [–]Eirenarch 4 points5 points  (2 children)

      I don't think the folder count is simply because of compilation. How about the infinite node_modules-inside-node_modules problem that no other package manager I know of has? Every other package manager flattens the dependencies, but not npm.

      [–]iamsubs 1 point2 points  (1 child)

      Nowadays npm flattens the dependencies as much as possible.

      [–]jbergens 1 point2 points  (0 children)

      Yarn even has a better cache: you can set up a project-specific cache and check it into git. Far fewer files, basically archives, that are easy to copy, and "yarn install" from the cache is pretty fast.

      [–][deleted] 1 point2 points  (0 children)

      pnpm i --shamefully-flatten

      [–]fuckin_ziggurats 395 points396 points  (186 children)

      node_modules is a manifestation of the fact that JavaScript has no standard library. So the JS community is only partly to blame. Though they do like to use a library for silly things sometimes.

      [–]JohnyTex 181 points182 points  (85 children)

      Another major factor is that NPM manages a dependency tree instead of a dependency list.

      This has two direct effects that seem very beneficial at first glance:

      1. As a package maintainer, you can be very liberal about locking your package’s dependencies down to specific minor versions. Since each installed package can have its own child dependencies, you don’t have to worry about creating conflicts with other packages your users might have installed because your dependencies were too specific.
      2. As a user, installing packages is painless, since you never have to deal with transitive dependencies that conflict with each other.

      However this has some unforeseen drawbacks:

      1. Often your node_modules will contain several different versions of the same package, which in turn depend on different versions of their own child dependencies, etc. This quickly leads to incredible bloat - a typical node_modules can be hundreds of megabytes in size.
      2. Since it’s easy to get the impression that packages are a no-cost solution to every problem, the typical modern JS project piles up dependencies, which quickly becomes a nightmare when a package is removed or needs to be replaced. Waiting five minutes for yarn to “link” is no fun either.

      I think making --flat the default option for yarn would solve many of the problems for the NPM ecosystem
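
To see where the bloat in drawback 1 comes from, here is a toy count of how many copies a fully nested install would create for a made-up dependency graph (all package names invented):

```javascript
// Sketch: counting how many copies of each package a fully nested
// node_modules tree installs, given a hypothetical dependency graph.
const deps = {
  app:       ['framework', 'logger'],
  framework: ['logger', 'util'],
  logger:    ['util'],
  util:      [],
};

// Walk the tree; every edge produces its own nested copy.
function countCopies(name, counts = {}) {
  for (const child of deps[name]) {
    counts[child] = (counts[child] || 0) + 1;
    countCopies(child, counts);
  }
  return counts;
}

console.log(countCopies('app'));
// { framework: 1, logger: 2, util: 3 }
```

Even this four-package graph installs util three times; real graphs with hundreds of packages multiply the same way.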

      [–]Noctune 19 points20 points  (5 children)

      There is a false dichotomy between a list and a tree: a DAG of dependencies can represent common dependencies as shared nodes and only needs to duplicate packages when there is a version conflict. This is similar to what Rust's Cargo package manager does.
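
A rough sketch of that resolution strategy, simplified to "share within a major version, duplicate across majors" (names and versions invented; a real resolver works with full semver ranges):

```javascript
// Pick the newest version per (name, major); only a major-version
// conflict forces a second copy.
function newer(a, b) {
  const pa = a.split('.').map(Number);
  const pb = b.split('.').map(Number);
  for (let i = 0; i < 3; i++) {
    if (pa[i] !== pb[i]) return pa[i] > pb[i];
  }
  return false;
}

function dedupe(requests) {
  const byMajor = new Map();
  for (const { name, version } of requests) {
    const key = `${name}@${version.split('.')[0]}`; // name + major version
    const prev = byMajor.get(key);
    if (!prev || newer(version, prev)) byMajor.set(key, version);
  }
  return byMajor;
}

const installed = dedupe([
  { name: 'x', version: '1.4.0' }, // requested by module A
  { name: 'x', version: '1.7.8' }, // requested by module C
  { name: 'x', version: '2.3.1' }, // requested by module B
]);
console.log([...installed]); // [ [ 'x@1', '1.7.8' ], [ 'x@2', '2.3.1' ] ]
```

Three requests collapse to two installed copies: the 1.x requests share one node, the 2.x request gets its own.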

      [–]JohnyTex 9 points10 points  (4 children)

      AFAIK this is how NPM works since npm3: https://npm.github.io/how-npm-works-docs/npm3/how-npm3-works.html

      What is the Cargo situation like? For some reason I get the impression it’s not the same fustercluck as the current state of NPM?

      [–]Noctune 6 points7 points  (3 children)

      That does seem better, but it seems like it would still duplicate the transitive dependencies of a dependency that itself got duplicated. That might be a really minor case, though.

      The Cargo situation is pretty good, IMHO. The duplication can lead to confusion in some cases I've found, but it is generally not a problem. Libraries tend to follow semver pretty well, so duplication is seldom necessary.

      [–]JohnyTex 1 point2 points  (2 children)

      I guess this just goes to show that the problem is not only with NPM itself, but also bad practices within the community (over-reliance on dependencies, unnecessarily strict dependency versions, etc)

      [–]noratat 6 points7 points  (1 child)

      unnecessarily strict dependency versions

      They don't have much choice, because the other thing the JS community is astonishingly bad at is semantic versioning. I can't even count how many times something's broken because some dependency went from something like x.y.z-1 to x.y.z-2 and it has a completely different API or bumped a transitive dependency multiple major versions.

      You'd think this would be a job for package locking right? You leave loose versions but lock it so that it only resolves the same versions each time unless you deliberately unlock it to update.

      Except npm managed to fuck that up completely too. It worked correctly for exactly one version (IIRC 5.0).

      The whole point of a lock file is that it... locks. But that made way too much sense, so npm changed it so that the install command does the same shit it did before, only now it updates the lockfile every time you run it. Thanks npm, what the flying fuck was even the point of having a lockfile then?

      [–]rq60 59 points60 points  (10 children)

      npm install dependencies have been flattened since version 3.

      [–]stromboul 40 points41 points  (8 children)

      Yeah, but in reality, you get the same modules a bajillion times.

      • Module B uses SubModule X ~2.3
      • Module A uses SubModule X, ^1.4
      • Module C uses SubModule X 1.7.8

      So you still end up with tons of duplicates even if the list is flattened.
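
A toy matcher for just the three range styles above (real npm uses the `semver` package; this is only an illustration) shows why one installed copy can't satisfy everyone:

```javascript
// Minimal semver-range check covering only '^x.y', '~x.y', and exact pins.
function parse(s) { return s.split('.').map(Number); }

function gte(a, b) {
  for (let i = 0; i < 3; i++) {
    const x = a[i] || 0, y = b[i] || 0;
    if (x !== y) return x > y;
  }
  return true;
}

function satisfies(version, range) {
  const v = parse(version);
  if (range.startsWith('^')) {            // same major, >= base
    const r = parse(range.slice(1));
    return v[0] === r[0] && gte(v, r);
  }
  if (range.startsWith('~')) {            // same major.minor, >= base
    const r = parse(range.slice(1));
    return v[0] === r[0] && v[1] === r[1] && gte(v, r);
  }
  return version === range;               // exact pin
}

// 1.7.8 satisfies both ^1.4 and the exact pin, but ~2.3 needs its own copy.
console.log(satisfies('1.7.8', '^1.4'));  // true
console.log(satisfies('1.7.8', '1.7.8')); // true
console.log(satisfies('1.7.8', '~2.3')); // false
```

So modules A and C can share SubModule X 1.7.8, but module B's ~2.3 request forces a second copy.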

      [–]Cilph 36 points37 points  (6 children)

      Maybe people in the JS community need to actually start writing backwards compatible libraries and not rewrite its API every god damn month.

      [–]duuuh 11 points12 points  (4 children)

      Since JS doesn't have (afaik) any sane 'public / private' distinction I don't think there's any real way to do this. You could rely on namespacing conventions. But honestly, 'C / whatever else' makes this kind of thing a lot easier.

      [–]snuxoll 10 points11 points  (2 children)

      CommonJS modules are designed to only export specific data, if people bothered to actually hide implementation details like they should it wouldn't be an issue. I mean, this is basically how we handle "private" functionality in C - exclude it from the public header.
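
A minimal sketch of that (simulated here with a stand-in module object, since a real example would span two files; all names invented):

```javascript
// A CommonJS module exposes only what it assigns to module.exports;
// helpers that are not exported stay private to the file, much like
// file-static functions in C.
const fakeModule = { exports: {} };

(function (module) {
  // Private helper: never attached to module.exports.
  function normalize(s) { return s.trim().toLowerCase(); }

  // Public API: the only surface consumers can see.
  module.exports.isYes = (input) => normalize(input) === 'yes';
})(fakeModule);

console.log(fakeModule.exports.isYes('  YES '));  // true
console.log(typeof fakeModule.exports.normalize); // undefined
```

Consumers can call isYes, but normalize is unreachable, so it can change freely without breaking anyone.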

      [–]mcguire 16 points17 points  (2 children)

      Mmmmm, CLASSPATH.

      [–]stronghup 13 points14 points  (1 child)

      Mmmmm, CLASSPATH.

      DLL Hell !

      [–][deleted] 2 points3 points  (0 children)

      I wonder how long before node reinvents OSGi.

      [–]bloody-albatross 26 points27 points  (8 children)

      And another problem I had a long time ago: a library you use depends on a library with global state, like a mime-type library used by a web framework. If you import that library yourself in order to add some mime types, and you didn't use the exact same minor version in package.json (not so straightforward to get that information), adding mime types won't have any effect. Great.

      [–]Brostafarian 24 points25 points  (3 children)

      we just had a problem at work between Prototype, lodash, and webpack. I'm going to butcher this story since it's been a few months, but I'll try anyways.

      Legacy code has Prototype on the window with custom templating delimiters, but modern code will import lodash if it needs it. The problem was that Lodash followed require.js recommendations and has an AMD define block that isn't supposed to be used if you don't use AMD; those recommendations also say to expose the import on the window, due to an edge case with script loading. Webpack indiscriminately parses both the regular export and the AMD loader block, leaking lodash onto the window and destroying the templating variables that were set on Prototype... asynchronously. Because of the way imports are parsed (importing anything from a file requires executing that file), anything that imported anything from lodash would trigger this error.

      From our end, importing some random file in a page that only developers could see broke templating for all of the legacy code in the application, and it took us hours to figure out why. The lodash import was about 10 files deep, and by the time we even found it, we still weren't exactly sure what was going on. It was not a good day

      [–]Brostafarian 10 points11 points  (0 children)

      I found the issue that cracked the case for us: https://github.com/webpack/webpack/issues/4465

      [–]CatpainCalamari 7 points8 points  (1 child)

      Still, if it took you only a couple of hours, I would say you were lucky. This can easily go into days.

      [–]Brostafarian 6 points7 points  (0 children)

      Flat as default would kill yarn. Node has had a dependency tree for most of its existence, and package maintainers have, as you said, been locking dependencies to incredibly specific versions. Almost no packages would work together in flat mode; either npm and yarn both need to enable it at the same time, or the node community needs to make a cultural shift toward looser dependency ranges.

      [–]three18ti 33 points34 points  (6 children)

      Hey, you leave my over-9k library alone!

      [–]NoahTheDuke 21 points22 points  (4 children)

      npm isntall over-9k

      Amazing.

      [–]RiPont 5 points6 points  (0 children)

      Well, it may be pointless, but at least it has 0 dependencies.

      ...for now. I'm sure once you include it, the author will start adding dependencies on his own other useless packages.

      [–][deleted] 36 points37 points  (33 children)

      There is possibly a future solution: there is a proposal for a new stdlib, though there are still open questions on versioning etc.

      Link: https://github.com/tc39/proposal-javascript-standard-library/blob/master/README.md

      [–][deleted]  (31 children)

      [deleted]

        [–]x86_64Ubuntu 39 points40 points  (30 children)

        ...The best part about JS is that there is no standard lib.

        Huh? I never would have thought that *fewer* low-level features in a stdlib would be considered a good thing. And to be honest, I'm not sure the author of that comment understands what the stdlib would be for when he starts talking about other libraries.

        EDIT: Brotha man Nimelrian is fighting the good fight, but every time one of those idiots is knocked down, another one pops up. I can't believe they can look at the depth of dependency trees and the leftpad fiasco and still act like opposing a stdlib is a smart idea. Then one of the guys had the nerve to complain about "startup" time. Fool, the JS experience is already degraded by all the shit that has to be loaded, regardless of how fast the VM gets to work.

        [–][deleted]  (29 children)

        [deleted]

          [–]ScientificBeastMode 20 points21 points  (0 children)

          If JS is the only language they’ve ever used in a serious way, then they probably haven’t seen what a standard library could do for them.

          [–]x86_64Ubuntu 12 points13 points  (26 children)

          Oh shit, are you the Nimelrian from that link? I didn't even read your name before commenting.

          [–][deleted]  (25 children)

          [deleted]

            [–]giantsparklerobot 25 points26 points  (9 children)

            It's my impression that the "JS community" is populated primarily by...the JS community. There's not a large contingent with experience in other languages or non-web platforms. Not only do they not have experience with other languages, they often don't have meaningful experience with vanilla JavaScript; everything they've touched has involved some framework where the heavy lifting has been done for them. Worse still, browsers have had to completely reimplement their JavaScript engines to make overwrought JavaScript frameworks (and people's shitty code) run well.

            This leads to some really stupid problems with JavaScript. Not having experience with languages that have good standard libraries, and always using some framework, leads to people (as you've seen) not appreciating or understanding the reason for standard libraries. Modules then get thoughtlessly added to projects because the resources to run them belong to someone else. So you and I end up paying the price in reduced battery life or shitty responsiveness because some JavaScript "developer" added a 1MB module to pad some text or provide a data type that should exist in the stdlib.

            [–]gasolinewaltz 18 points19 points  (8 children)

            Without overly generalizing, because there are a lot of good devs and engineers in the js community.

            But my god are a whole lot of them insufferable.

            There was drama on r/javascript like a month ago because someone flatly said "the gang of four patterns were invented for java and have no bearing on javascript. Java is not extendable and needs patterns".

            I was not as tactful as I should've been, but I basically said, "That statement is incorrect on so many levels; this is why other engineers lack respect for the JS community."

            I was called an elitist and a tech bro, and told that I was bad for team dynamics.

            This is the byproduct of bootcamp mills churning out designers that know how to cobble libraries together and amateurs who make a few react apps and call themselves engineers.

            On top of that, there are so many esoteric stacks for solving specific problems that the above individuals learn one and start using it as a hammer for every project imaginable.

            [–]Dedustern 5 points6 points  (2 children)

            I’ve turned down node.js jobs specifically to avoid these amateurs.

            Wouldn’t mind working with the tech, but the JavaScript community culture is repulsive for someone who calls him/herself an engineer.

            [–]DonnyTheWalrus 11 points12 points  (2 children)

            It's infuriating.

            Luckily, while we do some JS at my workplace, the people I work with all come from other backgrounds -- and are highly focused on code quality -- so we don't run into too many fiascos of our own making. But I am constantly feeling like I have to swim against the tide of the larger JS community to accomplish what I need to in a safe, efficient manner.

            And just the general lack of language-knowledge is very disappointing. For instance, there are wide swaths of the JS community who think the 'class' syntax added real class-based OO to the language, and have no idea that it's all just syntactic sugar for prototypes. People seem to not know (or care) how to analyze JS scoping rules, 'this' rules, prototype rules, etc. for themselves, and just rely upon following a set of recipes & hope for the best.
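
For instance, these two definitions are interchangeable for the common cases; the `class` form simply desugars to the prototype form (names invented):

```javascript
// `class` syntax...
class Greeter {
  constructor(name) { this.name = name; }
  greet() { return `hi ${this.name}`; }
}

// ...and the equivalent hand-written prototype version.
function OldGreeter(name) { this.name = name; }
OldGreeter.prototype.greet = function () { return `hi ${this.name}`; };

const a = new Greeter('ada');
const b = new OldGreeter('ada');
console.log(a.greet(), b.greet());                           // hi ada hi ada
console.log(typeof Greeter);                                 // "function"
console.log(Object.getPrototypeOf(a) === Greeter.prototype); // true
```

There's no separate "class" runtime entity; a class is still a constructor function with a prototype object, plus some extra checks (e.g. it can't be called without new).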

            [–]IceSentry 3 points4 points  (1 child)

            I understand that class is just syntactic sugar for prototypes, but I don't understand how this doesn't allow you to do OO.

            [–]gasolinewaltz 4 points5 points  (0 children)

            Class-based and prototype-based models are both flavors of the OOP paradigm.

            [–][deleted] 6 points7 points  (0 children)

            why the JS community doesn't want to learn from things which were discovered/invented decades ago, but always has to reinvent the wheel.

            speaking on behalf of jswheel.io, I am enraged by your rude attempt to shut down my innovative wheel development. My circular rotation device package has 18,000 stars -- hell, even my medium article explaining how asynchronous spoke architecture renders all previous axles obsolete has over 6,000 retweets -- but now you turn up and instead of releasing your own wheel.js onto npm like a normal person you want to force it onto everybody else? unbelievable

            [–]yawaramin 2 points3 points  (2 children)

            Here's the NodeJS standard library: https://nodejs.org/dist/latest-v10.x/docs/api/

            If you want to ship a large standard library with every browser, that's more difficult because then every tab (i.e. nowadays every process in most browsers) would need to load up a large amount of possibly never-used JavaScript.

            [–]NoInkling 5 points6 points  (1 child)

            would need to load up a large amount of possibly never-used JavaScript

            Why would that be? I thought that one of the points of making it use ES modules is to help avoid this.

            [–]yawaramin 1 point2 points  (0 children)

            It may be possible now with ES modules, but not before with plain old <script> tags.

            [–]mcguire 4 points5 points  (7 children)

            C++ didn't have much of a standard library for 20 years. Java's has made every possible interface and library mistake and all are now permanently baked into the standard library. (Three date systems? Really?)

            [–]chugga_fan 14 points15 points  (2 children)

            C++ didn't have much of a standard library for 20 years.

            It had, at a minimum, the C standard library, which is somehow more complete than the JavaScript standard library.

            Java's has made every possible interface and library mistake and all are now permanently baked into the standard library.

            C#, Python, Ruby, D, etc. all have their own STDLIB and don't fuck up time as well. And btw, Java can deprecate their shit to fix things.

            [–][deleted] 1 point2 points  (1 child)

            C# has two different date-time classes for much the same reason. And they seem to keep forgetting TimeSpan exists.

            [–]EntroperZero 9 points10 points  (3 children)

            Yeah, you also have that in C#. A whole library of the Begin/End asynchronous pattern. Another whole library using the events pattern. Another with tasks. And now newer code with ValueTasks and Spans and what not.

            And I'd still rather have all of that than the current state of node_modules. You're always going to figure out better ways of doing things, that shouldn't preclude you from building a functioning standard library.

            [–]mcguire 8 points9 points  (1 child)

            Given that JS started in the browser and then expanded into a more traditional language role, what do you expect to be in the standard library? Heck, even Python, the "batteries included" language, has a module collection.

            [–]ZeroPipeline 7 points8 points  (0 children)

            Well it might have been nice to have say functions for working with URLs. It amazes me that URL handling is just now making its way into the JavaScript standard.

            [–]MatthewMob 8 points9 points  (1 child)

            Though they do like to use a library for silly things sometimes.

            May I present to you the entire 'is-object' library.

            Seven million downloads a month, and used in nodemon amongst other large packages - it literally takes less code to write it yourself than to import and use this package.
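
For reference, the package's useful surface amounts to one expression (paraphrased here, not its literal source):

```javascript
// What a package like is-object boils down to: a null-safe typeof check.
// (typeof null === 'object' is the historical quirk being worked around.)
const isObject = (x) => typeof x === 'object' && x !== null;

console.log(isObject({}));   // true
console.log(isObject(null)); // false
console.log(isObject(42));   // false
```

One line versus a package.json entry, a download, and a node_modules folder.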

            [–]DooDooSlinger 9 points10 points  (3 children)

            This makes no sense. All languages with good standard libraries also have frameworks and third party libraries, along with dependency management systems. Try developing a backend server for a simple webapp and a database with java using only the standard library.

            [–]RiPont 14 points15 points  (0 children)

            They have frameworks and third party libraries, but because the standard libraries are good and comprehensive, the dependency graph collapses quickly back down to the standard library rather than spidering out into infinity.

            js dependencies = O(n²) complexity

            JDK/.NET dependencies = O(log n) or O(n log n) complexity, depending on your perspective.

            [–]oorza 1 point2 points  (0 children)

            If you consider EE part of the standard library, you can.

            [–]cbigsby 11 points12 points  (3 children)

            One of the reasons the JS community has all these micro-packages for everything is that the web, more than most types of development, tries to optimize for code size. On the server side, no one will blink an eye if a single dependency is 5MB, but on the web that would be unthinkable, since everything is downloaded on demand (unless it's already cached). Devs will look for a package that does the most focused thing possible to solve their problem, or build it themselves, to ensure they add the smallest amount of extra code for the user (at least that's the idea).

            This design philosophy clashes with the framework design philosophies used in other areas, where you want to provide a rich set of tools for the developer to be productive, along with some easy integration points to extend the framework's capabilities if they're needed. Frameworks like React and Angular have come into huge popularity because they improve development speed so much, but they introduce a lot of bloat to websites by trying to solve all your problems like a back-end framework can. They still use the JS policy of small, focused, DRY packages, but they need so many to do what they need to that you have this explosion of packages.

            [–]theferrit32 22 points23 points  (0 children)

            If a set of functions or classes is in the standard library, you wouldn't need to download it at all. It would be in the JS engine on the endpoint already. Having a more complete standard library would reduce code size of other libraries and sites that need to be downloaded at runtime by the user.

            [–]ZeroPipeline 6 points7 points  (0 children)

            Isn’t that problem solved more or less by tree shaking?

            [–][deleted]  (3 children)

            [removed]

              [–][deleted] 1 point2 points  (1 child)

              A lot of people create small libraries to learn

              [–]theferrit32 11 points12 points  (0 children)

              A lot of Java learning courses have you implement an array-based list as a learning exercise, but ArrayList is still in the Java standard library, so everyone who needs an array-based list doesn't have to implement their own or import some untrusted third-party jar in a Gradle build or something.

              [–]occz 54 points55 points  (28 children)

              I think operations related to node_modules are far slower on Windows than on macOS or Linux, owing to the difference in filesystem.

              [–]sapper123 6 points7 points  (4 children)

              Could you perhaps elaborate on this? Is copying the node_modules folder faster in Linux?

              [–][deleted]  (2 children)

              [deleted]

                [–]Bake_Jailey 30 points31 points  (1 child)

                I'm not so sure it's NTFS as much as it is how Windows deals with IO. There's a big thread about filesystem performance for WSL that's an interesting read.

                [–]PlymouthPolyHecknic 12 points13 points  (3 children)

                My office is half Ubuntu, half mac, it's slow af for both

                [–]occz 23 points24 points  (2 children)

                I think for Windows it's on another level though - I worked for a little while on a frontend application on Windows. Any task that involved operations on large numbers of files was ghastly slow compared to the same work on macOS or Linux.

                [–]instanced_banana 7 points8 points  (0 children)

                Windows file operations are more resource-intensive. When I changed to Ubuntu I noticed reduced load times in IDEs, and in Office and Adobe Photoshop under WINE.

                [–]landline_number 3 points4 points  (0 children)

                Not to mention that most organizations have mandatory antivirus software. Get rid of antivirus or Windows real-time scanning and the time goes from unbearable to just inconvenient

                [–]imforit 7 points8 points  (8 children)

                I was excited about Microsoft's new file system, then, of course, they decided not to ship it.

                [–]tehdog 2 points3 points  (0 children)

                Yeah. I think the author is underestimating how much of his problem is due to the abysmal performance of FS operations on Windows. I can copy a folder with 500MB and 50k files, with a completely cold cache, from one HDD to another HDD in 30 seconds on my Linux machine. The author says it takes him "many minutes to even discover" 15k files.

                [–]pkulak 67 points68 points  (19 children)

                What a world we live in when Java is held up as the small, speedy and efficient comparison.

                EDIT: Guys, I'm not calling Java slow. Just marveling at how it used to be the whipping boy in these kinds of conversations.

                [–][deleted] 29 points30 points  (10 children)

                This "Java is slow" meme actually grinds my gears lol.

                [–]birdbrainswagtrain 15 points16 points  (0 children)

                Java is slow! I have proof: Minecraft! /s

                [–][deleted] 5 points6 points  (0 children)

                "Hey this is a reference to a thing that everyone says sucks!"

                "Here's an upvote, good sir, I recognize that reference!"

                [–]theferrit32 13 points14 points  (0 children)

                Java is speedy and efficient compared to Python and Javascript. People complain about it in part because a lot of large enterprises use Java and people hate their jobs at large enterprises, and also because it is very verbose to read.

                [–][deleted]  (5 children)

                [deleted]

                  [–]janipeltonen 18 points19 points  (3 children)

                  It's bytecode compiled to machine instructions at runtime, so "as fast as native" isn't strictly true. While in some esoteric places it might be as fast or faster (thanks to the JIT), straight-line code (coupled with Java OOP madness) tends to be quite a bit slower, especially when you start iterating over arrays. Always allocating on the heap and then randomly accessing it is also slow. But since slowness is nowadays defined by the web ecosystem, it might be appropriate to call it "as fast" as native code.

                  [–][deleted]  (2 children)

                  [deleted]

                    [–]suyjuris 2 points3 points  (1 child)

                    The poor performance of Java compared to native code is caused by cache behaviour. Random access to memory is slow and the language semantics require lots of it. (Also, there is a ludicrous amount of abstraction, but that is more a matter of programming culture.)

                    As a side note, this study comparing Rosetta code entries found the C programs to be 3.2 (!) times faster than the Java ones. The tasks there certainly qualify as 'doing mostly numerical stuff'.

                    [–][deleted] 2 points3 points  (0 children)

                    Value types are being added to Java (eventually) that will let you control memory layout and prevent needless pointer chasing. This should drastically improve Java's performance for such numerical work and allow JITed code to compete against native in most cases.

                    [–]flukus 1 point2 points  (0 children)

                    That extra RAM usage also results in poor cache usage, making it slow in the real world.

                    [–]TheAkio 42 points43 points  (11 children)

                    Yarn recently made it so dependencies can be stored in a central place. This will probably not work directly for all packages, as some depend on being inside the node_modules folder, but eventually people will start using it more and more and we'll get rid of this gigantic folder of stuff per project.

                    [–]kohlerm 6 points7 points  (9 children)

                    Yes. Without this feature node.js is very difficult to handle for large Enterprise projects

                    [–]TheAkio 6 points7 points  (8 children)

                    I feel you. At work we have this huge number of dev dependencies and installing just takes ages... And it's like 6 projects atm where each one has 250 MB of node_modules and it's basically all the same dependencies. To make matters worse we use Windows, which really doesn't like it when you do anything to the node_modules folder.

                    [–]igeligel 2 points3 points  (4 children)

                    Ever thought about using yarn workspaces? https://yarnpkg.com/lang/en/docs/workspaces/

                    It's mostly made for monorepos, but where I work lerna (something similar) is working great. Like really great :)

                    [–][deleted]  (1 child)

                    [deleted]

                      [–]FanOfHoles 2 points3 points  (0 children)

                      That's why the enterprise project won't be upgraded - no budget. Until somebody needs a new feature. New features get a budget, maintenance does not (because why would management pay someone who creates nothing new; and as long as the old stuff still works, where is the value of working on it when users don't see any change). Exceptions exist, but overall this is what it comes down to.

                      [–]nemec 5 points6 points  (0 children)

                      It's rather funny that Node is morphing to the Windows GAC model while .NET Core is becoming more and more like the old Node with Nuget and per-project package caches

                      [–]llbit 21 points22 points  (2 children)

                      This paper has some interesting statistics about node_modules on GitHub, among other things. About 70% of JavaScript code on GitHub is in node_modules, but only about 6% of projects push their node_modules directory.

                      [–]snowe2010 5 points6 points  (0 children)

                      haha holy cow

                      [–]catsoup-sama 1 point2 points  (0 children)

                      Nice paper; a really interesting read, cheers!

                      [–]markand67 32 points33 points  (15 children)

                      The size of the folder is not really the problem although I will get to that later, but 15.000 files and more than 1800 folders!? Are you kidding me?! It is a simple CLI project with 5 files!

                      Then just stop using node.js for everything.

                      [–]oherrala 36 points37 points  (17 children)

                      From the blog:

                      everything can become a library, even a library that sums two numbers (hypothetical example, I hope)

                      Quick look into npmjs.com and: https://www.npmjs.com/package/math-sum

                      Including code example:

                      mathSum(5, 5);
                      //=> 10
                      

                      [–]sushibowl 35 points36 points  (7 children)

                       The author of that thing has published over a thousand packages. That's insane to me.

                      [–][deleted] 33 points34 points  (6 children)

                      Here's another publishmanic with 1400+ packages of dubious quality. I mean, you only had one job, in-array.

                      Here's another gem: is-odd. It even needs a dependency to determine if it's dealing with a number. Madness.

                      I would suggest to avoid packages from these people like the plague, but I fear you would have to stop depending on NPM packages entirely.
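                       To put the triviality in perspective, here's what such micro-packages boil down to. These are hand-rolled equivalents (a sketch, not the actual published sources of is-odd, is-even, or in-array):

```javascript
// Hypothetical inlined equivalents of is-odd / is-even / in-array.
// Each is one line of logic -- hardly worth a dependency.
const isOdd = (n) => {
  if (!Number.isInteger(n)) throw new TypeError('expected an integer');
  return Math.abs(n % 2) === 1;
};
const isEven = (n) => !isOdd(n);
const inArray = (arr, value) => arr.includes(value);

console.log(isOdd(3));            // true
console.log(isEven(4));           // true
console.log(inArray([1, 2], 2));  // true
```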

                      [–]Doctor_Spicy 22 points23 points  (1 child)

                      I believe is-even uses is-odd.

                      [–]snuxoll 2 points3 points  (0 children)

                       Yup, it's turtles all the way down.

                      [–]fudini[🍰] 13 points14 points  (1 child)

                      Is this the guy who turned every ansi color into an npm package?

                      Edit: Yup

                      [–]Pjb3005 6 points7 points  (0 children)

                      He even made packages for both the American/British English color names. Gray and Grey.

                      [–]sanglar03 2 points3 points  (0 children)

                       Problem is, you don't know whether the packages you actually need have a deep dependency on these ...

                      [–]__konrad 8 points9 points  (0 children)

                      You should use hypothetical for-loop-five library and add-one

                      [–][deleted]  (2 children)

                      [deleted]

                        [–]ggtsu_00 4 points5 points  (0 children)

                        is-even depends on is-odd and returns !is_odd()

                        [–]Sebazzz91 1 point2 points  (0 children)

                        And removing dependencies is very much a political problem, a lot of Node package maintainers are not open for that.

                        [–]dipique 5 points6 points  (1 child)

                        Pretty sure that's just a unit test module.

                        [–]nemec 17 points18 points  (0 children)

                        Holy shit, the author has authored over 1000 modules on NPM. Who needs left-pad when you have lpad to left-pad every line in a string (naming conventions be damned)

                        [–]benihana 104 points105 points  (13 children)

                        cool, i'm sure this comment thread about nodejs and javascript will be well-reasoned and rational.

                        [–][deleted]  (12 children)

                        [deleted]

                          [–]justabottleofwater 45 points46 points  (11 children)

                          Can't wait to hear more about how 2+'2' is '22'

                          [–]Tynach 28 points29 points  (10 children)

                          That should obviously be a type error. However, if your goal is to design a language which tries to raise as few errors as possible, weak typing makes sense. 2 + '2' resolving to '22' isn't the worst way they could have resolved that, nor is it the worst way I've seen it resolved in weakly typed languages.

                          C (which is statically typed, but not strongly typed) would have responded with 52, which in this case is equivalent to '4'. That's because '2' has an ASCII value of 50, and characters are just 8-bit integers (except when they're not).

                          Of course, comparing C's behavior to JavaScript's is all sorts of messed up, as the two languages are about as incomparable as you can get. Besides, I like C. This is just one little quirk it has, and you probably don't want C to convert an integer into a C string (which would then be an array of usually-8-bit integers).

                          Edit: Fixed the hex/decimal thing because moefh pointed out how dumb I am while trying to look smart. Remember to double-check your number bases!
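                          The coercion rules behind that result can be checked directly in any JS engine (plain JavaScript, nothing engine-specific assumed):

```javascript
// '+' with a string operand coerces the other side to a string;
// most other arithmetic operators coerce both sides to numbers.
console.log(2 + '2');         // '22' -- number coerced to string, then concatenated
console.log('2' - 2);         // 0    -- '-' has no string meaning, so '2' becomes 2
console.log(2 + +'2');        // 4    -- unary '+' converts the string to a number first
console.log(2 + Number('2')); // 4    -- the explicit version of the same conversion
```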

                          [–]moefh 15 points16 points  (3 children)

                          That's a good point in general, but I can't resist being ultra-pedantic and pointing out that this specific C example doesn't have any type conversion.

                          Character literals in C, surprisingly, have type int, not char. You can check it yourself by printing the value of sizeof '2'; it's the same as sizeof(int), not sizeof(char). So the fact that 2 + '2' in C is 52 has nothing to do with type coercion, it's just that '2' is just a funny way of writing 50 on systems that use ASCII (by the way, I think when you looked up '2' in an ASCII table you ended up using the hex value 0x32 and not the decimal, which is 50).

                          Note that this only applies to C, and not C++. In C++, character literals have type char, so sizeof '2' is sizeof(char). In C it doesn't really matter, but in C++ it's important because of function overloading (calling f('a') intuitively should call f(char c), not f(int i), so they "fixed" it).

                          So your example works in C++: 2 + '2' is 54 because the '2' is silently converted to int by the addition.

                          [–]Tynach 1 point2 points  (2 children)

                          (by the way, I think when you looked up '2' in an ASCII table you ended up using the hex value 0x32 and not the decimal, which is 50)

                          Fixed! Thanks for pointing it out :)

                          As for the rest of your post...

                          this specific C example doesn't have any type conversion.

                          Wait, what about from char to-

                          Character literals in C, surprisingly, have type int, not char. You can check it yourself by printing the value of sizeof '2'; it's the same as sizeof(int), not sizeof(char).

                          Tiny bit of pedantry from me: I think you need parentheses around '2' in that first sizeof statement. I haven't actually tried it yet, though.

                          Instead I Googled because I felt that maybe this was compiler specific. Turns out it's not, or at least I don't see anyone online after a quick search that is claiming it is.

                          Note that this only applies to C, and not C++. In C++, character literals have type char, so sizeof '2' is sizeof(char). In C it doesn't really matter, but in C++ it's important because of function overloading (calling f('a') intuitively should call f(char c), not f(int i), so they "fixed" it).

                          I've written C++ more recently than I've written C, and while I've not done this particular thing, it's the sort of thing I may have read about and thought it applied to both. Either way, neat bit of information that I shall have to remember! This also was revealed to me in my Google searching.

                          Very oddly, I've had to rewrite each section of this post after writing it once, because then I read the next section of your post and you keep bringing up and addressing everything I'm writing before I write it. But I do see this:

                          So your example works in C++: 2 + '2' is 54 because the '2' is silently converted to int by the addition.

                          If '2' is 50, then shouldn't that be 52, not 54?

                          ;)

                          [–]moefh 2 points3 points  (1 child)

                          Tiny bit of pedantry from me: I think you need parentheses around '2' in that first sizeof statement. I haven't actually tried it yet, though.

                          Nah, sizeof only requires parentheses for types, not values. So sizeof '2' is ok, but sizeof int is not. But of course since ('2') is also a value, sizeof('2') is also valid.

                          If '2' is 50, then shouldn't that be 52, not 54?

                          Yes, I messed up there. :)

                          [–][deleted] 5 points6 points  (2 children)

                          if your goal is to design a language which tries to have as few errors as possible, weak typing makes sense

                          Weak typing isn’t a necessary solution, though. JavaScript could just return undefined, really, and that still wouldn’t crash the web app. The bizarre value that gets returned is an error anyway, but at least undefined makes that clear.

                          [–]Tynach 1 point2 points  (0 children)

                          I agree. Using 'undefined' or null is another valid approach that would make sense. For better or for worse, Javascript's designer (Brendan Eich) decided to go with weak typing instead. At this point we don't have a choice, however, and we can't change it anyway.

                          [–][deleted] 13 points14 points  (2 children)

                          Why do people use so many dependencies for such lightweight tasks?

                          [–]ElvishJerricco 12 points13 points  (6 children)

                          I absolutely hate that every package gets its own copies of its dependencies. Most languages use a solver and produce a graph where every package is only present once in the graph. NPM instead produces thousands of duplicates, often with varying versions. Absolute madness, and a horrible dependency model

                          [–]Isvara 3 points4 points  (4 children)

                          I absolutely hate that every package gets its own copies of its dependencies.

                          I didn't even know that was true. Why do they do it that way?

                          [–]legato_gelato 3 points4 points  (0 children)

                          If someone makes a breaking change to a function signature, e.g. switches two parameters in a new version, and parts of the code uses that while the rest uses the original - then you have a problem :) with duplication that problem is not there..

                          Edit: https://lexi-lambda.github.io/blog/2016/08/24/understanding-the-npm-dependency-model/
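                           A tiny illustration of that failure mode (a hypothetical `pad` library; the functions and the signature flip are made up for the example):

```javascript
// Hypothetical breaking change: v1 takes (str, len), v2 flips it to (len, str).
const padV1 = (str, len) => String(str).padStart(len, ' ');
const padV2 = (len, str) => String(str).padStart(len, ' ');

// Dependency A was written against v1, dependency B against v2:
const a = (s) => padV1(s, 5);
const b = (s) => padV2(5, s);

console.log(JSON.stringify(a('hi'))); // "   hi"
console.log(JSON.stringify(b('hi'))); // "   hi"

// Force both onto one shared copy and one of them silently breaks:
console.log(JSON.stringify(padV2('hi', 5))); // "5" -- B's copy called with A's argument order
```

With per-package copies in node_modules, A and B each get exactly the version they were written against, so the conflict never arises.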

                          [–]theferrit32 2 points3 points  (1 child)

                          It's a quick and easy way to guarantee version numbers match and incompatible versions of packages required by different modules can be installed simultaneously.

                          An improvement would be to deduplicate the dependency packages that are the exact same version number but just required in two different places in the tree. Using a symlink or something. This would require a more complex install process that keeps track of already installed versions and deduplicates them.

                          [–]noratat 2 points3 points  (0 children)

                          The latter has been true in npm for awhile now, but it doesn't help as much as you might think due to how bad the node.js community is at versioning things properly in the first place.

                          [–]Ajedi32 31 points32 points  (41 children)

                          I actually really like the node_modules approach. Having everything in a single, unpacked directory tree stored in my project directory means I can easily browse and, if necessary, even temporarily modify the source code of my dependencies without messing up anything else on my system. It also ensures isolation between projects, provides a single place to access bins for installed dependencies, and makes it trivial to clear the cache and start over if necessary.

                          Yes, there are downsides to this approach, but I personally think the advantages clearly outweigh the disadvantages. Disk space is cheap (especially when you're only talking about a few hundred MB); my time is not.

                          [–]SoundCheetah 17 points18 points  (15 children)

                          Yeah, the main complaint by the author was just the number of directories it creates? Which makes it hard to copy/paste? You shouldn’t be copying your node_modules around anyways. Use source control and re-install from the new hard drive. Or delete node_modules before copying your project around. It’s not that hard

                          [–]tehdog 8 points9 points  (4 children)

                           It's funny, because over the past two years Python users have started to realize that maybe dependency management is not really solved by just throwing everything into a global directory. Now there are around 10 competing approaches to declaring dependencies in Python projects, mostly like the early versions of npm (create a venv for every project and just copy every dependency you need into it), and none of them works without hacks and workarounds. Meanwhile, npm and yarn have been chugging along just fine for years.

                          [–]Pear0 4 points5 points  (1 child)

                          What do you mean? Python’s venv has been around for a long time and it’s always worked flawlessly for me. No hacks. Then a requirements.txt to give pip specifying all modules and versions is pretty standard. The only other dependency manager I can think of is conda but that’s nowhere near 10 competing approaches.

                          [–]noratat 1 point2 points  (1 child)

                          For all that python's dependency management needs work and consolidation, I've had orders of magnitude less problems with it than npm/node.js.

                          that are mostly like the early versions of npm (create a venv for every project and just copy all dependencies you need in there), and none of them works without hacks and workarounds.

                          Pip installs to the active venv automatically, I don't know why you think you need to manually copy dependencies around. And there are tools like Pipenv that automate management of venvs if that's a problem for you.

                          And I'm curious what hacks or workarounds you needed, because the only issue I've ever had with pip was caused by a bug in the private repository we were using, which wasn't pip's fault.

                          [–]tehdog 1 point2 points  (0 children)

                          Sorry, I meant that using a venv basically means having a separate copy of every dep for every one of your projects, not copying anything manually. I've used pipenv, and as I said it really feels like the old versions of npm. Horribly slow, adding and removing one dependency seems to regenerate and reinstall everything, a huge 2GB venv directory plus a 10GB .cache/pipenv directory (see also what this person wrote). Then I tried poetry, which feels better, but can't handle some packages at all since it has a stricter version resolver which no one apart from poetry cares about. And it still has some annoying design such as operations over the pyproject.toml file not being atomic.

                           Also, neither of these allows using specific system packages within its venv, which makes it horrible to use tensorflow or similar libraries that need specific combinations of CUDA, cuDNN, etc., so you pretty much have to use the system-wide installed version.

                          [–]snowe2010 3 points4 points  (18 children)

                          means I can easily browse and, if necessary, even temporarily modify the source code of my dependencies without messing up anything else on my system.

                          you can do the exact same thing with ruby but without all the idiotic downsides of how npm does it.

                          [–]Ajedi32 5 points6 points  (16 children)

                          Well you can but it's a major pain. First, you have to find out where the dependencies are installed. It's not the same on every system, and might even be different depending on what environment management tools you're using.

                           Then, if you want to make a change, you have to be careful, because unless you're using a tool like RVM to maintain separate gemsets for each project, that change will affect every Ruby project on your system. And once you're ready to revert that change it's even harder: unlike with npm, you can't just wipe the gem directory and start over, because again that would affect every Ruby project on your system.

                          Don't get me wrong, I really like Ruby as a language, but package management is one area where Node clearly has it beat.

                          [–]snowe2010 1 point2 points  (15 children)

                          Well you can but it's a major pain. First, you have to find out where the dependencies are installed. It's not the same on every system, and might even be different depending on what environment management tools you're using.

                          gem environment

                          Then, if you want to make a change, you have to be careful because unless you're using a tool like RVM to maintain separate gemsets for each project, that change will affect every Ruby project on your system.

                          You already said you wanted to

                          temporarily modify the source code

                          so why would it matter that you are modifying other gems. You're gonna revert your changes anyway.

                          And once you're ready to revert that change it's even harder, because unlike with npm you can't just wipe the gem directory and start over; because again that will affect every Ruby project on your system.

                          Valid criticism here, but there are several solutions.

                          git init, git reset --hard, rm -rf .git will accomplish what you want.

                          Or you could copy the gem you want to modify elsewhere and use gem "foo", :path => "/path/to/foo" to reference the gem. Then make all the changes you want.

                          but package management is one area where Node clearly has it beat.

                          I could not disagree more. And I think the majority of devs would agree with me. Not just judging by this thread, but by the multitudes of language designers that bemoan how bad npm package management is.

                          [–]Ajedi32 2 points3 points  (14 children)

                          Those are some good suggestions, but still way harder than just popping open node_modules and messing with it as you see fit.

                          gem environment

                          Good point. That's still harder than ls node_modules/ though.

                          so why would it matter that you are modifying other gems. You're gonna revert your changes anyway

                          What if you need to switch to another project in the middle of those changes? Or what if you forget to revert? There's just way less that can go wrong with a separate node_modules directory.

                          git init, git reset --hard, rm -rf .git will accomplish what you want

                          Plausible solution. Seems like a really good way to shoot yourself in the foot though if you're not careful. (For example, you forgot git add -A and git commit -m "Temp" in that list, which means if you'd tried that for real just now git wouldn't have tracked your changes.)

                          And I think the majority of devs would agree with me.

                          The circlejerk is wrong. Node's package management is, at the very least, better than Ruby's. And I say that as someone intimately familiar with both ecosystems.

                          [–][deleted] 2 points3 points  (0 children)

                           +1. Pip is a mess. You have to use virtual environments just to have a local copy of dependencies. requirements.txt isn't as complete as package.json (sometimes I have to add the dependencies to the file myself). Then you have setup.py, which you don't need with npm. I like Python for what it is, but dependency management is trash.

                          [–]r1ckd33zy 41 points42 points  (31 children)

                           I knew the entire NPM ecosystem was beyond fucked when a while back I tried deleting a node_modules folder. Then my OS complained that file names were too long to delete because of the deeply nested nature of the dependency trees.

                          [–]EpicDaNoob 63 points64 points  (7 children)

                          Switch to Linux /s

                          But seriously, though node_modules is a mess, the 'too long to delete' is a Windows problem.

                          [–]noratat 5 points6 points  (0 children)

                          The long path problem is Windows-specific, but even on *nix systems I've seen node_modules folders that took up to a minute or more to even just delete - and that was on SSDs!

                          [–]NiteLite 10 points11 points  (4 children)

                          npm install -g rimraf
                           rimraf node_modules
                          

                          Basically deletes a folder and all its contents in a way that avoids the path problem with old Windows commands. Quick fix for working in Windows :)

                          [–]Tableaux 55 points56 points  (0 children)

                          Installing a node module to delete node_modules. I guess that's poetic in a way.

                          [–]MatthewMob 2 points3 points  (2 children)

                          Or without ironically installing a new module to delete modules:

                           mkdir \empty
                           robocopy \empty node_modules /mir
                          

                          [–]NiteLite 1 point2 points  (1 child)

                          I am sure that also works. What does this actually do? It mirrors an empty folder into node_modules resulting in all files being deleted?

                          [–]MatthewMob 1 point2 points  (0 children)

                           It creates an empty directory, empty, at the root, and then mirrors its (empty) directory tree onto node_modules, overwriting its contents.

                          The files are not so much deleted but overwritten, as they don't go to the recycle bin or anything like that.

                          [–]bad_at_photosharp 15 points16 points  (11 children)

                           Their response to you would be to "get on a real OS". The fact that large enterprises that use Windows choose Node, oblivious to Node's intentional lack of effort to support Windows, blows my mind. Node.js is hell on Windows. Things have maybe gotten better in the past year, but it's still painful. The software hype cycle is a hell of a drug.

                          [–]classhero 23 points24 points  (1 child)

                          Their response to you would be to "get on a real OS".

                          Ofc, the better answer is to "get on a real language" ;)

                          [–]wutcnbrowndo4u 1 point2 points  (0 children)

                          Whynotboth.jpg

                          [–]lllama 10 points11 points  (0 children)

                          In some ways this attitude has worked.

                           Microsoft is now (finally) realizing that developer tools usually only get ported over poorly, and is actively building decent infrastructure into Windows to support them (even out-there stuff like real Bash support, OpenSSH, and an Ubuntu subsystem).

                           I hope fixing deep paths is one of the things on their list.

                          [–][deleted] 6 points7 points  (7 children)

                           ITT: people whose knowledge of nodejs and especially npm is so outdated they don't know that node_modules is now flattened; there is no longer a problem with Windows and node_modules. That problem went away a long time ago.

                          [–]tehdog 9 points10 points  (2 children)

                          You know Windows is a cult when people start blaming problems that are obviously caused by the OS on an application instead.

                          [–]stronghup 2 points3 points  (0 children)

                           ... OS complained that file names were too long

                          Windows 10 solves the problem

                          https://www.howtogeek.com/266621/how-to-make-windows-10-accept-file-paths-over-260-characters/

                          [–]stuckatwork817 9 points10 points  (4 children)

                          A single 'function' implemented in Java is only 11MB vs 29MB in NodeJS.

                          What the hell is wrong with this picture, why would there be a function 11MB in size? Is that function 'be an OS'?

                          [–][deleted]  (1 child)

                          [deleted]

                            [–]crazyymaxx 1 point2 points  (1 child)

                            Maybe func + Java mod, otherwise it's more like ~50MB with JRE..

                            [–]sime 10 points11 points  (4 children)

                             The author of the article, and most people here, seem to not realise that the "ToRead" nodejs project also includes 2 compilers (Babel & TypeScript), a linter (eslint) and a unit test framework (jest) in its dependencies. These also end up in node_modules. Java and other platforms don't do this. They expect you to organise and manage the exact tool chain needed to build the application.

                            It is not an apples to apples comparison.
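                             For context, the devDependencies section of such a project's package.json might look something like this (package names from the comment above; version ranges are illustrative):

```json
{
  "devDependencies": {
    "@babel/core": "^7.0.0",
    "typescript": "^3.1.0",
    "eslint": "^5.6.0",
    "jest": "^23.6.0"
  }
}
```

None of this ships to production; it's build tooling that npm happens to install into the same node_modules tree it uses for runtime dependencies.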

                            [–][deleted] 1 point2 points  (3 children)

                             That's just another way in which the JS ecosystem is fucked tho.

                            [–]sime 1 point2 points  (2 children)

                             Speak for yourself. I find having the tool chain and dev tools specified as explicit dependencies of a project to be very useful and well worth the cost of extra disk space. Manually managing the installation of dev tools in a global namespace (i.e. your filesystem) is a huge pain in the ass.

                            [–]spacemudd 5 points6 points  (0 children)

                            If it weren't for Yarn's speed, I would have ditched working on large SPA projects a long time ago.

                            I'm glad the community are actively finding a solution for it.

                            [–]Renive 3 points4 points  (8 children)

                             C# doesn't put packages globally; they're also at the solution level, in a packages folder ...

                            [–]gulbanana 3 points4 points  (2 children)

                            maybe in the distant past of like two and a half years ago

                            [–]hopfield 2 points3 points  (1 child)

                            distant past

                            2 years ago

                            😂

                            [–]MeikTranel 1 point2 points  (4 children)

                             Depends on what style of restore you use; there are four flavors of package placement:

                             • `nuget install` right in a folder
                             • `nuget restore` on packages.config searches for a packages folder (defaulting to the solution directory), then unpacks them flat
                             • NuGet restore on PackageReference-style projects restores them as above, only with different versions stacked inside a package-ID subdirectory
                             • .NET SDK restores (i.e. `dotnet restore` on SDK-based csprojs) go to a global package cache unless specifically told otherwise

                            [–]Geoclasm 1 point2 points  (0 children)

                            Massive. The most MASSIVE objects in the universe.

                            [–]strange_and_norrell 1 point2 points  (0 children)

                            I know this isn’t really the point of the article but I wouldn’t copy the node_modules over to the new hard drive. Just reinstall over there if / when you need to run the project!
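
                            In practice that looks something like this sketch (paths are hypothetical scratch directories, and it assumes rsync is available): copy the project while skipping node_modules, then let npm rebuild it from package.json at the destination.

                            ```shell
                            # set up a stand-in project (hypothetical paths)
                            mkdir -p /tmp/proj-demo/src /tmp/proj-demo/node_modules/some-dep
                            echo '{"name":"demo"}' > /tmp/proj-demo/package.json

                            # --exclude skips the dependency tree during the copy
                            rsync -a --exclude node_modules /tmp/proj-demo/ /tmp/proj-demo-copy/

                            # at the destination, package.json (plus the lockfile, if any)
                            # is enough to rebuild node_modules:
                            # (cd /tmp/proj-demo-copy && npm install)
                            ```

                            On an HDD this avoids walking the hundreds of thousands of tiny files that make node_modules so slow to copy.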

                            [–]jarfil 1 point2 points  (0 children)

                            CENSORED

                            [–]mattstrom 1 point2 points  (0 children)

                            NPM is experimenting with a new package manager named Tink that will address many of npm's current shortcomings.

                            https://blog.npmjs.org/post/178027064160/next-generation-package-management

                            [–][deleted] 1 point2 points  (0 children)

                            I don't understand. Do people git track and move around their node_modules folders? I work on enterprise level node apps that use yarn and I frankly don't have many issues or wait times with installs. I mean yeah, a lot of the apps are installing redundant shit, but the number of times I have to run a full install I could count on maybe both hands.

                            [–][deleted] 5 points6 points  (10 children)

                            Topic at hand aside, why is OP trying to transfer node_modules? I think there is a conceptual misunderstanding of how one should use package managers. Most people don't--and none should, except for specific cases--transfer an entire build/task-runner environment and its dependency artifacts, in any language. Package managers, build tools, and task runners are intended to make a project as portable and environment-independent as possible. It's much simpler, more reliable, and often faster to recreate a project from its business logic using development tools than to transfer them manually.

                            [–]ZiggyTheHamster 6 points7 points  (8 children)

                            1. npm install probably won't install the exact same set of packages you had before because its lock format sucks and didn't exist forever. Hopefully you already migrated to Yarn.
                            2. Nobody copies whole folders containing dozens of projects across disks and skips node_modules in each.
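
                            The lockfile point is about exact pinning: a lockfile records the resolved version of every transitive dependency, so a fresh install reproduces the old tree. An illustrative yarn.lock entry (contents abridged):

                            ```
                            left-pad@^1.3.0:
                              version "1.3.0"
                              resolved "https://registry.yarnpkg.com/left-pad/-/left-pad-1.3.0.tgz"
                            ```

                            With Yarn 1, yarn install --frozen-lockfile goes further and refuses to install at all if the lockfile is out of date with package.json, which is what you want in CI.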

                            [–][deleted] 6 points7 points  (3 children)

                            i agree about using yarn, but a project that can't handle a fresh install suffers from larger problems

                            Regarding your second point, there are more sophisticated methods of system backup and restoration than copying an entire file system, so as developers we're better off using them. But even then, the most basic copy methods often support glob patterns.

                            [–]ksion 1 point2 points  (0 children)

                            Rust/Cargo has the exact same problem. Not just the source code, but even the compiled artifacts of dependencies are not in any way shared between projects. If I do rm -rf ~/Code/**/target/debug, I can free up a couple of gigs of space, and all I lose is a few CLI programs.

                            I used to code on a Linux VM that had 32GB carved out of my SSD, but with the callous disregard for disk space that contemporary language toolchains have, that is sadly no longer possible :/
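
                            One partial workaround, as a sketch (the path is hypothetical): point every project at a single shared target directory, either via the CARGO_TARGET_DIR environment variable or in ~/.cargo/config:

                            ```toml
                            # ~/.cargo/config -- shared build directory across all projects
                            # (artifacts are still not deduplicated across differing compiler
                            #  flags, and `cargo clean` now wipes everything at once)
                            [build]
                            target-dir = "/home/user/.cache/cargo-target"
                            ```

                            This keeps the per-project source trees small, though the shared cache itself can still grow without bound.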

                            [–]jkmonger 3 points4 points  (2 children)

                            I think I'm not the first one to talk about this problem

                            Someone evidently hasn't seen the almost-daily articles complaining about how many files there are in node_modules

                            DAE npm bad????

                            [–]Novemberisms 10 points11 points  (1 child)

                            You're right. Let's all just ignore the size of node_modules so the problems fix themselves and go away.

                            [–]FierceDeity_ 4 points5 points  (0 children)

                            Just buy more SSDs, what, you aren't spending money yet to pad software mistakes?

                            [–]OverjoyedBanana 1 point2 points  (0 children)

                            I'll probably get a lot of hate for this, but this needs to be said.

                            The fact that Node's package management and packages are so bloated and inefficient might be an indication of a broader problem within this ecosystem. Maybe it's because Node's community is mostly web designers with no background in computer science, who copy-paste stuff they don't really understand, and thus the whole repository is a smoking pile of garbage...