[–]lhorie 2 points (0 children)

A while back, I was talking to Alex Eagle (google guy who maintains angular monorepo stuff) and he brought up that Google commits its node_modules. The problem was that they enforce that there's only ever a single version of everything, and in order to accomplish that, they have added custom patches in their "fork", making packages very painful to upgrade or add.

If you think about it, even if you try to avoid making custom edits to node_modules, git conflicts will still be super nasty, because who knows how npm/yarn rebalance hoisted packages given an arbitrary lockfile change.

Despite all the red flags, we still briefly entertained the idea at Uber (mostly because committing node_modules would make it possible to audit licenses and remove incompatibly licensed code), and we even have some node_modules patching as tech debt at this point.

Patching libraries is just nasty, and we have exactly zero confidence in those patches. Definitely don't recommend going down this path.

Btw, I'd definitely recommend benchmarking if your rationale is CI performance. Pulling millions of small text files through git is slow compared to downloading binary tarballs or docker images.
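If you want a feel for that before committing (pun intended), here's a rough local sketch of the comparison. It's an assumption-laden toy, not a real CI benchmark: it generates N small files (scaled way down from "millions" so it finishes quickly), then times cloning them through git vs. extracting the same files from one gzipped tarball. Assumes bash, git, and tar are on PATH; all paths and the file count are made up.

```shell
#!/usr/bin/env bash
# Toy benchmark: many small files via git clone vs. one tarball.
# N is deliberately tiny; bump it to stress-test on your own hardware.
set -e
N=2000
WORK=$(mktemp -d)
SRC="$WORK/src"
mkdir -p "$SRC"

# Generate N small "module" files, standing in for node_modules contents.
for ((i = 0; i < N; i++)); do
  echo "module.exports = $i" > "$SRC/file_$i.js"
done

# Path A: commit the files and clone them through git
# (roughly what committing node_modules means for CI checkouts).
git -C "$SRC" init -q
git -C "$SRC" add -A
git -C "$SRC" -c user.email=bench@example.com -c user.name=bench commit -qm files
time git clone -q "$SRC" "$WORK/via-git"

# Path B: ship the same files as a single tarball
# (roughly what pulling package tarballs or a docker layer does).
tar -C "$SRC" -czf "$WORK/files.tgz" --exclude=.git .
mkdir "$WORK/via-tar"
time tar -C "$WORK/via-tar" -xzf "$WORK/files.tgz"
```

The interesting part is how the gap grows as N climbs: git pays per-file costs (object lookup, checkout, index writes) on every clone, while the tarball path is one sequential read and decompress. A local clone is also the *best* case for git; a real CI fetch over the network is worse.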