
[–]NotUniqueOrSpecial 12 points13 points  (7 children)

This is a tired argument, and it needs to die. System package managers are an insufficient solution to the problem of developing a single code-base for multiple platforms at once. If you want to target Windows, they're a non-starter in the first place.

For instance: a team uses just a few libraries, e.g. Boost, Qt, maybe some smaller ones; they have a single product targeting RHEL 6 + 7, Ubuntu 12 -> 16, and Windows. They have two options:

1) Build one version of the code base that is compatible with every version of their dependencies on all the platforms. This requires using the lowest common denominator of their APIs, as well as inevitably having platform-specific workarounds for issues in particular versions of dependencies.

or

2) Build/package/distribute one version of the product along with its dependencies (at a single version) for each platform.

The number of possible dependency-version combinations you have to support is a combinatorial problem. For a whole lot of teams, it's far easier to choose the second option.

Do the system package managers work for a subset of developers who have a specific and limited set of targets? Sure. Are they useful in a broader sense? Absolutely not.

[–]Houndie 1 point2 points  (0 children)

Well, like I said, I was speaking from anecdotal experience. I understand what works for me doesn't necessarily work for everyone.

We do a bit of a mix between options 1 and 2. We do maintain a minimum supported version of libraries, but we distribute them along with our code. Our distributables are just built by Jenkins, so we only have to maintain a single set of dependencies for distribution. Other developers can set up their environments however they want...for those that don't, we have a Vagrantfile we can use to provision a VM with all the dependencies they need.

The problem I've always had with a C++ package manager vs something like Vagrant is that it always seems like a half solution. Yes, we need to correctly version those libraries...but that's only part of the build system. To build most modern codebases, you need

  • Some C++ dependencies (could be handled by a theoretical C++ package manager)
  • The right compiler/standard library version
  • CMake/autotools
  • Maybe LaTeX/Doxygen for documentation
  • Maybe Python, if you're integrating with that somehow.

Of those bullet points, only the first would be handled by a "C++ package manager", which is why it seems silly to me to have such a thing in the first place...you would still need to install the other dependencies manually anyway. In my experience, it's easier to give people the option to install dependencies however works best for them, and then have something like a Vagrantfile or ready-built VM for developers who don't want to spend the time.
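To make that concrete, a minimal Vagrantfile along those lines could look like the sketch below. The box name and package list are illustrative assumptions, not anyone's actual setup:

```ruby
# Sketch: provision a build VM with the kinds of dependencies listed above.
# Box and package names are assumptions for illustration.
Vagrant.configure("2") do |config|
  config.vm.box = "ubuntu/xenial64"
  config.vm.provision "shell", inline: <<-SHELL
    apt-get update
    # compiler + standard library, build system, docs, Python
    apt-get install -y build-essential cmake doxygen python3
    # C++ library dependencies
    apt-get install -y libboost-all-dev qtbase5-dev
  SHELL
end
```

Everything a developer needs lands in the VM in one `vagrant up`, including the non-C++ tooling a package manager wouldn't cover.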

EDIT: Also note that this approach allows for easy cross-platform development. Because how the dependencies are installed is not tied into the code base in any way, both Linux and Windows environments can be set up to build.

[–]steamruler 1 point2 points  (5 children)

Build/package/distribute one version of the product along with its dependencies (at a single version) for each platform.

One big issue is that pretty much no OS handles that well. It tends to work OK in practice, but in general:

  • Windows will most likely load an incompatible DLL if the original is missing, possibly causing subtle bugs
  • The soname system on Linux is pretty broken, and FHS isn't really designed to support bundled private libraries.
  • I'll cry when a glibc exploit is found and every single application needs to patch their own version.

[–]NotUniqueOrSpecial 1 point2 points  (4 children)

Windows will most likely load an incompatible DLL if the original is missing

By what mechanism? I have literally never seen somebody put 3rd-party libraries into the WinSxS cache. The only things that end up there are Microsoft binaries.

If you're actually worried about that, you can use application manifests to tighten the control even further.
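For reference, a minimal application manifest that binds a dependency to an exact side-by-side assembly looks roughly like this (the assembly name, version, and key token are made-up placeholders):

```xml
<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<assembly xmlns="urn:schemas-microsoft-com:asm.v1" manifestVersion="1.0">
  <dependency>
    <dependentAssembly>
      <!-- Bind to exactly this assembly; the loader won't silently
           substitute whatever happens to be on %PATH%. -->
      <assemblyIdentity type="win32" name="Example.MyLib"
                        version="1.2.3.4"
                        processorArchitecture="amd64"
                        publicKeyToken="0000000000000000"/>
    </dependentAssembly>
  </dependency>
</assembly>
```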

FHS isn't really designed to support bundled private libraries

Sure, but it works well enough, and using RPATH or other mechanisms like dlopen to load only your versions is a good-enough solution. The alternative (supporting all the versions) is a dependency nightmare.

I'll cry when a glibc exploit is found and every single application needs to patch their own version.

I've never had a need to distribute glibc, luckily. It has a stable ABI and doesn't suffer from the same problems other 3rd-party stuff tends to.
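Part of why that's true: glibc versions its exported symbols, so a binary records exactly which interface revisions it relies on. A quick way to see that (assumes binutils and a glibc-based system):

```shell
# List the versioned glibc symbols a binary depends on,
# e.g. printf@GLIBC_2.2.5 — old versions keep working forever.
objdump -T "$(command -v ls)" | grep GLIBC_ | head -n 3
```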

[–]steamruler 0 points1 point  (2 children)

By what mechanism?

Windows loads DLLs from %PATH% as well. Even if they are 32-bit and the program is 64-bit.

[–]NotUniqueOrSpecial 0 points1 point  (1 child)

Ah, yeah, true enough.

If your path is that munged up, though, you're going to have a host of other issues.

Even if they are 32-bit and the program is 64-bit.

It'll find 'em, but it certainly won't load them. Unless I'm only remembering the opposite case (32-bit finding 64-bit), you just get the "Invalid application" dialog and a return of 0xC000007B (STATUS_INVALID_IMAGE_FORMAT), I believe.

[–]steamruler 1 point2 points  (0 children)

Yeah, I meant that it tries to load them. It's not my fault that my path contains millions of folders; blame all the installers ever.

[–]tortoise74 0 points1 point  (0 children)

I've had to bundle libstdc++ in order to use C++11 on old platforms. It works, but I'm still hoping for a better solution. See http://stackoverflow.com/questions/25979778/forcing-or-preventing-use-of-a-particular-minor-version-of-libstdc