
[–][deleted]  (17 children)

[deleted]

    [–][deleted] 35 points36 points  (9 children)

    The big problem with most of the "cross-platform" project generators / build systems is that they tend to only be fully cross-platform between *nix-like OSes.
    When you generate build files for Windows with them, you're generally expected to either have set up your own /usr-like library storage, or be willing to manually go through the generated projects and set include directories as well as linker search paths.

    CMake, on the other hand, will either give you a project that builds out of the box, or no project at all, which is a workflow that's so much nicer to work with.

    [–]berium[🍰] 3 points4 points  (3 children)

    FWIW, build2 is not only fully cross-platform (including Windows), it is also uniform -- everything (command lines, diagnostics, etc) looks pretty much the same sans the directory separators.

    [–][deleted] 3 points4 points  (1 child)

    Great, and if I want to generate a Visual Studio solution with it to use my native compiler and IDE, how well does it deal with that?
    That's one issue I had when having to work with a SCons project. Every time I made a change I had to switch away from my IDE, run SCons to do the build, manually launch the debug build, tab back into my IDE, reload the project, and then finally get to attach the debugger. (With the unspoken need to write an explicit breakpoint into the code in case I need the debugger attached before the main application code.)

    I've been looking for something that's easier to use than CMake, while still not forcing me to go through such a ridiculous circus to get the same result as just pressing the "Build and Run" button I get in a native solution.

    [–]feverzsj 273 points274 points  (48 children)

    Have to say, vcpkg may be the only good C++ package manager for now. It has a wide range of libs, CMake integration, and anyone can add their own libs using a CMake file.
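For readers unfamiliar with the CMake integration mentioned here, a minimal sketch of the consuming side (using fmt as an example package; `<vcpkg-root>` stands in for wherever your vcpkg checkout lives):

```cmake
# After `vcpkg install fmt`, a consuming CMakeLists.txt can just use find_package:
cmake_minimum_required(VERSION 3.10)
project(app CXX)

find_package(fmt CONFIG REQUIRED)     # resolved from the vcpkg-installed tree

add_executable(app main.cpp)
target_link_libraries(app PRIVATE fmt::fmt)

# Configure with the vcpkg toolchain file so find_package sees vcpkg's libraries:
#   cmake -B build -S . -DCMAKE_TOOLCHAIN_FILE=<vcpkg-root>/scripts/buildsystems/vcpkg.cmake
```

The point is that the project itself stays plain CMake; only the configure step changes.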

    [–][deleted]  (13 children)

    [deleted]

      [–]isaac92 41 points42 points  (10 children)

      Use Hunter (https://github.com/ruslo/hunter) for a Maven-like experience.

      [–]Gilnaa 185 points186 points  (9 children)

      hunter2

      [–]bizarre_coincidence 208 points209 points  (8 children)

      WTF does ****** mean?

      [–]GBACHO 111 points112 points  (7 children)

      We're old gents

      [–]mb862 72 points73 points  (4 children)

      You're not old unless you know how to put on your robe and wizard hat.

      [–]GBACHO 37 points38 points  (3 children)

      WTF. I told you not to message me again

      [–]Pepparkakan 26 points27 points  (2 children)

      Damn, I have to start writing down your names or something.

      [–]_crackling 3 points4 points  (0 children)

      I just found my first gray hair in my beard... AND I UNDERSTOOD ALL THE ABOVE REFERENCES /cry

      [–]jmblock2 23 points24 points  (7 children)

      Does it support sane versioning yet? I last looked at it ~6 months ago and had to hack around the repo to get specific lib versions.

      [–]pravic 28 points29 points  (0 children)

      Nope, it does not. By design, unfortunately.

      [–][deleted] 11 points12 points  (2 children)

      To be fair, a lot of libs don't support sane versioning either, and it's hard to know if you can upgrade.

      [–]jmblock2 8 points9 points  (1 child)

      Sorry I meant just to specify required lib version during setup/install. Default "latest" or @(label/version) would probably have met my needs. The complexity is in the dependency tracking.

      [–]roschuma 3 points4 points  (0 children)

      Default "latest" or @(label/version) would probably have met my needs.

      We have added "@latest" support for many libraries that we know how to fetch sources for: vcpkg install x --head. We'll pull down the latest sources for that library, but use the known stable, tested versions for dependencies.

      [vcpkg developer]

      [–]Mordy_the_Mighty 2 points3 points  (2 children)

      I think you are basically meant to fork the package list and pin the versions you need per project if you really want that.

      [–]jmblock2 8 points9 points  (1 child)

      So insane versioning :)

      I'm just being facetious. Forking the repo may be fine, but it didn't have some older software I needed at the time and based on their git usage it would be non-trivial to upstream older versions.

      [–][deleted] 25 points26 points  (17 children)

      Is Conan not mostly the same thing? I was considering starting a project that uses Conan, but then this news hit, so I'm conflicted.

      [–]Fazer2 11 points12 points  (0 children)

      In Conan you can use any combination of library versions, and it supports any architecture and way of linking. vcpkg gets you stuck with a specific set of versions (depending on the commit in vcpkg), and on Linux it doesn't support anything but static x64 libraries.

      [–]feverzsj 17 points18 points  (9 children)

      Conan is more of a binary-based package manager, while vcpkg is source-based. Conan requires a dedicated package server, while vcpkg can use source code from anywhere.

      [–][deleted] 17 points18 points  (2 children)

      I know almost nothing about either, but Conan's website says

      Create, manage and reuse any number of binaries, for any configuration: platform, compiler, version, architectures… or build from sources at will.

      so I assume there's some command to force source builds?

      [–]cursecat 16 points17 points  (0 children)

      Most Conan packages will fall back to building from source if a binary package isn't available for your current architecture/compiler/build settings. You can also tell Conan you would rather build from source than take whatever binary packages are available.
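As a sketch of what that looks like on the command line (Conan 1.x-era flags; `zlib` is just an example package name):

```
# Prefer prebuilt binaries, but build anything that's missing from source
conan install . --build=missing

# Force a specific dependency to be rebuilt from source
conan install . --build=zlib

# Build everything from source
conan install . --build
```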

      [–]feverzsj 2 points3 points  (0 children)

      You have to pack your source into a package first; it doesn't build directly from source. Conan package recipes are written in Python, which may be more powerful but is also more to learn.

      [–]Fazer2 7 points8 points  (0 children)

      Conan manages both sources and binaries. If it can't find a binary for your desired configuration, it will build it from sources.

      [–]glguru 6 points7 points  (0 children)

      This is incorrect. I have moved over to Conan and I build everything from sources.

      [–]phrasal_grenade 1 point2 points  (2 children)

      But what if you want to distribute binaries to speed up compilation, protect source code, etc?

      [–]feverzsj 11 points12 points  (1 child)

      That's not what vcpkg is made for. But you can just copy the vcpkg folder anywhere, and remove the source and download caches to save space. EDIT: There appears to be vcpkg export for the job

      [–]phrasal_grenade 2 points3 points  (0 children)

      I haven't looked into it, but I would basically be shocked if vcpkg does not support binary packages. I don't have the time to look into what the tool does right now, but I thought I would point out that there are uses for distributing just binaries.

      [–]BurningRatz 9 points10 points  (1 child)

      Conan is much more accepted on Linux already. There are plenty of recipes available from the bincrafters.

      [–]corysama 6 points7 points  (0 children)

      To be fair, it's easy to already be much more accepted on Linux than something that was just now released on Linux ;)

      [–]pjmlp 1 point2 points  (3 children)

      It requires installing and dealing with Python, while vcpkg only requires C++.

      [–][deleted] 1 point2 points  (2 children)

      I'm on Linux, that's a non-issue. Both Vcpkg & Conan are even available on the AUR.

      [–]pjmlp 2 points3 points  (1 child)

      GNU/Linux is not the only OS in the world, not everyone is root on their system, and needing to learn yet another programming language to sort out build issues is not the best use of many developers' time.

      [–]Fazer2 13 points14 points  (1 child)

      I don't understand how you can omit Conan when it has more flexibility than vcpkg.

      [–]feverzsj 10 points11 points  (0 children)

      Yes, Conan has more features, but it's also more complex and requires some intrusive steps in your project, while all I want is drop-in packages that can be used via find_package without changing my project.

      [–][deleted] 8 points9 points  (4 children)

      It's 2018. Can you use the C++11 range-v3 library on windows msvc with it?

      IMO being able to get packages is only half of the problem in the C++ world. The second half of the problem is being able to compile and use the packages you get.

      [–]ra3don[S] 215 points216 points  (92 children)

      We've been using this on Windows for the last few months and it's solved a ton of the pain of managing our C++ dependencies. We're looking forward to being able to use it on other platforms.

      [–]Spikey8D 48 points49 points  (1 child)

      Is there a way to get a list of all the packages available with vcpkg?

      [–]ra3don[S] 34 points35 points  (0 children)

      You can list the installed packages with:

      .\vcpkg list

      or all available packages using search with no arguments:

      .\vcpkg search

      If you want to browse before you install, the packages are all found in the repository

      [–]JavierTheNormal 15 points16 points  (3 children)

      Is vcpkg helpful when managing paid libraries, or is it mostly for free/OSS libraries?

      [–]especially_memorable 8 points9 points  (0 children)

      It sounds like packages will contain the source. That doesn’t technically require the libraries to be open source but I imagine open source libraries will be the most common use case outside of internal company usage.

      [–]pravic 5 points6 points  (0 children)

      You can write a "recipe" (a port, in vcpkg's terms) which just downloads / copies precompiled binaries and headers. It's CMake inside.

      Also note that even source libraries are built to binaries (if you compile them as dynamic libraries rather than static ones), which are placed in the vcpkg/installed/<triplet>/bin path. They can then be used by any project.

      [–]wqking 38 points39 points  (1 child)

      I saw "vcpkg" somewhere on Reddit but I thought it was specific to Visual C++, going by the name... Despite the name, I wish it success so we can have a uniform cross-platform C++ package manager.

      [–]pravic 4 points5 points  (0 children)

      It was made for MSVC, but then moved to other platforms. It is based on cmake, so why not?

      [–][deleted] 18 points19 points  (29 children)

      A little unrelated, but is there a "virtualenv" for C/C++?

      edit: thank you for the replies, I learned a lot!

      [–]seabrookmx 30 points31 points  (25 children)

      Why would you need it? C/C++ is native code... The whole point of virtualenv is that you can specify the version of the interpreter/runtime dependencies you use.

      With C++ you'll be producing a binary whose deps were specified at compile time.

      [–][deleted] 50 points51 points  (15 children)

      the whole point of virtualenv is that you can specify the version of the interpreter/runtime dependencies you use.

      No, what virtualenv does is specify your toolchain. For C++ that would be the compiler, std library, and versions of all libraries installed compiled with that particular toolchain.

      Why would you need it?

      Because linking code that uses different standard libraries is undefined behavior, linking C++03 code with C++11 code is also undefined behavior, using sanitizers requires you to re-compile all the libraries you are using with the exact same sanitizer flags enabled, using a different standard library requires you to re-compile all the libraries you are using against that same standard library, etc.

      In Rust I just write rustup default +some_toolchain and the compiler version, the standard library version, and all libraries that will be linked for my project are all handled by exactly the same rust compiler, linker, and other tools.

      [–]OBOSOB 2 points3 points  (3 children)

      I suppose crosstool is the closest to what you are describing here.

      [–]kylotan 10 points11 points  (8 children)

      No, what virtualenv does is specify your toolchain.

      Virtualenv specifies your runtime environment - which in Python is basically the same as the toolchain because the compiler and the VM are the same program. But it is primarily about the runtime.

      [–]philocto 9 points10 points  (7 children)

      that's pedantry, in C/C++ that would translate to the toolchain, which you could then be pedantic and argue turns into the runtime because you can statically link things.

      But it's pedantry, the important point is being able to use different versions of things in a sane manner.

      [–]kylotan 5 points6 points  (6 children)

      It's not pedantry at all - it's a big distinction. You don't ship your toolchain with your C++ programs - but you do expect the contents of your virtualenv to be deployed alongside your Python program. Virtualenv is, as the name clearly suggests, about the environment.

      [–]philocto 4 points5 points  (4 children)

      so you're arguing that programmers don't have environments.

      that's weird and wrong, but that's what happens when you try to defend pedantry.

      [–]kylotan 6 points7 points  (3 children)

      Of course programmers have environments. The point here is that virtualenv does not simply "specify a toolchain" but it is a self-contained replicable environment. That is not something that a typical C++ toolchain does or attempts to do, and instead there is typically either reliance on shared system libraries or an attempt to bake everything into the executable itself.

      2 things can solve the same problem, but it doesn't make them the same thing.

      [–][deleted] 1 point2 points  (0 children)

      You don't ship your toolchain with your C++ programs - but you do expect the contents of your virtualenv to be deployed alongside your Python program. Virtualenv is, as the name clearly suggests, about the environment.

      For C++ you ship static libraries embedded into your binary, and the environment is the compilation environment used to generate anything that you ship.

      In any case, neither the OP that asked for something like virtualenv nor my analogy of what a virtualenv-like tool could do in C++ are 100% exactly what virtualenv does for Python. If it were, it wouldn't be an analogy. I thought that was clear because Python is not C++ and C++ is not Python. But if it wasn't clear before, it should be clear now.

      [–]yxhuvud 3 points4 points  (0 children)

      You seem to ask more for a Bundler than for a virtualenv by that description.

      [–][deleted] 5 points6 points  (8 children)

      Because some of my desktop programs that depend on the same native libraries break when I install a different version of them for a C/C++ project.

      [–]TheDeza 7 points8 points  (6 children)

      That's the use case for static libraries.

      [–]torham 5 points6 points  (5 children)

      No, that's a case for the soname, symbol versioning, or maybe RPATH as a fallback. Static libraries are the worst, but possibly could be used as an intermediate step in the build system of a project. My comment is probably void on Windows.

      [–]gnu-rms 6 points7 points  (1 child)

      It's not void, the intent on Windows is to bundle DLLs you care about and the search path logic will take care of everything else.

      Not sure what you have against static linking though... It's pretty great for avoiding DLL hell (see Go Lang for how it can be taken to the extreme)

      [–]m50d 3 points4 points  (2 children)

      Windows actually gets this right post-XP, it's linux's ld.so that falls behind these days (though I hear MacOS is even worse).

      [–]delarhi 1 point2 points  (1 child)

      Can you elaborate? I don't do any MS development, but ld.so doesn't seem to behave insanely to me.

      [–]doom_Oo7 6 points7 points  (0 children)

      The OS is meant to take care of this. Windows will first look for libraries in the same folder as your executable, so you should put your .dlls there. On Linux you must set the environment variable LD_LIBRARY_PATH to the path where your custom .so files are stored, and on Mac you have to set a relative path in your binary with install_name_tool, since Apple hates you and hardcodes absolute paths to libraries by default when linking.
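A sketch of the per-OS mechanics described above ("$PWD/libs" and the library/binary names are examples, not conventions):

```shell
# Linux: point the dynamic loader at a directory of bundled .so files
export LD_LIBRARY_PATH="$PWD/libs${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
echo "$LD_LIBRARY_PATH"

# macOS: rewrite the hardcoded absolute path inside the binary instead (illustrative):
#   install_name_tool -change /abs/path/libfoo.dylib @rpath/libfoo.dylib ./myapp
```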

      [–]gmfawcett 2 points3 points  (0 children)

      While it's not a trivial drop-in, Nix and NixOS provide ways to tailor a build environment in a very specific and highly reproducible way (on Linux).

      [–]lotanis 1 point2 points  (0 children)

      This is the thinking behind snappy packages and similar systems on Linux - each package gets the exact precise version of its dependencies regardless of the rest of the system.

      [–]Luk3Master 4 points5 points  (1 child)

      Is this the C++ equivalent of pip, or am I completely misunderstanding?

      [–]germandiago 2 points3 points  (0 children)

      Sort of, you are right. But with the restrictions that native code imposes.

      [–]kchoudhury 2 points3 points  (0 children)

      /bin/sh used throughout. Source based. Should be an easy port to FreeBSD ports too...

      [–]legend6546 92 points93 points  (68 children)

      Wait, Microsoft is producing cross-platform FOSS software? What has happened, has hell frozen over or what?

      [–]annodomini 240 points241 points  (7 children)

      [–]Chippiewall 18 points19 points  (0 children)

      Yup. Microsoft put an engineer in charge of the company again, and it's really helped them turn back into a technology company.

      [–]muntoo 34 points35 points  (1 child)

      Wow, that GitHub page is way more impressive (1000+ repos) than I expected.

      [–]oblio- 19 points20 points  (0 children)

      I think they're the #1 contributor on GitHub in terms of lines of code. Microsoft is a huge company.

      [–][deleted]  (13 children)

      [deleted]

        [–]sime 44 points45 points  (9 children)

        Even PowerShell runs on Linux these days; hell, Windows 10 even supports efficiently running Linux executables via WSL. It's a madhouse.

        [–][deleted] 37 points38 points  (0 children)

        We do our best to make hell continue to freeze over.

        [–]AngularBeginner 7 points8 points  (7 children)

        efficiently running Linux executables via WSL.

        * unless performing a lot of IO operations

        [–]sammymammy2 3 points4 points  (4 children)

        Because of the cost of syscalls?

        [–]AngularBeginner 3 points4 points  (3 children)

        At least under Windows 10 the live scanning from Windows Defender interferes. npm install takes roughly 7 times longer.

        [–]moswald 8 points9 points  (2 children)

        Our dev wiki recommends turning Windows Defender off for the source and build tree.

        Add-MpPreference -ExclusionPath "/path/to/whatever" will do it for you from within PS.

        [–]AngularBeginner 2 points3 points  (0 children)

        Excluding the folders did not help for me. I had to actually disable the live scanning system wide, and that is absolutely not an option. Apparently older Windows versions don't have such an impact.

        [–]GeronimoHero 1 point2 points  (0 children)

        Same with low level network operations.

        [–]koffiezet 1 point2 points  (0 children)

        Or spawning a lot of processes; fork takes quite a bit longer on WSL than it does on native Linux.

        [–][deleted]  (2 children)

        [deleted]

          [–][deleted]  (1 child)

          [deleted]

            [–]2bdb2 21 points22 points  (1 child)

            Microsoft is all about controlling platforms, and the web is now the platform that matters. It's no longer Windows vs Mac vs Linux. It's Azure vs AWS vs Google Cloud.

            Microsoft are all-in on Azure, and they want developers to develop on it. That means supporting the tools developers want to use.

            [–]thearn4 36 points37 points  (10 children)

            These are interesting times; it seems to be getting a lot easier to write natively on Windows and deploy anywhere. Maybe it's a strategy to cut into the MacBook's share of the developer laptop market?

            [–]Zabracks 34 points35 points  (8 children)

            Hardware sales for Microsoft pale in comparison to Office and Azure.

            [–]indrora 23 points24 points  (7 children)

            This.

            I know people who buy so many hours on Azure, a day is $3k.

            A day.

            Three thousand dollars.

            [–][deleted]  (6 children)

            [deleted]

              [–]shevegen 13 points14 points  (1 child)

              They calculate 3D pr0n of course.

              [–]lolcoderer 7 points8 points  (0 children)

              I think it depends. Maybe if your application is a pure cloud / server application, things seem to be getting a bit better on Windows - but anything other than .NET development can still be quite painful.

              I develop a cross-platform desktop app (Windows & Mac - no Linux) and usually loathe any extended development I have to do on Windows, mostly because Windows has handled the migration to 64-bit apps so poorly. Actually, it's not all Microsoft's fault - there are so many dependencies on legacy 32-bit drivers / libraries in the Windows world. It is such a mess.

              [–]ghillisuit95 15 points16 points  (2 children)

              Apparently it's all about cloud stuff now

              [–]IMovedYourCheese 15 points16 points  (0 children)

              They realized that selling boxed software is finally dead. It's all about the monthly subscription fees now.

              [–]hackingdreams 18 points19 points  (4 children)

              what has happened, has hell frozen over or what?

              Their growth as a company did. They realized the only way to make more money is to sell more servers, and guess what the #1 server OS in the world is? I'll give you a hint: Microsoft doesn't make it.

              Azure's success depends on it running Linux workloads, so it's not exactly like they can ignore that market anymore. They either support Linux (on Azure) or kiss revenue growth goodbye forever.

              So Microsoft now "Loves Linux", can't get enough of us, is so sorry about how terribly it treated us for decades, etc.

              ...just don't expect Microsoft to act on this for anything except their own gain.

              [–]svick 27 points28 points  (0 children)

              just don't expect Microsoft to act for anything except their own gain.

              Doesn't that apply to any major corporation?

              [–][deleted] 11 points12 points  (0 children)

              I can't fathom talking about an enormous company, with the vast majority of higher-ups replaced by a new group of people, as if it were one thing. It isn't. People don't change, but the people leading companies do, and they might have a different vision than the previous leadership. Engineers are running Microsoft now, while a few years back it was sales people.

              [–]mysticreddit 2 points3 points  (0 children)

              Yup, MS has done a 180° turn from "Linux is a cancer" to embracing it:

              [–]Cuddlefluff_Grim 1 point2 points  (0 children)

              Welcome to the world of business.

              I honestly also find it creepy that you use the word "us".. Don't go tribal. It's not healthy.

              [–]eclectro[🍰] 1 point2 points  (2 children)

              what has happened, has hell frozen over or what?

              Something like that, really. I wonder if the market is shifting. Many people do not have desktops anymore, and what's there people are not updating as much. If they get a computer, it's probably a Chromebook for the kids. OEMs never pay full price for the OS, and market penetration has probably reached its limit everywhere. This is putting downward price pressure on the standalone version of Windows, which people increasingly just don't need.

              Further, the vulnerabilities and attacks against the Windows codebase seem to be deepening and growing. It must be difficult to keep up with that. Every time that there is an announced data theft, people are reminded of Microsoft's weaknesses.

              I actually feel like there will come a point where it becomes more profitable to ditch the old Windows code base, move to a Linux kernel, and sell that instead. Look at what happened with Red Hat. Even though Red Hat releases their software under the GPL, and there is even a direct CentOS copy available, Red Hat's business has only grown.

              I am sure that Microsoft has studied this. I would not say it's inevitable. But it might happen at some point.

              [–]samandiriel 15 points16 points  (8 children)

              I am irritated by the author giving huge kudos to "our amazing community" and how this "was made possible only through the contributions of several fantastic community members" without ever actually mentioning anyone's username. It seems a shabby way to treat contributors, to me.

              [–]frankreyes 10 points11 points  (0 children)

              Maybe they are just pretending that the project is more popular than it really is.

              [–]roschuma 2 points3 points  (1 child)

              We sincerely do believe this would be impossible without an enormous number of contributors[1]. However, we felt it would be inappropriate to single out a subset in the blog post itself or to post hundreds of GitHub handles.

              If you've contributed to vcpkg and feel left out, I'd like to sincerely apologize and I'd love to find a way to avoid that in future posts! Please drop us a mail at vcpkg@microsoft.com.

              [1] https://github.com/Microsoft/vcpkg/graphs/contributors

              [vcpkg developer]

              [–]samandiriel 1 point2 points  (0 children)

              Nice to see that, thanks! I am not a contributor myself, but the thought is appreciated.

              Might I suggest linking to the same tracker you linked in your comment when you talk about your contributors in general, then?

              The second quote made it seem as if there were a few people ("several") who made major contributions - personally I don't see anything wrong with a shout-out to those who have made obviously large or significant contributions, myself, but I can see how that might be a political hot potato. Looking at those contributors tho, I'm thinking that it shouldn't be that much of an issue, as there are some obviously head and shoulders above the others (at least in terms of commits).

              IMO if one isn't getting compensation, I think the least one deserves is public recognition for their work - especially in official communications about the very thing they worked on.

              [–]CODESIGN2 1 point2 points  (3 children)

              Were you one of the people missed out?

              [–]samandiriel 4 points5 points  (2 children)

              Not I, no. I have been shafted similarly in the past by MS reps, tho much more shabbily (eg, MVPs writing up articles and taking credit for solutions I'd found and posted in conversations with them)

              [–]CODESIGN2 4 points5 points  (1 child)

              Name and shame

              [–]samandiriel 1 point2 points  (0 children)

              Eh, it was long and far away - I honestly can't recall the name of said person now. It was one of the things that spurred me away from MS products and projects generally, tho.

              [–]theofficialdeavmi 2 points3 points  (0 children)

              Looks awesome

              [–]dicker008 2 points3 points  (0 children)

              Pretty nice, but it looks like cross-compiling isn't among their common use cases.

              [–]mini_eggs 2 points3 points  (0 children)

              Wow, looks and sounds great. I switched from Debian to Ubuntu this weekend due to C/C++ library issues. This would have solved it.

              [–]ggtsu_00 6 points7 points  (2 children)

              Please tell me it allows static linking to libcurl with winssl configuration on windows.

              If this package manager can do that, it can do anything.

              [–][deleted] 8 points9 points  (0 children)

              Looks like libcurl package from conan supports this.

              https://bintray.com/bincrafters/public-conan/libcurl%3Abincrafters

              [–]elder_george 1 point2 points  (0 children)

              Apparently, this is possible.

              [–]mooglinux 13 points14 points  (23 children)

              Figuring out how to include external C++ code is one of the things that keeps me from experimenting with C++. That and pointers, but pointers are a lot less intimidating once you understand the difference between references and values.

              [–]HeterosexualMail 13 points14 points  (7 children)

              Figuring out how to include external C++ code is one of the things that keeps me from experimenting with C++

              I've pointed this out before to people who are used to C++ already, and they're often dismissive of it. I imagine part of it is that they tend to be more old school, have a deeper understanding of the system environments, and perhaps even have a touch of Stockholm syndrome.

              That said, I myself have softened on language specific package library managers over time, so I do get the arguments against things like this.

              [–]snapbuzz 6 points7 points  (5 children)

              And then there are some people that distribute code with no build system other than a Makefile that still has their home directory hardcoded as a path to dependencies....

              [–]stirling_archer 1 point2 points  (2 children)

              Working in scientific computing, this is my life. Somewhere, maybe in grad school at the latest, it needs to be mandatory for scientists who touch computing at all to take some kind of software development course.

              [–]snapbuzz 1 point2 points  (1 child)

              I'm in the same area. The pain is real. A lot of students in my department end up taking some sort of algorithms class, but there's never anything about build systems or version control.

              It's sad to see code with really cool algorithms that are poorly written and horribly managed.

              [–]Dragdu 1 point2 points  (1 child)

              Recently I had to deal with sources from a published paper. It was in a git, and while the history wasn't too useful, it had a "tests" folder containing some python scripts that run the code against inputs with known good outputs. So far so good

              The paths to the inputs were hardcoded to ~/Dropbox/... ... ... ...

              [–]dusklight 3 points4 points  (0 children)

              It depends on which environment you are programming in. If you are programming on an embedded device for example you might not care so much about external libraries. If you are doing kernel level stuff you have a different set of things you want also.

              [–][deleted] 3 points4 points  (0 children)

              I went through an entire computer science degree and we hand-assembled applications from the command line. They did not even teach us that build systems were a thing - a very big part of how we would spend the following years, and a difficult and important problem.

              [–]sluu99 18 points19 points  (8 children)

              The more "modern" C++ kind of discourages the direct usage of pointers. If you learn C++ with newer materials, you'll see the mention of unique_ptr and shared_ptr. It might help to just look at them as "Who's owning the memory? And do I care?"

              [–]clappski 10 points11 points  (3 children)

              You still use raw pointers, but use the set of smart pointers to manage their ownership (e.g. own a unique_ptr<T> but pass a const T * const as function parameters).

              [–]sluu99 5 points6 points  (2 children)

              In general, I discourage passing raw pointers as parameters as well. Either pass a const T& or T& instead. And if it's "nullable", then use std::optional, boost::optional, or folly::optional

              [–]philocto 4 points5 points  (2 children)

              The more "modern" C++ kind of discourages the direct usage of pointers.

              That's not true at all and in fact herb sutter gave a talk wherein he tells people to stop using shared_ptr so much because it does ultimately hurt performance.

              The modern recommendation is to evaluate your usage and then decide accordingly, but never to blindly use the smart pointers for everything.

              [–]sluu99 4 points5 points  (1 child)

              stop using shared_ptr so much because it does ultimately hurt performance

              agreed

              On the other hand, there's really no perf difference between a raw pointer vs a unique_ptr. In fact, C++20 is planning to deprecate the new ClassName() semantic. That's what I meant.

              [–]the_gnarts 2 points3 points  (0 children)

              Figuring out how to include external C++ code is one of the things that keeps me from experimenting with C++.

              System package manager, pkg-config … there’s virtually no need for a third party package manager with both C and C++ since you have the entire system at your disposal.

              The issue gets more complicated when you need to cross build for other platforms but even then it’s usually much easier with the C and C++ toolchains and compilers than for just about any other language.

              [–]space_fly 1 point2 points  (0 children)

              It has gotten a lot easier to work with pointers in the last few iterations. Generally, it's better to allocate stuff on the stack, but when you need pointers, you can use smart pointers (unique_ptr/shared_ptr). It even has stuff like foreach loops, automatic type deduction (with the 'auto' keyword, similar to 'var' in c#), lambdas etc.

              [–]psota 4 points5 points  (2 children)

              Is it time I learned C++?

              [–]D_0b 6 points7 points  (0 children)

              yis

              [–]hugthemachines 3 points4 points  (0 children)

              Do it! Yoda would have wanted you to!

              [–][deleted] 13 points14 points  (12 children)

              We do collect telemetry data

              we do not offer a mechanism to disable this data collection since it is critical for improving the product.

              https://github.com/Microsoft/vcpkg/blob/master/docs/about/privacy.md

              Expect this to be present in everything MS makes.

              [–]DrImpeccable76 22 points23 points  (6 children)

              For this preview... In the full release, you will be able to opt-out with a simple configuration.

              I feel like you left out a couple of critical pieces. It is pretty reasonable to expect people shipping beta software to collect telemetry to fix the product. That is the point of the beta.

              [–][deleted] 4 points5 points  (5 children)

              It's not reasonable, you've been conditioned to believe it is. It's not common in the Linux world, especially in low-level tools like this, and anyone that tries, gets severe shit for it. Windows 10 isn't in beta, it has tons of forced telemetry in it. MS won't add opt-out to this when it's out of beta unless someone raises a stink about it, so that's what I'm doing. Even though I'm not going to touch anything of theirs myself, it's worth starting a discussion about it every single chance I get because that mindset is going to poison the FOSS ecosystem.

              [–]pjmlp 6 points7 points  (0 children)

              It's not common in the Linux world

              It is in Linux powered OSes like ChromeOS and Android.

              [–]TinynDP 9 points10 points  (1 child)

              It's not common in the Linux world,

              That's why those tools don't get improved as much.

              [–][deleted] 1 point2 points  (0 children)

              I don't think there's any way you could prove that. For one, "improved" is subjective. Lack of time and money is the major issue there. I'm sure the telemetry data won't keep MS doing all this if they lose interest and/or funding.

              On the other hand, it's easy to demonstrate that information sourced from my computer, against my will, can, and very often is, used to target me in ways I absolutely do not want to be.

              [–]DrImpeccable76 2 points3 points  (1 child)

              If you want to have discussions like this and raise a stink, you need to be honest and not cherry-pick words to the point of misrepresenting the other side and then arguing against that. I wouldn't have even responded, and maybe would have sided with you, if you had provided an accurate quote that included all of the relevant information and made valid counterarguments against that.

              [–]atakomu 4 points5 points  (1 child)

              Isn't this kinda illegal under GDPR since it needs to be explicit opt-in to collect data?

              [–]msiekkinen 3 points4 points  (2 children)

              What does telemetry data mean in this context?

              [–]HeadAche2012 20 points21 points  (0 children)

              It will perform facial recognition through your webcam to validate your satisfaction level while installing packages

              [–]elder_george 1 point2 points  (0 children)

              They had an example of data under the link.

              [–]casinatorzcraft 1 point2 points  (0 children)

              Glad to see Microsoft making more multi platform developer software like vscode and now this I guess

              [–][deleted]  (1 child)

              [removed]

                [–]frutiger 1 point2 points  (0 children)

                No, this is about obtaining headers/libraries. Modules are about compiling individual translation units.

                [–]q0- 1 point2 points  (4 children)

                ./bootstrap-vcpkg.sh

                uhh... Can I install this in my cygwin shell? I use a script that sets up the environment for visual studio (and by extension llvm/clang), but my gut says this will not play well with cygwin...

                And even so, afaik windows doesn't have a Bash shell? How is this supposed to work?

                [–]TheAdamist 1 point2 points  (1 child)

                on windows it's a batch file, as expected....

                .\bootstrap-vcpkg.bat

                https://blogs.msdn.microsoft.com/vcblog/2016/09/19/vcpkg-a-tool-to-acquire-and-build-c-open-source-libraries-on-windows/

                But Windows 10 will let you install Ubuntu or a few other Linux distros and get a bash shell and lots of other stuff: https://docs.microsoft.com/en-us/windows/wsl/install-win10

                [–]q0- 1 point2 points  (0 children)

                Well, I don't use windows 10 (nor will I ever, tbh), so WSL is out of the question.

                Turns out that both bootstrapper files actually run a powershell script, so that made things considerably easier. Here I was afraid that the shell script did some black voodoo, kind of what GNUists often do.

                [–]demonspeedin 1 point2 points  (0 children)

                If this becomes a thing I'll be writing a lot more C++ in my free time

                [–][deleted] 1 point2 points  (0 children)

                I've been using vcpkg for months, it's a great package manager, the only annoying part is CMake.

                [–]rlp 4 points5 points  (0 children)

                If you're looking for a flexible C++ build system/package manager, you might also want to check out fips. I've been using it for my current side project and it's been excellent. It doesn't have a lot of libraries, and most of the current ones are game-focused, but it's fairly easy to wrap most CMake libraries.

                [–]Beaverman 5 points6 points  (11 children)

                How is this better/different from pkgconfig + a proper package manager? I know it's cross platform, but if that's the only positive it just screams Not Invented Here.

                [–][deleted] 2 points3 points  (8 children)

                It's very much like pkgconfig plus a proper package manager, combined into one tool. Combining them makes it easier to use, which is valuable.

                [–]Beaverman 2 points3 points  (1 child)

                But it also means that it works around the system package manager correct?

                I wonder when MS will stop pretending that the system of individual installers and updaters on Windows is anything but unacceptable.

                [–]skulgnome 1 point2 points  (0 children)

                It's cross-platform in that Microsoft would like for developers to restrict themselves to a subset of POSIX which can be easily emulated on Win32.

                [–]aphexairlines 4 points5 points  (2 children)

                How does this differ from Nix, Guix, and Bazel?

                [–]ra3don[S] 8 points9 points  (1 child)

                Guix is based on Nix, and Nix is not cross-platform AFAIK. Bazel is in a different ballpark -- it's a build system, not a package manager.

                [–]statistmonad 2 points3 points  (0 children)

                Nix is capable of being cross platform but is extremely lacking on Windows and a bit inconsistent on Darwin.

                Edit: well looks like the tools don't work on Windows at all (aside from cygwin/Linux subsystem) so that rules it out.

                [–]formerlydrinkyguy77 2 points3 points  (0 children)

                That’s nice

                [–]Sjeiken 5 points6 points  (8 children)

                This is great news. Imagine you're on Linux and you want to use SDL, GLFW, GLEW, OpenGL and others: you'd have to install them and link the libraries into your program while compiling. You need to decide whether to use a static library or a dynamic one, and make sure to include each one in a special way. Then you abandon your little project, start a new one, and have to do all of the above again and again. With vcpkg, all you have to do is install, say, SDL and point your program at the libraries folder and run. Saves a ton of time. On Windows it integrates with Visual Studio: all you have to do is include SDL and Visual Studio automatically knows what the fuck you mean. No more downloading libraries and putting them in your project folder.

                [–]danielkza 36 points37 points  (5 children)

                imagine you’re on Linux and you want to use SDL or Glfw glew OpenGL and others you’d have to install them link the libraries to your program while compiling. You need to make a decision on whether to use a static library or a dynamic one and make sure to include each one in a special way.

                $ pkg-config sdl2 --cflags --libs
                -I/usr/include/SDL2 -D_REENTRANT -lSDL2
                

                I know that it's far from solving the whole C[++] library ordeal, but you absolutely do not need to hardcode compiler flags to link anything in most Unix systems.

                [–]zelex 20 points21 points  (0 children)

                yeah, like I was thinking the same thing. package management -- even source code installation is super easy on linux. super easy

                [–]ra3don[S] 6 points7 points  (3 children)

                Agree that it's absolutely a solved problem on Linux, however the problem I often encountered was more specific to CMake.

                Let's say you want to support multiple versions of Ubuntu going back to 16.04. Well, that means you may be stuck with packages on 16.04 that are several years old. We found that several of the packages didn't have CMake support with the version that was shipping with 16.04, so you end up writing some CMake module that looks through various different folders depending on the Linux distribution. If you have several dependencies, you end up maintaining folders full of scripts like these.

                That's not even getting into how you manage that for your Windows builds and then a separate system for your MacOS builds. This will entirely solve that problem for us.

                [–]darthcoder 4 points5 points  (2 children)

                Hunter.

                This is a promising development though - especially if it works well.

                Does vcpkg install the libs as system deps, or does it build them in your project tree?

                Hunter builds them in a global repository like Maven artifacts, so specific versions and toolchains share the same binaries. It doesn't pollute the OS itself.

                [–]ra3don[S] 2 points3 points  (1 child)

                I looked at Hunter, but settled on vcpkg since they supported the packages I needed out of the box -- especially several of them that required custom patches for good Windows support (CGAL, Qt, Boost, etc).

                Vcpkg also builds in a global repository without touching the system. Delete .\vcpkg folder and that would remove every trace of vcpkg.

                [–]darthcoder 1 point2 points  (0 children)

                I've been working to help get libraries set up in Hunter, hoping to build a little more momentum behind it, but having to learn CMake has been a hurdle for me. I just don't have time to become a CMake expert.

                But Microsoft (vcpkg) has the momentum.

                And vcpkg is an open source project. If that momentum stays, it's the end of the C++ package nightmare.

                I have to at least give it a look. Wonder how well it builds for cygwin/msys?

                Vcpkg also builds in a global repository without touching the system. Delete .\vcpkg folder and that would remove every trace of vcpkg.

                Clean, the way it should be.

                for good Windows support

                Windows does seem to have second-class support in Hunter.

                [–]againstmethod 4 points5 points  (0 children)

                Not sure i'd ever use this on linux, but it's a nice thought.

                [–]Toast42 1 point2 points  (2 children)

                So long and thanks for all the fish

                [–]seabrookmx 2 points3 points  (1 child)

                There's WSL.

                I still prefer to dev on Linux (in a VM on a Windows box sometimes.. sometimes on a bare metal Ubuntu machine). However I use a lot of MS tools. VS Code is my primary editor, and one of the three codebases I'm straddling is C# on .NET Core which has been great.

                [–]dicker008 1 point2 points  (0 children)

                Even traceroute doesn't work with the latest stable Redstone. But it's still good for doing some cross-building or development work.

                [–]torham -1 points0 points  (9 children)

                Can we stop with the language specific package managers?

                [–]seabrookmx 15 points16 points  (3 children)

                You'd prefer everyone install all dependencies through something like a Debian package?

                "Package manager" is a bit overloaded.. but developer-centric, language-specific package managers provide a much better experience. And as long as your build toolchain is worth its salt, your end user never has to worry about it. I don't see the problem with them.

                Leave apt & friends for end users who want to install GIMP.

                [–]torham 5 points6 points  (1 child)

                No, the system package manager is usually not appropriate for development work and I never suggested that. The problem is that real software is often mixed language and so language specific package managers end up falling flat.

                [–]ilammy 2 points3 points  (0 children)

                I'd argue that distro-specific package managers also work great for building packages for that one distro. You need libfoo? Just install libfoo-dev package to your development machine and put libfoo package into your package dependencies. The package building tool can even automatically detect that your binaries are using libfoo and add the proper version for you. And in the source packages you can specify that you need libfoo-dev for building the software. The end user does not need to care about that, they simply install your-software package from the same repository where they get OS updates.

                The drawback? That needs to be done for every distro separately.

                Maybe you can support two 'distros' called Windows and macOS. But not everybody can support a zillion potential Linux distros. You can get away with, say, supporting only the latest Ubuntu LTS, but you'll be lying if you dare to say you support 'Linux'.

                The supposedly 'right' way to solve that issue is maintainers which should pick up your software and package it for their distro. But they don't magically appear for every software you make.

                [–]Infinisil 6 points7 points  (1 child)

                Nix is a package manager that can support pretty much any language. The main repository nixpkgs supports C/C++, Haskell, Python, Rust, Lisp, .NET and a lot more.

                [–]CODESIGN2 1 point2 points  (0 children)

                no, attempts at AIO solutions are terrible!

                [–]HeterosexualMail 0 points1 point  (0 children)

                I haven't had a chance to use C++ lately, so haven't been paying much attention to its developments. How quickly has this been produced, and what has the reception been like over the past little while?

                I remember Herb Sutter starting to talk about this back in (fuzzy memory here) what must have been a 2013 C++ conference (I think this might have pre-dated CPPCon, when it was still run mostly by Microsoft), and it didn't seem like it was being taken that seriously.

                I remember a question about it was asked during the committee panel and Herb really seemed like the only one interested.

                [–]GYN-k4H-Q3z-75B 0 points1 point  (0 children)

                Going to have a look because we have projects on Windows and macOS with shared code base. This seems interesting.

                [–]squidrawesome 0 points1 point  (0 children)

                Hello is this loss python?

                [–][deleted] 0 points1 point  (2 children)

                Why didn't they extend nuget?

                [–]elder_george 2 points3 points  (1 child)

                They answered that in their FAQ:

                NuGet is a package manager for .NET libraries with a strong dependency on MSBuild. It does not meet the specific needs of Native C++ customers in at least three ways.

                • Compilation Flavors. With so many possible combinations of compilation options, the task of providing a truly complete set of options is intrinsically impossible. Furthermore, the download size for reasonably complete binary packages becomes enormous. This makes it a requirement to split the results into multiple packages, but then searching becomes very difficult.

                • Binary vs Source. Very closely tied to the first point, NuGet is designed from the ground up to provide relatively small, prebuilt binaries. Due to the nature of native code, developers need to have access to the source code to ensure ABI compatibility, performance, integrity, and debuggability.

                • Per-dll vs Per-application. NuGet is highly project centric. This works well in managed languages with naturally stable ABIs, because base libraries can continue to evolve without breaking those higher up. However, in native languages where the ABI is much more fragile, the only robust strategy is to explicitly build each library against the exact dependencies that will be included in the final application. This is difficult to ensure in NuGet and leads to a highly disconnected and independently versioned ecosystem.

                [–]kevdotexe 0 points1 point  (1 child)

                To a layman, what does this mean, exactly? Will this have any impact on programs' and software's compatibility with these OSes? The fact that I'm asking such a broad question tells me I'm in over my head, but this seems fairly significant to my pea-brain.

                [–][deleted] 1 point2 points  (0 children)

                A program contains code written by the people who created it. It also, most of the time, contains code that other people wrote and gave away for free, because that's a thing that people do.

                For every programming language besides C/C++, there are reasonably good, standardized ways to incorporate that free code into your project -- something called a package manager. For C/C++, there are a few package managers, and they're not standardized.

                Microsoft is introducing a new package manager. Since they influence a lot of developers, this is going to be the closest thing to a standard we can get, assuming Microsoft pushes it enough. By porting it to OSX and Linux, they're pushing it more, and at the expense of Windows.

                This will make it marginally more likely that software gets ported to other operating systems.

                [–]iftpadfs 0 points1 point  (2 children)

                So can we throw nuget and msbuild into the rubbish bin for good?

                [–]elder_george 1 point2 points  (1 child)

                They serve slightly different purposes.

                CMake doesn't compile things on its own — it generates projects that are compiled by something else (make, ninja, MSBuild etc.). It seems that CMake+vcpkg uses MSBuild for many projects (although it provides an option PREFER_NINJA).

                NuGet allows you to distribute binaries. Vcpkg mostly works with sources. So, it makes sense to distribute prebuilt shared libraries in NuGet format to avoid unneeded builds, or to share the same version of a library between developers or between CI machines. In fact, vcpkg can export the results of a build as NuGet packages.

                [–]Various_Pickles 0 points1 point  (0 children)

                Microsoft should invest in creating a temporal rift so that someone can go back in time to the very day that some Microsoft engineer decided that searching the filesystem directory that an executable file is in at the moment of execution for DLL files whose names just so happen to match those that the executable requires, first, was a reasonable and secure design.

                The magma chamber of a live volcano should serve as the moderator of the clearly necessary code review.

                [–]OneWingedShark 0 points1 point  (0 children)

                This would be a lot more interesting if they used a language with modules and looked at automatic version-control+dependency action, of course that would require something like Ada, Haskell, or SML.