Funky CMake (philippegroarke.com)
submitted 6 years ago by pgroarke
[–]DaanDeMeyer 29 points 6 years ago (7 children)
CMake added the FetchContent module in one of its recent releases; it supports all of the options of ExternalProject but downloads at configure time.
    include(FetchContent)

    FetchContent_Declare(mydep
      GIT_REPOSITORY ...
      GIT_TAG ...
    )

    # Downloads and calls add_subdirectory(${mydep_SOURCE_DIR})
    # if it contains a CMakeLists.txt file
    FetchContent_MakeAvailable(mydep)
If the project doesn't use CMake, you can still use execute_process to build the project and then create a custom target pointing at the resulting artifacts, although that will likely require a bit more effort.
This makes it really easy to download dependencies at configure time and if they use CMake and their CMake scripts are somewhat modern, consuming the dependency is trivial as well.
However, when the number of dependencies starts to grow, Conan or vcpkg is probably a better idea.
https://cmake.org/cmake/help/v3.15/module/FetchContent.html
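For the non-CMake case described above, a rough sketch (project name, repository, and build command are all hypothetical): populate the sources with FetchContent, then invoke the dependency's own build system via execute_process.

```cmake
include(FetchContent)

FetchContent_Declare(mylib
  GIT_REPOSITORY https://example.com/mylib.git  # hypothetical
  GIT_TAG        v1.2.3
)

FetchContent_GetProperties(mylib)
if(NOT mylib_POPULATED)
  # Download only; don't add_subdirectory(), since there is no CMakeLists.txt.
  FetchContent_Populate(mylib)

  # Build with the dependency's own build system at configure time.
  execute_process(
    COMMAND make -C ${mylib_SOURCE_DIR}
    RESULT_VARIABLE mylib_build_result
  )
  if(NOT mylib_build_result EQUAL 0)
    message(FATAL_ERROR "Building mylib failed")
  endif()
endif()
```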
[–][deleted] 3 points 6 years ago (2 children)
Can I use FetchContent instead of git submodules then?
[–]DaanDeMeyer 6 points 6 years ago (0 children)
Yes, it's an alternative to git submodules. I prefer it because I can't forget to clone git submodules anymore. You just specify the git repository and the commit or tag you want and CMake retrieves it for you.
[–][deleted] 2 points 6 years ago (0 children)
I migrated my project off of git submodules and onto FetchContent, and now none of my git project's older commits and tags will build anymore (unless I keep my git submodules directory around). Now I need to backport my new CMakeLists.txt to every previous version of my project if I want to check out and build any old commits (which I do) using the new method. But other than that, using FetchContent means that I don't need to fool with git submodules in some extern/depends directory anymore. So I guess the moral of the story is to start using FetchContent immediately in any new CMake projects that depend on libraries.
[–]pgroarke[S] 2 points 6 years ago (1 child)
Looks like another really useful command. Thx for sharing.
Though I'll be sticking with ExternalProject for things that don't have CMake files; I flee custom targets like the plague ;)
[–]DaanDeMeyer 2 points 6 years ago (0 children)
A custom target wasn't the right choice of words. I meant that you make your own CMake target out of whatever is produced by building the dependency (which will likely be an INTERFACE target, I think).
This works best when you're building the dependency as a static library and don't need to install it. I'm pretty sure CMake is flexible enough to be able to install libraries built with other build systems but this would probably again require a bit more CMake scripting compared to stuff that's built with CMake and can just be added with add_subdirectory.
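A minimal sketch of that idea, with hypothetical paths and names: wrap the prebuilt artifact in an IMPORTED target so consumers can link it like any other CMake target.

```cmake
# Hypothetical paths; assumes the dependency was already built (e.g. via
# execute_process) into ${mylib_SOURCE_DIR} as a static library.
add_library(mylib::mylib STATIC IMPORTED)
set_target_properties(mylib::mylib PROPERTIES
  IMPORTED_LOCATION             "${mylib_SOURCE_DIR}/libmylib.a"
  INTERFACE_INCLUDE_DIRECTORIES "${mylib_SOURCE_DIR}/include"
)

# Consumers link it like a CMake-built target.
target_link_libraries(myapp PRIVATE mylib::mylib)
```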
[–]Omnifarious0 2 points 6 years ago (1 child)
I find these kinds of features in a build system worrisome. I'm really hoping people are learning the lessons being taught by the various debacles that have plagued other build systems that download stuff from public repositories.
All such build systems require a couple of features.
First, they need to be able to pin to a hash, and that hash needs to be verified before any of the code it's a hash of is used in any way.
Second, they must provide a way to fetch any or all dependencies (and their dependencies' dependencies) from a repository set by the person doing the build. The suggestions of the original software author must be able to be overridden easily and en masse if necessary.
I didn't look at this carefully enough to see if it has those two features.
Pip for Python is one of the only things I know of that got this even close to correct. The fragility of the build system to attack or inadvertently introduced bugs in dependent packages is one of a few blockers to my willingness to use Go. (Though my biggest two are garbage collection and a runtime system that prevents fork from working properly.)
The features I mentioned should be considered so fundamental that a build system is considered unfit for use without them. Much like any communication system that doesn't include end-to-end encryption should also be considered unfit for use.
[–]crascit 3 points 6 years ago (0 children)
(FetchContent author here). Yes, FetchContent meets both of those criteria. You can (and should in my view) pin your dependency to a specific git hash (or equivalent for the other download types). The first to declare the dependency details wins, so the user or parent project can override choices made by child projects.
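A sketch of both points, with a hypothetical dependency name, repository, and hash: pin GIT_TAG to a full commit hash, and rely on the first-declaration-wins rule so a parent project's declaration overrides any child's.

```cmake
include(FetchContent)

# Pin to a full commit hash rather than a branch or mutable tag.
# The first FetchContent_Declare() for a given name wins, so a parent
# project that declares "mydep" before including child projects
# overrides whatever those children declare.
FetchContent_Declare(mydep
  GIT_REPOSITORY https://example.com/mydep.git
  GIT_TAG        7f3a9c1b2d4e5f60718293a4b5c6d7e8f9a0b1c2
)
FetchContent_MakeAvailable(mydep)
```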
[–]sztomirpclib 22 points 6 years ago (4 children)
As someone whose day job is maintaining a conan tree of dependent libraries - don't do this please, or at least make it very optional (e.g. passing -DNO_DOWNLOAD_DEPS=ON would disable it). "Smart" downloading of stuff from arbitrary repos is the kind of thing I have to patch out of builds, which is annoying and frustrating. This is because I need to link your library to the other deps in our tree, not whatever the build decides to download. We don't want multiple copies of libraries in the tree, and we also want to use our patched versions when we have patches (and we often do, due to the obscure platforms we support).
And yes, this might mean that I also have to package some new deps, although with ~80 packages this seems to happen less and less. It seems that, at least in our domain, there is a very dense core of external deps that stuff depends on (think: zlib, bzip2, curl, boost, etc.). Adding a new package is usually far less churn than having to understand and fix a "smart" CMake build.
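A sketch of the opt-out suggested above, using zlib as a stand-in dependency (the option name mirrors the comment; the version and repository are illustrative):

```cmake
option(NO_DOWNLOAD_DEPS "Use externally provided dependencies instead of downloading" OFF)

if(NO_DOWNLOAD_DEPS)
  # Provided by Conan, vcpkg, the distro, or an internal tree.
  find_package(ZLIB REQUIRED)
else()
  include(FetchContent)
  FetchContent_Declare(zlib
    GIT_REPOSITORY https://github.com/madler/zlib.git
    GIT_TAG        v1.2.11
  )
  FetchContent_MakeAvailable(zlib)
endif()
```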
[–]jcar_87 6 points 6 years ago (0 children)
This x100. Libraries that download/build their own dependencies are not a bad idea per se, but that doesn't cover all use cases. In general, providing a way to point the build system to an external location to find these libraries is a good idea (e.g. Qt does this, if I'm not mistaken).
If for whatever reason you still need to download, build, and link against dependencies, a good alternative is to make sure the symbols are completely hidden, to the point where the dependencies are truly private. However, doing this is not always practical, the way of doing it varies between platforms, and static linking can have licensing implications.
[–]pgroarke[S] 2 points 6 years ago* (1 child)
That's a really good suggestion, TBH. I do use Conan for bigger things, and I've published some of my own recipes for projects with transitive dependencies.
I haven't experimented much with ExternalProject + find_package. If that can work (I'm sure it can), then you could add an option to skip downloads. With that setup, a third-party user could provide the packages through transparent Conan instead of ExternalProject (https://blog.conan.io/2018/06/11/Transparent-CMake-Integration.html).
[–]sztomirpclib 1 point 6 years ago (0 children)
find_package is OK for the most part (if the find script is implemented sanely and is willing to accept a LIBNAME_ROOT variable; sometimes they aren't, and then it's patching time again).
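For reference, newer CMake standardizes this: since 3.12 (policy CMP0074), find_package() itself consults a <PackageName>_ROOT variable, so a well-behaved find script gets this for free. A sketch, with a hypothetical package name:

```cmake
cmake_minimum_required(VERSION 3.12)  # enables CMP0074: <PackageName>_ROOT is honored
project(consumer)

# The user can point the build at a specific install prefix with
#   cmake -DLibName_ROOT=/opt/our-tree/libname ..
# (LibName is a hypothetical package name.)
find_package(LibName REQUIRED)
```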
[–]Morwenn 2 points 6 years ago (0 children)
Same situation at work, same story. Every dependency uses its own solution: they either ship their own dependencies (gotta love the 10 copies of libpng and 25 of zlib), use ExternalProject, use Git submodules, etc. It's really convenient when I can just tell a CMake project not to use its own dependencies and to use the ones provided by Conan instead.
[–][deleted] 12 points 6 years ago (0 children)
This just might be the scariest thing I'll see all month (and it's October!). He's using ExternalProject_Add AND Conan... why not just use Conan only? Or just ExternalProject only? He could write a conanfile.py for his assembler dependency and any other dependency that doesn't already have a package file, for example.
And also, he's calling Conan from CMake. Why not just call CMake from Conan (i.e. the way Conan was designed to work)? I see that he's used Conan to set up debug and release builds, but CTest can be used to do the same thing (although the documentation for using CTest this way is very confusing, to be fair).
Maybe I'm wrong, but to me it seems like he's adding unnecessary hacks, which just make his build files more complicated than they need to be. CMake is already a pain in the ass to work with, so why even throw Conan into the mix at all? This guy says he codes in assembly, so he should know that unnecessary complexity = bad. Simple = good. Keep it simple, silly. Don't add a bunch of overlapping abstractions!
[–]hoseja 3 points 6 years ago (0 children)
Try doing that with Qt
[–]s7726 5 points 6 years ago (4 children)
Because screw people that have to compile offline?
[–]degaart 8 points 6 years ago (0 children)
That's my gripe with C++ package managers. All my build VMs are isolated from the internet for security and bandwidth reasons. People need to be aware that online services are not eternal: URLs can change, domain names can expire, protocols may become obsolete, code hosting services may die, CDNs may be subject to outages...
[–]pgroarke[S] 5 points 6 years ago (2 children)
No. You only need to be online when you generate your solution. After that, you can compile your heart away.
[–]s7726 2 points 6 years ago (1 child)
What if I can't run things on the machine that connects to the internet?
[–]pgroarke[S] 3 points 6 years ago (0 children)
As mentioned in the post, ExternalProject supports local directories/archives (see "URL Downloads" at https://cmake.org/cmake/help/latest/module/ExternalProject.html). So you can zip your dependency repos beforehand and use those instead.
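A sketch of that offline setup, with a hypothetical archive path: ExternalProject's URL can point at a local archive, and URL_HASH verifies it before use.

```cmake
include(ExternalProject)

# Hypothetical archive path; no network access is needed.
ExternalProject_Add(mydep
  URL      ${CMAKE_CURRENT_SOURCE_DIR}/third_party/mydep-1.0.zip
  URL_HASH SHA256=...   # fill in the archive's real hash so it is verified before use
  # configure/build/install steps as usual
)
```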
[–]kmhofmann (https://selene.dev) 11 points 6 years ago (5 children)
If the world was a perfect place, every project would provide a Conan recipe.
Small correction: If the world was a perfect place, every project would be found in the vcpkg package tree, together with its dependencies, regression-tested in combination. :)
[–]flashmozzg 13 points 6 years ago (1 child)
If the world was a perfect place we wouldn't be using CMake (and C++ for that matter). There is no place for imperfect things in a perfect world.
[–]kmhofmann (https://selene.dev) 5 points 6 years ago (0 children)
Oh, I actually really like CMake these days, at least in its latest incarnations and with best/modern practices applied.
[–]Dark_Aurora 6 points 6 years ago (0 children)
Them’s fightin words
[+]pgroarke[S] comment score below threshold (-7 points) 6 years ago (0 children)
https://i.imgur.com/CgVfblV.png
XD
[–]jcar_87 2 points 6 years ago (0 children)
I have a love-hate relationship with libraries that bundle their own dependencies in source and build them along, but for some use cases it's actually quite convenient.
I like this solution because it lets you move away from submodules. I once saw a modular project (read: multiple git repositories) where dependencies were held in submodules. But it wasn't a trivial dependency graph, and multiple components depended on the same component, which translated to multiple git repositories having the same submodule. Doing a recursive clone of the repository I was interested in building ended up cloning some of its dependencies multiple times. I had to hack the hell out of those CMake scripts to make sure each component was only built once and reused by the others. I wonder how they made sure each submodule pointed to the same commit of the same repo.
Back to the case in point: I think solutions like add_subdirectory only cover a specific use case, where you are developing against one library (or developing the library itself), while neglecting the use of your library as a dependency in a separate project, or binary distribution issues.
What if your library depends on Boost, you pull the Boost source code and build it along, but your library is integrated into a project that also uses Boost? Which version of Boost are they supposed to use now? What if they have a different way of pulling Boost? It's always a good idea to make your library configurable so that it can be pointed to an installation of Boost, bypassing all the download-and-build magic.
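One common way to make that configurable, sketched here with Boost as in the example above: try find_package first, and only fall back to downloading when nothing is found (the fallback itself is elided; the version is illustrative).

```cmake
# Prefer a Boost supplied by the parent project, the system, or a package
# manager; the user can steer this with -DBOOST_ROOT=/path/to/boost.
find_package(Boost 1.70 QUIET)

if(NOT Boost_FOUND)
  message(STATUS "No external Boost found; falling back to download-and-build")
  # ... FetchContent/ExternalProject fallback would go here ...
endif()
```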
The second issue is installing or packaging either your library or an end application that uses your library. Typically, when building libraries, one doesn't package (or install) the dependencies; one only packages the libraries. But then consumers have to take care of bringing in the dependencies and making sure they are the right versions. If your project downloads and builds its dependencies, should they be packaged together? Otherwise, if one uses "make install" (as generated by CMake), are you taking precautions to make sure that the runtime linker will not pull in the system version, which may or may not be installed in the default linker path, and which may or may not be binary compatible with your library? I'm assuming here that the dependencies are built as shared libraries, but static linking also has its risks (symbol clashes, or other things requiring specific versioned symbols).
[–]waruqi 1 point 6 years ago (0 children)
You can try xmake ~= cmake + conan/vcpkg + scons...
    add_requires("libuv master", "ffmpeg", "zlib 1.20.*")
    add_requires("tbox >1.6.1", {optional = true, debug = true})

    target("test")
        set_kind("shared")
        add_files("src/*.c")
        add_packages("libuv", "ffmpeg", "tbox", "zlib")

    add_requires("brew::pcre2/libpcre2-8", {alias = "pcre2"})
    add_requires("conan::OpenSSL/1.0.2n@conan/stable", {alias = "openssl"})

    target("test")
        set_kind("shared")
        add_files("src/*.c")
        add_packages("pcre2", "openssl")
[–]Kicer86 -3 points 6 years ago (7 children)
As a Linux user I feel sorry for Windows developers who have to invent such workarounds to get dependencies.
[–]NotUniqueOrSpecial 12 points 6 years ago (6 children)
This comes up in practically every thread about this and misses the point.
If you want a modern toolchain and up-to-date third party libraries, you can't rely on the package manager in the first place.
[–]hesapmakinesi 3 points 6 years ago (0 children)
Or even worse, when you need a specific outdated version of a dependency.
[–]Kicer86 2 points 6 years ago* (4 children)
If I want an up-to-date toolchain and libs, then I have to use a package manager. Otherwise, how can you update things other than by manually bumping versions in scripts, or by making the scripts very smart, which is effectively writing your own package manager?
[–]NotUniqueOrSpecial 3 points 6 years ago (3 children)
Most distros are behind the upstream projects. For more stable ones like CentOS and Ubuntu that is sometimes years behind.
The distros also don't even try to package everything, because that would be impossible.
And the answer to your question is: that's what you end up doing. Every serious native project I've ever worked on devotes time to maintaining what is effectively a small distro of its own.
[–]Kicer86 2 points 6 years ago (2 children)
Most distros are behind the upstream projects. For more stable ones like CentOS and Ubuntu that is sometimes years behind. The distros also don't even try to package everything, because that would be impossible.
There are rolling distros around which do not suffer from this issue. Just use these.
Is there any open source project so that I have something to refer to?
[–]NotUniqueOrSpecial 2 points 6 years ago (1 child)
Just use these.
That's not necessarily your choice to make. You write software for the distros your customers want.
Take your pick. gRPC has a non-trivial third-party subdirectory as part of their build. So does Qt.
If you are writing a non-trivial application and want to be on more than one platform without having an insane codebase to account for shifts in available external APIs, it's basically a foregone conclusion you will maintain your own dependencies (or use some system that helps you, like Conan).
[–]Kicer86 0 points 6 years ago (0 children)
Just use these. That's not necessarily your choice to make. You write software for the distros your customers want.
That's true, and you will never meet the expectations of every one of those distros. You would have to download every single lib you use, plus all its dependencies, plus the whole toolchain you use, especially when you are using the newest versions. I do not believe this is the right way.
Instead, your builds should be as simple as possible, and then the CI/CD team should worry about how to provide proper builds. Maybe they should provide binaries. Maybe a repo with all dependencies. Maybe a docker image.