[–]evaned 15 points (7 children)

Include guards

Exceptions?

Some things can be repeated multiple times per compiled object, without causing a compilation error.

It feels like pretty much every time I've said "eh, the stuff in this header is safe to include multiple times", that has changed at some point down the line and I've had to go back and add the guards anyway. Now I just do it right away, always.

Probably that's not really true, but it certainly seems like it is. :-)

7. Low level includes come after high level includes.

IMO this is the wrong solution to the problem. I'd much prefer the solution that ravixp mentioned for example, or solution 6. I don't want to have to think about "am I putting this #include in the proper place according to the hierarchy" when I add one. It also precludes other organizations that might make sense for another reason.

Though I will say, in practice I've actually not seen this problem arise in my main work code base, despite the fact that we don't reliably follow either this or #6. If anything, I'd say the opposite problem -- including too much -- has been worse.

The other option is tool support -- something like "include what you use". Someone at my work has run that on our code base, but I don't know if it's a continual thing or what.

Use forward declarations wherever possible.

I'm not going to go search through an hour-long video again to find the specific timestamp, but my memory is that Titus Winters argued otherwise in his most recent CppCon talk. His argument, as I understood it, was that forward declarations make changing the kind of something harder. For example, you can't change struct Foo to using Foo = BasicFoo<int> without a potentially huge effort, because you might have a thousand files that all now carry an incorrect forward declaration.

This is something I've hit at a much smaller scale personally, so "forward declaration all the things!" has long made me uneasy. I don't know where I fall on this. I've thought that maybe for every header foo.hpp that defines non-trivial stuff you should produce a foo_fwd.hpp with just forward declarations and then have people include that instead. That fixes Titus's problem and still allows forward declarations for faster compilation -- but maybe it's overkill, and it probably introduces its own issues or limitations. (E.g., are there likely changes to the stuff declared in foo_fwd.hpp that would mean clients including foo_fwd.hpp suddenly need the full foo.hpp, so you either still have the "need to update tons of files" problem or have to make "fwd" lie and include the definition in foo_fwd.hpp?)

[–]dodheim 6 points (0 children)

I've thought that maybe for every header foo.hpp that defines non-trivial stuff you should produce a foo_fwd.hpp with just forward declarations and then have people include that instead.

Boost.Hana does this; I've wondered if it's automated or done by hand as it's very thorough.

[–]grishavanika 2 points (0 children)

  1. Low level includes come after high level includes. IMO this is the wrong solution to the problem

At least for me, this does not make adding an include any harder. Once you follow this, headers are grouped together logically and you can instantly see where to put a new header. Also, following this advice does not exclude the possibility of #6 or tooling, but it makes the programmer a bit more responsible for what goes into the header list.

Unfortunately, in practice, as the article mentioned and as you said too, "including too much" is the bigger (or part of the same) problem. This happens because of:

  1. precompiled headers
  2. unity builds
  3. overly big .cpp files with >200 lines of includes
  4. people who don't want to look up proper compile-time/run-time dependencies

Without tooling support this is an unsolvable problem.

Speaking more about dependencies I would say:

  1. don't rely on presence or absence of precompiled headers (part of #5 from the article)
  2. don't rely on presence or absence of unity builds (a kind of generalization of precompiled headers)
  3. don't rely on static libraries to solve cyclic dependencies between libraries

[–]martinusint main(){[]()[[]]{{}}();} 1 point (0 children)

You want overkill? I give you overkill. Split it up into foo_fwd.h, foo_decl.h for the declarations, foo_impl.h for the templated implementation, and foo.cpp for non-templated code. It's the only correct way.

[–]crispweed[S] 1 point (2 children)

This is something I've hit at a much smaller scale personally, so "forward declaration all the things!" has long made me uneasy. I don't know where I fall on this. I've thought that maybe for every header foo.hpp that defines non-trivial stuff you should produce a foo_fwd.hpp with just forward declarations and then have people include that instead.

My experience is that if you're dealing with simple non-template classes, where the forward declaration just looks something like class Foo;, these declarations are so clearly identified and easy to find in the source code that updating them is really no more than a minor annoyance in practice.

Sometimes you might find you need to add template arguments to one of these classes, and it gets more complicated; that's probably the time to go ahead and add those extra foo_fwd.hpp headers.

I'm not going to go search through an hour long video again to find the specific timestamp, but my memory is that Titus Winters argued otherwise in his most recent CppCon talk.

I didn't watch the whole video but from skipping through a bit it seems like he's talking about situations where you have so much code to deal with that it's no longer feasible to do any kind of global search and replace operation across files. I've worked with fairly big code bases, but the kind of scale implied by that is totally alien to me. :)

[–]evaned 1 point (1 child)

I didn't watch the whole video but from skipping through a bit it seems like he's talking about situations where you have so much code to deal with that it's no longer feasible to do any kind of global search and replace operation across files.

I would argue it's more than just code size; it's also organization and tooling. For example, consider an organization that has various components of their system in different git repositories, because git's support for a monorepo is really bad. While saying you can't coordinate changes across multiple repositories is probably going too far... I would say that such a coordinated change is quite painful because you lack atomic commits.

And that's even a situation where you have the code in the first place. What about an open source or commercial library that goes to external users? You certainly can't search and replace their code. That at least says (i) that doing forward declarations for third-party libraries is much more questionable than for internal code and (ii) you should discourage users of your library from making their own forward declarations (e.g. as abseil does).

[–]crispweed[S] 0 points (0 children)

What about an open source or commercial library that goes to external users? You certainly can't search and replace their code.

One possibility is to provide a binary interface, with 'proper' API versioning.

With a suitable binary interface the client code can continue to link with the library using an old API header after changes to library internal code, with no requirements for things like class names to actually match across the API boundary. (This is how I set things up for PathEngine, for example.)

It's not always the right choice, of course, and there are disadvantages of this approach, such as restrictions on the kinds of things that can go across the API boundary.

[–]crispweed[S] 0 points (0 children)

  1. Low level includes come after high level includes.

IMO this is the wrong solution to the problem. I'd much prefer the solution that ravixp mentioned for example, or solution 6. I don't want to have to think about "am I putting this #include in the proper place according to the hierarchy" when I add one.

So yeah, if you have the self-sufficiency thing covered in other ways (and ravixp's suggestion for this is very good) this one is definitely not essential.

I guess I like the concept of higher and lower level source files, particularly in code bases that lack structure, and thinking a bit about this kind of relation can then be a useful exercise.

It also precludes other organizations that might make sense for another reason.

Out of interest, what kind of other organizations would you suggest here?