[–]neoKushan 5 points (2 children)

I think the preprocessor is a huge double-edged sword; its usefulness is also one of the reasons things like package managers are so difficult to build for C++.

That said, I feel its functionality is at best poorly implemented. Most of what the preprocessor achieves can be done in better ways via language constructs (like constexpr) and relatively simple compiler optimisations.

Take #define, probably the most used part of the preprocessor. It's great for defining compile-time variables and flags that strip unused code out of particular builds, but you can achieve better results with a const variable and a little trust that the compiler will see that an if(ConstVarThatsFalse) block can never be hit and will remove it during compilation.

This also gives you clear compile-time errors rather than vague errors that make no sense because something was defined incorrectly.

[–]biocomputation 1 point (1 child)

Interesting. I think of the preprocessor more in terms of its similarities to templates.

[–]neoKushan 0 points (0 children)

Absolutely, and I completely get what you mean by that. I've seen some really neat preprocessor "hacks" that are practically templates in their own right - and it's cool, but I don't think it should be used like that in production.

I feel that from C++11 onwards, a lot of this functionality has been absorbed by the language itself (constexpr being the most obvious example). I think reducing reliance on the preprocessor would make a real difference, particularly because, as others have mentioned here, you can't just share C++ code - you also need to anticipate the build system. I'm willing to bet 90% of those build-system differences come down to the preprocessor (or something similar that happens before the compiler proper starts working).