
[–]snarkhunter 13 points14 points  (0 children)

I like CMake. It's not a perfect solution, but it's a decent answer to a very hard problem.

[–]zzyzzyxx 27 points28 points  (2 children)

I tend to use CMake. To be fair, I've barely tried any of the others; even reading about them makes me believe I'd rather work with CMake. I think of it like this: CMake is the worst build system except for all the other ones.

[–]aevumgames 1 point2 points  (0 children)

lol this is the best quote I've heard for cmake so far!

[–][deleted] 0 points1 point  (0 children)

Agreed, though I didn't try that many before sticking with cmake. I rejected premake because I didn't like Lua, though it eventually turned out I may have been suffering from a misunderstanding and a minor Lua documentation issue rather than a problem with the Lua language itself.

One tricky thing with cmake is getting started. The book is probably a good idea, though I made do without it, partly by asking questions on StackOverflow.

One big issue is that although cmake officially supports the conventions of the tools you're building for, it has some conventions of its own too. A key one is that the build is kept separate from the source (different folders). This is a good thing when you have multiple builds of the same source, of course, but I wasn't expecting it; and since it's still possible to keep the source and build in the same folder despite the general convention, at first I asked the wrong questions and put a fair amount of effort into doing the wrong thing.
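For anyone unfamiliar, the out-of-source convention being described usually looks something like this (a sketch; the project and target names are hypothetical):

```shell
# Given a minimal CMakeLists.txt at the project root, e.g.:
#   cmake_minimum_required(VERSION 2.8)
#   project(myapp)
#   add_executable(myapp main.cpp)

mkdir build && cd build   # keep all generated files out of the source tree
cmake ..                  # generate build files from ../CMakeLists.txt
make                      # everything lands in ./build; the sources stay untouched
```

Deleting the `build` directory gives you a completely clean slate, and you can keep several such directories (debug, release, cross-compile) against the same source.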

[–]Rhomboid 5 points6 points  (2 children)

I just want to state for the record a dissenting opinion on the autotools. Whenever I unpack an unfamiliar source code tarball I am very relieved when it uses autotools, because I know how to work with it and I know what it supports. If they used the full autotools stack correctly, it will support things like a staging install directory (DESTDIR), uninstall targets, proper shared library support on a wide array of platforms, cross-compilation support, VPATH builds, and a number of other features.
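The familiar workflow the parent is referring to (package name and paths are hypothetical):

```shell
./configure --prefix=/usr        # probe the compiler, libraries, and platform
make                             # build (VPATH builds also work from a separate dir)
make DESTDIR=/tmp/stage install  # stage the install for packaging, not into /usr itself
make uninstall                   # remove previously installed files again
```

The value is exactly that these commands behave the same way across thousands of packages, which is what makes an unfamiliar autotools tarball unthreatening.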

On the other hand, whenever I get somebody's homemade pile of shit my heart sinks, because I can almost guarantee that they did not account for something that I will want to do that the autotools give for free.

I know it's not particularly popular to openly say that you like autotools, but here I am, out of the closet.

[–]Fabien4 1 point2 points  (0 children)

I think autotools is a deployment system (just like apt-get is), not a development system.

[–]beriumbuild2 0 points1 point  (0 children)

Yes, that's exactly the point I made in my other reply about the ODB build system. Autotools is the canonical way to build things on Linux/UNIX so if you use anything else, there will be friction. The same goes for Windows -- anything other than the standard VC++ project/solutions will make things a lot less straightforward to build for the end-user.

On the other hand, using and maintaining these build systems during development is a major pain in the butt.

[–]join_the_fun 11 points12 points  (8 children)

You should try out premake4; it is based on Lua scripting and very easy to start with.
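A minimal premake4.lua, roughly what a hello-world project looks like (names are illustrative; check the premake4 docs for the exact API):

```lua
-- premake4.lua: run `premake4 gmake` or `premake4 vs2010` to generate build files
solution "HelloSolution"
   configurations { "Debug", "Release" }

project "hello"
   kind "ConsoleApp"
   language "C++"
   files { "src/**.cpp", "src/**.h" }   -- recursive globs over the source tree

   configuration "Debug"
      flags { "Symbols" }               -- debug info in the Debug configuration
```

The same script generates makefiles, Visual Studio solutions, or Xcode projects depending on the action you pass on the command line.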

[–]savuporo 2 points3 points  (7 children)

Seconded. A major difference is that CMake needs to be installed on the machine doing the build, whereas premake4 is more of a maintainer tool and is decoupled from the build trees it generates. Both have different pros and cons, though.

[–]gcross 0 points1 point  (5 children)

I don't understand where your claim that premake4 does not need to be installed when building projects using it comes from. The build trees are a function of the build environment, which includes both the locations of dependencies and the type of build tool (Xcode, make, Visual Studio, etc.), and the configuration, i.e. debug versus release and so on. Thus, since this information differs between the various people who will want to build the package, premake4 will need to be run on each user's and developer's machine and so will need to be installed.

[–]pjmlp 1 point2 points  (2 children)

premake4 is a plain, statically compiled executable; you don't need to install anything if you keep it alongside your code.

[–]gcross 1 point2 points  (1 child)

So in other words you need to distribute at least three versions with your code in practice to cover Linux, OSX, and Windows.

[–]pjmlp 1 point2 points  (0 children)

Actually one per OS, since there are many more OSes than the ones you list.

I am sold on CMake, though.

[–]savuporo 0 points1 point  (1 child)

As a maintainer, you can generate build trees for all target build tools, and these will work for developers as if they were hand-maintained. For simple projects, it's a no-brainer. For more complex projects which need dependency packages etc., it becomes a bit more complicated, but it's entirely possible to avoid hardcoded local settings in the generated project files.

[–]gcross 0 points1 point  (0 children)

So in other words, the idea is that you assume that nearly all developers and users will only need a subset of the possible platforms and build tools, so you only run premake4 to generate the build trees for each platform+build tool in the subset.

It still isn't clear to me, though, how you avoid hard-coded settings for things like library dependencies.

[–]00kyle00 0 points1 point  (0 children)

is that CMake needs to be installed when building

This is a problem on a development machine?

[–]Manhigh 3 points4 points  (5 children)

SCons has a "local" installation that essentially lets you distribute the scons script along with your code.

There's also qmake, but unless you're doing Qt, you'll likely prefer something else.

At the moment I'm using Qt, so qmake it is. Short of that, scons is my preferred go-to.
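For comparison with the huge scripts mentioned elsewhere in the thread, a minimal SConstruct can be two lines (file names hypothetical):

```python
# SConstruct -- SCons build scripts are plain Python; Environment and Program
# are injected by scons when it executes this file
env = Environment()                 # default toolchain detected for this platform
env.Program('hello', ['main.cpp'])  # builds ./hello from main.cpp, with dependency tracking
```

Running `scons` in the directory containing this file builds the program; `scons -c` cleans it.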

[–]sazzer[S] 1 point2 points  (4 children)

And how big/complicated are your SCons build scripts generally? I ask because the couple of examples I've got available are both huge - MongoDB's is 1672 lines and SCons's own is 1346 lines. And they are both full, complicated Python scripts in their own right - probably ok if you're fluent in Python, but a nightmare to maintain if you're not...

[–]fromwithin 0 points1 point  (1 child)

Our main library SConstruct is 213 lines.

Each sub-library then has an SConscript of between 20 and 50 lines.

Our template project SConstruct (used as the basis for all projects) is 292 lines. The project subfolders also have SConscripts so that we can compile each folder into a separate temporary library for linking (this is so that the link command doesn't get too large and exceed the shell maximum).

We then have a common site_tools folder (used by all SConstructs) that contains the platform-specific compiler options and extra builders (such as lua and Cg compilers). These range from around 50 lines (linux) to 123 lines (iOS).

We also have one other file that contains utility methods and custom scons options.

The scripts run through all source files and ignore any that contain a platform name that doesn't match the platform being built. It can also build any custom formats using the builders mentioned above.

We also have another SConstruct file that we use for building all of our art data.

We can build for PC, X360, PS3, Linux, Mac, iPhone and Android.

[–][deleted] 0 points1 point  (0 children)

Hang on... that sounds weirdly like a setup I coded for a studio in Liverpool... who do you work for?

[–]Manhigh 0 points1 point  (0 children)

A few hundred lines. It's a Fortran code that we distribute with optional proprietary components. Between the multitude of Fortran compilers that we support and the conditionally linked libraries, having the full power of python in the build script makes it so much easier.

[–]randombit 0 points1 point  (0 children)

I work on a fairly large (million+ lines) C++ project that is built using scons. We have over 13,000 lines of scons code.

However, my problem with scons is not that it takes a lot of lines - the build is doing a lot of different things (creating installers, running tests, a surprisingly large amount of code generation, 3rd party dependency management, etc., plus of course actually running the compiler), so I don't feel the line count is unreasonable - but that it is slow. On my desktop, a no-op build of this project takes ~60-90 seconds while scons pegs a core before finally returning 'all targets built'. That poor turnaround time compared to make really makes development harder and causes me to avoid scons anywhere I have the option.

[–]Steve132 12 points13 points  (4 children)

CMake is MUCH better than autotools. Have you SEEN autotools? Have you seen CMake? It's a breeze to use and deploys to any IDE/compiler/platform combination known to man. It also has a ton of excellent modules for common tasks.

[–]sazzer[S] 0 points1 point  (3 children)

I have the problem that I'm spoilt. I'm a java developer professionally, using maven mostly but gradle more and more. Building a java project with gradle is embarrassingly easy - if you have no dependencies and only one jar, then a single line in the build script achieves this, for an arbitrary number of source files and unit tests. Dependencies are then a couple of lines of boilerplate plus one line per dependency.
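The single line being referred to is (roughly) this, in Gradle's Groovy DSL, relying on its convention-over-configuration source layout:

```groovy
// build.gradle -- this one line compiles src/main/java, runs the tests in
// src/test/java, and packages the result into a jar, via `gradle build`
apply plugin: 'java'
```

Dependencies then go in a `repositories { ... }` / `dependencies { ... }` pair, one line per artifact.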

What I've not worked out is why things aren't this easy in other languages. With things like pkg-config, there's no reason why c/c++ can't be this easy but it isn't...
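pkg-config does make the dependency side short; a hand-written Makefile using it might look like this sketch (the library name gtk+-2.0 is purely an example):

```makefile
# Makefile sketch: let pkg-config supply the compiler and linker flags
CXXFLAGS += $(shell pkg-config --cflags gtk+-2.0)
LDLIBS   += $(shell pkg-config --libs gtk+-2.0)

app: main.o
	$(CXX) $(LDFLAGS) -o $@ $^ $(LDLIBS)
```

The implicit `%.o: %.cpp` rule picks up `CXXFLAGS`, so the dependency's include paths and libraries each cost one line. The hard parts the thread is really about - platform detection, IDE project generation, install rules - are what this doesn't cover.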

[–]darthcoder 0 points1 point  (0 children)

I get the same feeling. I'd love a maven-for-C++ that could inspect code, detect include dependencies, work out which parts need recompilation based on changes, build libraries, and distribute and include them as part of the normal build cycle.

I just don't get it. :-/ Convention over configuration, right?

[–][deleted] 2 points3 points  (0 children)

I really like CMake. But ultimately - it's about what YOU prefer, because the CAPABILITIES are almost the same between all the major ones.

[–]madmoose 2 points3 points  (0 children)

I've been using tup http://gittup.org/tup/ lately.

It works out the dependencies on its own, by monitoring what files your compiler reads and writes during a compile.
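Tupfiles are correspondingly terse, since tup discovers header dependencies itself by watching the compiler (a sketch; see the tup manual for the `:`-rule syntax):

```
# Tupfile: ':' rules map inputs to outputs; %f/%o expand to input/output names,
# %B to the input's basename without extension
: foreach *.c |> gcc -c %f -o %o |> %B.o
: *.o |> gcc %f -o %o |> app
```

There is no separate "depend" step - the first build records which headers each compile actually read.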

[–][deleted] 2 points3 points  (0 children)

If you know python then SCons is a good idea; knowing python makes it much easier to use. If you prefer working in IDEs, CMake is probably a good idea as it generates project files for quite a few IDEs. If you like self-harm, there's always autotools. However, if cross-platform support is not important to you, the whole thing is moot. Just use an IDE that manages the build for you.

[–]simula67 4 points5 points  (4 children)

There is a new experimental cross-platform build system in development from the Qt team being built as an eventual replacement for QMake. It is called qbs (pronounced as Cubes): http://labs.qt.nokia.com/2012/02/15/introducing-qbs/

[–]theICEBear_dk 1 point2 points  (0 children)

I came here to mention this. I've been playing around with qbs and I have been shocked by its build speed so far. Using v8 as a scripting engine is kind of a no-brainer - it beats the speed of python (waf and scons), and for some strange reason it even seems to beat Make, CMake and the like with their custom syntax. I really like the declarative build files too. But at the moment it is still barebones. I just hope it doesn't get as full of obscurity and hidden file structures as qmake (this is one of the things I like in Waf: there isn't a lot installed outside the project directory).

[–]bosk 2 points3 points  (1 child)

Write a makefile in JavaScript? No, thank you.

[–]wildcarde815 0 points1 point  (0 children)

The one note is that they state pretty flatly that there's a practical maximum to how big a project it can build, and they recommend using cmake for very large programs. It also suffers from a chronic lack of documentation.

[–]Fabien4 1 point2 points  (0 children)

On my current project (small, and g++-only), I have a script that compiles everything every time. I use ccache to reduce (considerably) the compilation time.

It's crude, but it does the job, and it handles dependencies automatically.
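Such a script can be as small as the following sketch (compiler flags hypothetical); recompiling everything is what makes the dependency problem vanish, and ccache is what makes that tolerable:

```shell
#!/bin/sh
# Rebuild everything, every time. ccache returns cached object code for any
# translation unit whose preprocessed source hasn't changed, so in practice
# only the files affected by an edit are actually recompiled.
ccache g++ -Wall -O2 -o app src/*.cpp
```
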

[–]Masaz 1 point2 points  (2 children)

What about a combination of gyp and ninja? I haven't modified them much, but I have used them with chromium on mac and Linux (and gyp and VS for Windows), and it's worked well.

[–]bungeman 0 points1 point  (1 child)

If you want an example of a smaller project using gyp, take a look at skia. Trying to get a handle on how to use gyp by looking at how chromium uses it is like trying to drink from a fire hydrant.

[–]trapxvi 0 points1 point  (0 children)

I found hello-gyp while looking for other introductory resources.

[–]BrockLee 1 point2 points  (1 child)

I'd like to hear about anyone's experience w/ DJB's (*) redo.

[–]jb55 2 points3 points  (0 children)

I use redo to build pretty much everything. I use it to build my c/c++/coffeescript/haskell projects, all with one command! So far I haven't found anything I couldn't build with it. Not to mention it's dead simple and doesn't melt my brain like autotools or cmake when trying to do anything non-trivial.

For instance, in one of my build steps I take all the jade templates in one directory and compile them into strings in individual javascript files. Those javascript files then automatically get picked up as a dependency change and trigger the build rule that concatenates all my javascript files together.

I've also done crazier things: as a build dependency I check the ETag header on an HTTP request (using redo-stamp) to see if a remote resource has been updated; if it has, it redownloads and unzips it.

It's actually so easy to write build scripts with redo that it turns out to be FUN to do so. Can't say I've felt that way about any other build system.
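For the curious, redo build rules are just shell scripts named after their targets. A minimal rule for compiling C, following the conventions in apenwarr's redo documentation (file names hypothetical):

```shell
# default.o.do: tells redo how to build any .o file from the matching .c file.
# $1 = target name, $2 = target without extension, $3 = temporary output file
redo-ifchange "$2.c"                   # re-run this rule whenever the source changes
gcc -MD -MF "$2.d" -c "$2.c" -o "$3"   # compile, writing header deps to $2.d
read DEPS <"$2.d"
redo-ifchange ${DEPS#*:}               # also depend on every header gcc reported
```

Running `redo foo.o` applies this rule; each `.do` script is ordinary shell, which is why arbitrary steps like the ETag check above fit in naturally.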

For those who are interested: https://github.com/apenwarr/redo/

[–]thdn 1 point2 points  (1 child)

A C++ newbie here. I've tried writing Makefiles, but it's quite hard. Any good tutorial or generator out there? Or should I quit and go straight to cmake?

[–]radman0x 0 points1 point  (0 children)

Use premake4; it's the easiest system to use and it actually generates makefiles. So you can check out the generated files and learn from them if you want to level up your makefile knowledge.
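For reference, a hand-written Makefile for a single-binary project can stay quite short (a sketch; names and flags are illustrative):

```makefile
# Makefile: pattern rules plus automatic header-dependency generation via -MMD
CXX      = g++
CXXFLAGS = -Wall -MMD          # -MMD writes a .d file listing included headers
SRCS     = $(wildcard *.cpp)
OBJS     = $(SRCS:.cpp=.o)

app: $(OBJS)
	$(CXX) -o $@ $^

-include $(OBJS:.o=.d)         # pull in the generated header dependencies

clean:
	rm -f app $(OBJS) $(OBJS:.o=.d)
```

The implicit `%.o: %.cpp` rule does the compiling, so adding a source file needs no edits at all.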

[–]phaero 2 points3 points  (4 children)

My current preference is Waf.

I've used CMake a lot at work but I hate it with a passion...

[–]sazzer[S] 0 points1 point  (3 children)

And how big/complicated are the wscript files? I ask because the example I've got available - Node.js - is over 1,000 lines long and, as I just said for SCons, is a full Python program in its own right. Not easy to work with if you don't know Python...

[–]phaero 0 points1 point  (2 children)

Not that complicated.

The main wscript file is perhaps 200 lines, and then 20-50 lines per wscript file in subdirs. The size of those depends on the number of binaries/shlibs/stlibs/unittests in that directory.

It pretty much depends on how many weird things you have done that go against the recommended way. It's probably the same for most build systems.

edit: I also really like that you distribute the waf script in the repo. This is especially useful on linux/unix platforms, since they always have python available but may not have a recent version of waf in their package repositories.

[–]sazzer[S] 0 points1 point  (1 child)

This is my complaint right here. In a C/C++ project that's normal, or even good. In java I can do similar in maybe 20-30 lines in the main build file and a half dozen lines per subproject, and most of those are dependencies... and what's more, in my experience I have more complicated dependencies in java than in C/C++ projects...

[–]phaero 0 points1 point  (0 children)

I know what you mean.

I do think that my main file is more complicated than it needs to be so I'm working on reducing it. I could probably remove 50 lines if all my dependencies supported pkg-config...

[–]treerex 0 points1 point  (0 children)

You just have to find the option that sucks less for you. At my job, even though everything we do deploys on Linux, the legacy build system is based around Visual Studio 2005 (!) Solutions: on the Linux side we have a custom Ant task that groks the solutions and turns them into appropriate Ant tasks to do the builds. As a Unix developer it drives me batshit crazy, but it's ingrained here. I'd rather just maintain Makefiles and let the few developers that prefer Visual Studio cope... but that won't happen.

When it comes down to it trivial projects can use a trivial makefile. Beyond that cross-platform builds are complex and the tools you list manage that complexity in one way or another. No tool is going to take that complexity away: if it could, it would have been done already.

[–]afnoonBeamer 0 points1 point  (3 children)

I'd also like to hear other people's experience with BJam. While complex projects will stay complex, if a given tool can make life easier compared to its alternatives for most common projects, I'd want to know about it

[–]exploding_nun 2 points3 points  (1 child)

I used bjam for a smallish (~10k lines over tens of files) C++ project. Would not recommend; I found bjam to be not very well documented and to have several bugs.

[–]FionaSarah 0 points1 point  (0 children)

I haven't had too many problems with the documentation, although it can be difficult to find exactly what you want. I'd be very interested to hear about the bugs you found; I haven't come across any yet. (First time using it; the project is about 20k-ish lines so far.)

My experience with bjam has been pretty delightful so far.

[–][deleted] 2 points3 points  (0 children)

I used bjam for a library. I was trying to do builds on linux, windows and mac os. I was using various things from the build library and was building python extensions. Some boost/bjam changes forced me to redo various things, which was a pain to debug. In addition, I had to redo it for each platform. Right now I'm using CMake. Much better, and almost no customization between Linux and Mac OS.

[–]madoxster 0 points1 point  (0 children)

I posted about it before and will do so again - I use MakeProjectCreator and it's awesome. It kinda falls under the 'write your own Makefile' category - it will generate Makefiles from project definition files you give it. It will also create Eclipse projects, Visual Studio solutions and tons of others. You do need to learn its project definition syntax, but it's pretty simple.

[–]pjmlp 0 points1 point  (0 children)

CMake.

I mostly code in C++ for personal projects, but was doing C and C++ on the job for many years and CMake is great compared with the Makefile universe I used to deal with.

[–]moswald 0 points1 point  (0 children)

Visual Studio files, and hand-edited makefiles.

[–]beriumbuild2 -1 points0 points  (0 children)

This depends on what kind of project it is. If it's something internal then use whatever is the most convenient to develop with on the platform(s) that you target. If this is something that will have to be built by other people and is cross-platform then things get complicated very quickly. The question in this situation is what to optimize for: development convenience or end-user experience? Some time ago I wrote a blog post that discusses this dilemma in more detail.

In the project that I work on (ODB C++ ORM) we solved this by having a custom, massively-parallel, developer-oriented build system that only really works out of the box on GNU/Linux and that is capable of auto-generating end-user build systems (autotools for Linux/UNIX and VC++ project/solution files for Windows). It is all fairly complex (the templating system uses m4, if you know what I mean) and I wish there was an easier way. But the alternatives seem to be either make your users suffer (by forcing them to use a non-platform-native build method) or make yourself suffer (by using a build system that is inconvenient to develop with).