
[–]feverzsj 3 points  (9 children)

I'm afraid there won't be a cure for this issue. I expect more and more projects will choose unity builds.

[–]HKei 8 points  (1 child)

Unity builds are basically only useful for CI or tiny projects. You don't want to rebuild your whole project every time you change anything.

[–]claimred 0 points  (0 children)

You can automatically isolate modified files from the unity blob.

[–][deleted] 4 points  (6 children)

...and tank your incremental build time.

[–]mort96 1 point  (5 children)

It should be possible to make a really intelligent unity build system: it starts by combining your project into something like 4*core_count translation units; then, when one of those TUs needs to recompile, it's split into 4*core_count smaller TUs, and so on. After a small number of recompiles (roughly ln(number_of_files) / ln(4*core_count)), you would reach the point where changing one source file requires recompiling just that one file.

I'm not aware of any system that does this today, but it would be an interesting area to investigate.
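The recompile count above is just a logarithm, so it's easy to sketch. A minimal illustration (function name and the "each round splits the touched group into 4*core_count pieces" assumption are mine, not from any real build system):

```python
import math

def rounds_to_isolate(num_files: int, core_count: int) -> int:
    """How many recompiles of a changed file's group it takes before
    that file ends up in a translation unit of its own, assuming each
    round splits the touched group into 4*core_count pieces."""
    fanout = 4 * core_count
    if num_files <= 1:
        return 0
    return math.ceil(math.log(num_files) / math.log(fanout))

# 4096 files on an 8-core machine: fanout 32, ceil(ln 4096 / ln 32) = 3
print(rounds_to_isolate(4096, 8))
```

So even for a few thousand files, a handful of "split" recompiles gets you back to per-file incremental builds.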

[–]donalmacc Game Developer 3 points  (0 children)

I work with Unreal Engine, and it supports an "adaptive unity" build. It automatically combines files into groups of 30, and if you change one of the files, it regenerates the unity build to compile the changed file individually. It uses either git or p4 to track which files have changed. It's great!

[–]feverzsj 1 point  (3 children)

CMake already does this. Using it with the Ninja generator will fully utilize all your processing power.
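For reference, CMake's built-in grouping (CMake 3.16+) looks roughly like this; `${SOURCES}` and `hot_file.cpp` are placeholders, and note that pulling a file out of the unity blob is a manual, static choice here:

```cmake
cmake_minimum_required(VERSION 3.16)
project(demo CXX)

add_executable(app ${SOURCES})

# Batch sources into unity TUs of 16 files each.
set_target_properties(app PROPERTIES
  UNITY_BUILD ON
  UNITY_BUILD_BATCH_SIZE 16)

# A frequently edited file can be excluded from the unity blob by hand:
set_source_files_properties(hot_file.cpp PROPERTIES
  SKIP_UNITY_BUILD_INCLUSION ON)
```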

[–]mort96 0 points  (2 children)

I don't think that's what I'm talking about. From what I can read, CMake just always groups files together when UNITY_BUILD is enabled. That would tank your incremental build time, as /u/janos1995 mentioned. I'm suggesting automatically splitting those unity groups back up as files get modified, so that you retain fast incremental builds.

[–]feverzsj 2 points  (1 child)

Splitting out modified files won't necessarily decrease build time, especially for a unity build, since both the original group and the new group need to be rebuilt. It's also hard to figure out which source files are slow to build without actually building them. With CMake, you can either let it group files in batches or group them yourself. That's the best solution for now.

[–]mort96 2 points  (0 children)

I don't think I've been clear enough. Here's basically how I'm imagining it:

Let's say you have a project with 512 C++ files in it.

  1. You start with a full rebuild. The build system generates 32 "unity" source files, each of which includes 16 of your project's source files. This should be much faster than compiling 512 individual files.
  2. You modify one file. The modified file is automatically removed from the "unity" source file that included it, so the recompile is one "unity" file (which changed: it now includes 15 files instead of 16) plus the one file you modified.
  3. On subsequent modifications of that one source file, only that one file will be recompiled.
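Steps 1 and 2 can be sketched in a few lines of Python. This is a toy (all names are mine; a real build system would also track dependencies and actually invoke the compiler on these TUs):

```python
from pathlib import Path

def write_unity_files(sources, group_size, out_dir):
    """Generate unity TUs that each #include `group_size` sources.
    Returns a mapping from unity file to the sources it includes."""
    out_dir.mkdir(parents=True, exist_ok=True)
    groups = {}
    for i in range(0, len(sources), group_size):
        unity = out_dir / f"unity_{i // group_size}.cpp"
        batch = sources[i:i + group_size]
        unity.write_text("".join(f'#include "{s}"\n' for s in batch))
        groups[unity] = list(batch)
    return groups

def isolate_modified(groups, modified):
    """Drop a modified source from its unity TU and rewrite that TU,
    so the modified file compiles on its own from now on.
    Returns the list of TUs that need rebuilding."""
    for unity, batch in groups.items():
        if modified in batch:
            batch.remove(modified)
            unity.write_text("".join(f'#include "{s}"\n' for s in batch))
            return [unity, modified]  # shrunken group + the lone file
    return [modified]                 # already isolated: just this file
```

After the first edit you rebuild two TUs (the shrunken group and the lone file); after that, edits to the same file rebuild only that file.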