Distributed Compilation, a Programmer's Delight
cyberpead writes in with a developerWorks article on open source tools that can help speed up your build process by distributing compilation across multiple machines on a local area network.
Bulk building is more effective (Score:4, Informative)
Due to a strange quirk in the way compilers are designed, it's (MUCH) faster to build a dozen files that between them #include every source file in your project than to compile thousands of files individually.
Once build times are down to 5-15 minutes you don't need distributed compiling. The link step is typically the most expensive one anyway, so distributed compiling doesn't buy you much.
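For readers who haven't seen the technique: a bulk (or "unity") build compiles a single translation unit that #includes many source files, so each shared header is parsed once instead of once per file. A minimal sketch, with hypothetical file names:

    // unity.cpp -- one translation unit that pulls in the whole project.
    // Shared headers get parsed once for this single compile, rather than
    // once per .cpp file. All file names here are hypothetical.
    #include "lexer.cpp"
    #include "parser.cpp"
    #include "codegen.cpp"
    #include "main.cpp"

One invocation (something like g++ -O2 -o app unity.cpp) then replaces thousands. The usual caveat: file-scope statics and identically named internal helpers from different .cpp files can collide once they share one translation unit.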
Minor error (Score:5, Informative)
Re:Minor error (Score:3, Informative)
Maybe there are some special cases, but I've never needed a shared source repository in order to use distcc.
They also say the machines need to have exactly the same configuration, and while they do elaborate on that a little, it's not strictly true. Depending on the source you're compiling, you may only need the same major version of GCC.
Re:What about Excuse #1? (Score:3, Informative)
Re:distcc has one fatal flaw (Score:4, Informative)
In pump mode, distcc runs the preprocessor remotely too. To do so, the preprocessor must have access to all the files that it would have accessed if it had been running locally. In pump mode, therefore, distcc gathers all of the recursively included headers, except the ones that are default system headers, and sends them along with the source file to the compilation server.
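To make the quoted behaviour concrete, here is a hypothetical source file annotated with what pump mode would and would not ship to the compilation server:

    // example.cpp -- hypothetical file; the comments just restate the
    // pump-mode behaviour described above.
    #include <vector>     // default system header: not sent; the server
                          // is assumed to have an identical copy
    #include "config.h"   // project headers: discovered by recursively
    #include "util.h"     // scanning includes, then bundled together
                          // with example.cpp and sent to the server

    int count() { return 42; }  // compiled remotely; the object file
                                // comes back over the wire

The build is then driven through the pump wrapper, e.g. pump make -j16 CC="distcc gcc" CXX="distcc g++".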
Preprocessing in C (Score:5, Informative)
I guess you are referring to the preprocessing step of C and C++ compilers, which was really a lame hack, I think. If you have a lot of include files, preprocessing produces large intermediate files containing a lot of overlapping code that has to be compiled over and over again.
Preprocessing should have been removed a long time ago, but because of nasty backwards compatibility issues it never was. Other languages, such as Java and D, solve this problem in a much better way, just as Turbo Pascal did with its TPU files in the late 1980s.
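A quick way to see the overlap the parent is talking about: give two trivial source files one shared header, and the preprocessor pastes the header's full expansion into each of them, so the compiler parses the same text twice. A minimal sketch with hypothetical names:

    // common.h -- hypothetical shared header
    #ifndef COMMON_H
    #define COMMON_H
    #include <string>   // itself drags in thousands of lines
    std::string shared_name();
    #endif

    // a.cpp and b.cpp each begin with:
    #include "common.h"
    // ...so the full expansion of common.h (and of <string>) is
    // compiled once per translation unit that includes it.

Running just the preprocessor (g++ -E a.cpp | wc -l) shows the scale of the problem: often tens of thousands of lines for a file that is only a few lines on disk.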
Re:Preprocessing in C (Score:2, Informative)
It's not sufficient for large projects; disk I/O is still a very large overhead when compiling. Switching to a 'unity' build scheme reduced our compile times significantly, more so than the distributed compile solution we used, since that still had to read the files off disk multiple times in addition to sending them over the wire to multiple machines. The .cpp and .h files come to about 110 MB on our project.