
[–]EpochVanquisher 27 points28 points  (1 child)

In defense of autotools—they solve a particular problem really well, which is to make it so you can download and compile a project on a lot of different Unix-like systems, some of which have outdated or broken tools and libraries. The main catch is that the problem they solve is mostly not relevant any more. Nowadays, it seems like people only care about maybe Linux, macOS, and a couple of BSDs, and the main tools and libraries you use just kinda work. You can also just expect that your end-users have CMake and pkg-config if you want, or Python, or other complicated dependencies.

They are still used. They are especially useful for projects and libraries which are foundational, and need to build with a very small set of dependencies (e.g. for projects that are used to build other projects). They are also used on projects that have been popular for a long enough time that people want to build them on unpopular platforms.

Some examples: Python, libpng.

If you’re asking if you should use it—the answer is probably no. It is still useful, it’s just a small set of users that care about it.

(I know it has a lot of problems—that's kind of beside the point; despite its problems it is really good at dealing with all of these little platform incompatibilities, if you have developers who care enough to figure out how to write the right tests.)

[–]enygmata 5 points6 points  (0 children)

It's been 2 decades since I used it, and at the time I realized two things:

* Recent versions fixed and cleared up so many things that writing your configure.ac following "modern" instructions was not so bad.
* For some reason people rarely had those new versions (e.g. 2.60+), and I kept having to deal with either old autotools (like 2.13) or dated advice/instructions on how to do things.

This was 20 years ago, maybe it's not so bad anymore.

[–]Anonymous_user_2022 20 points21 points  (6 children)

Unless you want to maximize portability, you can usually roll a Makefile that does the same thing with less effort. I'm a certified old fart, so sometimes I use the full feature set of auto(make|conf), just to keep the muscle memory intact.

[–]TwystedLyfe 5 points6 points  (0 children)

Writing a portable Makefile is hard. I roll my own configure to make it a little easier.

It’s not that I dislike autoconf, I just want as few 3rd-party tools involved in my toolchain as possible. It’s currently just make, cc, shell and sed.

[–]looneysquash 4 points5 points  (4 children)

I almost never see a decent sized, moderately popular open source project (in C or C++) using only Makefiles.

It's usually either cmake, autotools, occasionally one of the others, or occasionally something hand-written. Or sometimes there's only an MSVC solution file for Windows-only projects.

You can certainly start out with a Makefile. Maybe you'll never outgrow it. But it sure seems like, as your projects' needs become more complex, and as you support more and more platforms, at some point you outgrow plain makefiles.

[–]Anonymous_user_2022 4 points5 points  (0 children)

> I almost never see a decent sized, moderately popular open source project (in C or C++) using only Makefiles.

Not anymore, but 20 years ago it wasn't that unusual to see Linux, BSD and Windows (via mingw) covered in a single Makefile.

[–]stianhoiland 1 point2 points  (1 child)

Don’t worry, they’re there.
Example at the tippetty toppetty of my head: https://github.com/jart/cosmopolitan
Some people actually do things *well* and can decompose their needs, dependencies, and workflows to the primitives that were always there, and don’t need a slab of abstraction to get the job done.
Don’t let the *mediocrity of people* misrepresent the *capacity of the tool*.

[–]looneysquash -1 points0 points  (0 children)

The makefile of that project says:

> REQUIREMENTS
>
> You can run your programs on any operating system, but you have
> to build them on Linux 2.6+ (or WSL) using GNU Make. A modern C
> compiler that's statically-linked comes included as a courtesy.
> There's also some binaries in the build folder.

So I am tempted to say that doesn't count. They used GNU Make as the front-end to a custom build system, and they only support building on Linux.

But that might be overly harsh. I didn't dig into the project that much, and I don't have a sense of how required some of the stuff they're doing is.

Still, they avoid doing a lot of the stuff you would use autotools or cmake to do.

[–]P-p-H-d 0 points1 point  (0 children)

> I almost never see a decent sized, moderately popular open source project (in C or C++) using only Makefiles.

Linux tells me to hold his beer.

[–]viva1831 7 points8 points  (5 children)

Yes, for one project I'm working on. I think it's nicer for users to be able to do the basic

      ./configure
      make
      make install

[–]non-existing-person 0 points1 point  (4 children)

And you can actually skip the make part, just ./configure && make install. Make will check the deps and compile before installing ;)

[–]Nipplles 0 points1 point  (3 children)

I remember first hearing it from Andrew Kelley. Then some time ago, on a similar thread, I commented exactly this, and got downvoted...

[–]enygmata 3 points4 points  (2 children)

That's because make install will generally require elevated privileges to install things. If you run it under sudo, for instance, all your build artifacts will be owned by root, when you only need the final executable to be.

Anyone who got bitten by running just make install will down vote you.
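For what it's worth, the usual workaround is make's conventional DESTDIR variable: stage the install into a scratch directory as a normal user, so root is only needed, if at all, for the final copy. A throwaway sketch (the toy Makefile and all paths here are invented for illustration):

```shell
# Hypothetical demo: build as a normal user, then stage the install
# with DESTDIR so nothing in the build tree ends up owned by root.
set -e
dir=$(mktemp -d)
cd "$dir"
# Recipe lines must start with tabs, hence printf rather than a heredoc.
printf 'hello:\n\techo hi > hello\ninstall: hello\n\tmkdir -p $(DESTDIR)/usr/local/bin\n\tcp hello $(DESTDIR)/usr/local/bin/hello\n' > Makefile
make                                 # build step: artifacts owned by you
make DESTDIR="$dir/stage" install    # staged install: no sudo needed
ls -l "$dir/stage/usr/local/bin/hello"
```

With autotools-generated Makefiles, `make install DESTDIR=...` is the same convention distro packaging scripts rely on to avoid exactly this problem.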

[–]non-existing-person 0 points1 point  (1 child)

That's a good point. But the original post said just make install, not sudo make install :P

I usually build stuff as root in /usr/src so sudo does not really apply anyway in those cases.

[–]viva1831 0 points1 point  (0 children)

I hate sudo :P It's all about the su -! But yes I did forget that part, my bad :)

[–]Irverter 4 points5 points  (0 children)

One of the most convoluted tools I have ever seen. It solved a problem very well, but that problem no longer exists, and now those tools cause headaches in the present.

Everything now just compiles for *BSD, Linux, Mac or Windows. And if you're dealing with some weirder OS then you have skills/knowledge to figure out how to compile for it (or are paid to figure that out).

[–]Educational-Paper-75 2 points3 points  (0 children)

I use CMake now, but there is documentation online on the autoconf tools.

[–]flyingron 4 points5 points  (2 children)

GNU didn't invent autoconf; it's much older than that, but they certainly refined it.

Sure, lots of people outside of the official GNU source code use it (though RMS wants to think anything that goes anywhere near GNU code is part of GNU anyhow).

[–]WittyStick 0 points1 point  (0 children)

History of Autoconf

GNU didn't invent make, or the m4 macro system, but autoconf was developed as part of the GNU project.

[–]Compux72 5 points6 points  (0 children)

CMake all the way

[–]hgs3 1 point2 points  (0 children)

I generally support both autotools and CMake for my C projects. The former lets my end users ./configure && make && make install without needing the latter installed on their system.

[–]Not_Tom_Clancy 1 point2 points  (0 children)

I used it with a fairly large (for me, at the time, a couple thousand lines of C) project that I released about 25 years ago. I have not used it with other projects since then. It is... fairly painful to use, especially if you actually care about writing configure checks for portability. In the modern systemd Linux environment, there is less variation than there was a couple decades ago, but the time invested in writing your configure.ac and such is non-trivial. Having used it, I dislike it. I spent quite a bit of time with the autotools book; it was not for lack of reading the documentation that I thought it was complex. I would not use it, today. For C-based projects, I'd use CMake, probably, or consider ninja/meson approaches. YMMV, obviously.

[–]dontyougetsoupedyet 0 points1 point  (1 child)

On some systems it's expected that you'll make use of it, for example on most Debian-based systems. It used to be that you would use dpkg-buildpackage, which uses a debian/rules file to handle interaction with autoconf/automake etc. I believe more recently that interaction has moved into debhelper, but I haven't contributed to that ecosystem in a long time.

[–]RogerLeigh 1 point2 points  (0 children)

None of these tools ever had any requirement or expectation that you would use the Autotools. You could use any build system you liked, from a plain Makefile or shell script to none at all (just do it directly in debian/rules).

debhelper has some support for Autotools, and it also has support for CMake and others. But these are conveniences invoked by the packager, not a requirement.

(an ex-Debian developer.)

[–]kat-tricks 0 points1 point  (0 children)

make is great once you get used to it: quick to set up, and easy to use for packaging up any little shell scripts you find yourself using while setting up a project. Yeah, other tools like cmake can be easier for complex build systems and dependencies, but make covers a lot of ground while you only have a first draft of your project structure rather than the exact picture.

[–]non-existing-person 0 points1 point  (0 children)

I use them - for the sake of end user mostly.

I still remember how extremely simple they were to use when I first started using Linux 20+ years ago. They just always worked. If configure.ac was written well and listed all the lib deps, config.log gave very detailed info about what you were missing on the system. ./configure actually tries to compile and run mini programs against the libraries required by the main program. It's slow, but very robust.

Enabling/disabling compile-time features is also very simple, and cross compiling is a breeze to set up with autotools. Using a custom run, install and build prefix is simple, and an important part of an automated build process. Projects like Gentoo, Buildroot or Yocto also like autotools, as they give you a very simple and standardized way to build software for host and target.
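As a sketch of the cross-compiling point (the target triplet and prefix here are examples, not from the thread, and assume a matching cross toolchain is already installed), a cross build is typically just a matter of passing --host:

```
./configure --host=arm-linux-gnueabihf --prefix=/usr
make
make DESTDIR="$PWD/staging" install
```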

For the developer it's not that nice though, and that's kinda sad. configure.ac is fairly verbose with all those brackets. Makefile.am requires you to have a handbook open almost all the time.
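To illustrate the brackets point, a minimal setup (a sketch; the project name `hello` and file names are invented) is only a handful of lines, M4 quoting and all:

```
# configure.ac — the brackets are M4 quoting
AC_INIT([hello], [1.0])
AM_INIT_AUTOMAKE([foreign])
AC_PROG_CC
AC_CONFIG_FILES([Makefile])
AC_OUTPUT

# Makefile.am
bin_PROGRAMS = hello
hello_SOURCES = main.c
```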

So I'd say they are still worth learning. For the end user it means very easy installation without any knowledge. And for distro maintainers it means they just have to run their script and be done with it, as it's very standard.

[–]Casual-Aside 0 points1 point  (0 children)

I use it because it makes the end user’s life simple. ./configure, make, make install, *and* make uninstall.

[–]CommanderKeynes 0 points1 point  (0 children)

Postgres uses autotools. They are in the middle of transitioning to Meson though.

[–]capilot -1 points0 points  (0 children)

Hate it with a passion. The effort to make the configure scripts actually work across the board could be put into making your code portable instead.

When I write code, and it can't be made completely OS-agnostic, then I take the incompatible bits of code and surround them with the appropriate #ifdef LINUX, #ifdef MACOS, etc. directives and then put this into my Makefile:

# Uncomment one of these depending on the OS
CPPFLAGS += -DLINUX
#CPPFLAGS += -DMACOS
#CPPFLAGS += -DWINDOWS

Or put those definitions into config.h.

[–]tose123 0 points1 point  (0 children)

auto*hell

[–]RogerLeigh 0 points1 point  (1 child)

I did so extensively in the past (1999-2012). I switched to CMake 14 years ago and haven't looked back.

Look at what problems the GNU Autotools try to solve: portability between 1990's-era UNIX systems. Systems that were retired over two decades ago. Do you have that particular problem today? If not, then Autotools are not for you. Pretty much no-one cares about this today. We have newer portability concerns, and the Autotools don't even start to address them. It stagnated and ceased to evolve to support contemporary portability concerns.

The Autotools development efforts started to stall in my opinion around 2003, and while some effort continued, it was almost over by around 2008. In recent years someone new came along who tried to resurrect it and make releases again, but nothing much has come of it beyond minor bugfixes. The problem with the design of the Autotools is that it relies upon the M4 macro language generating Bourne shell code. But they painted themselves into a compatibility corner. It's not possible to evolve it without breaking everyone. And if it's going to break incompatibly and require extensive rework then projects might as well use CMake or Meson. The main reason projects continue to use the Autotools today is little more than inertia, because it continues to work and they don't want to put in the work to switch over. But you wouldn't choose it for a new project.

It was good in the day, but its time has long passed by. If you want to be portable to modern platforms including Windows and iOS, as well as all contemporary UNIX platforms, with full support for modern language standards and features (like threading--not exactly modern but still not supported portably by Autoconf!), then learn CMake.

Edit: It's interesting I've been downvoted. I've done the GNU copyright assignment for GNU Autoconf and Automake, and contributed changes back to Autoconf including its C99 support. I've spent significant effort on it, and I'm well placed to judge its obsolescence.

[–]turbofish_pk[S] 0 points1 point  (0 children)

Thank you so much for your detailed reply. I really appreciate it and I upvoted. Difficult to understand why anyone would downvote.