Hacker News

I've spent a fair amount of time over the past few decades making autotools work on my projects, and I've never felt it was a good use of time.

It's likely that C will continue to be used by everyone for decades to come, but I know that I'll personally never start a new project in C again.

I'm still glad that there's some sort of push to make autotools suck less for legacy projects.

You can use make without configure. If needed, you can also write your own configure instead of using autotools.

Creating a Makefile is about 10 lines, and it's the lowest-friction way for me to get programming in any environment. Familiarity is part of that.
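For reference, a roughly-10-line Makefile of that kind might look like this (the target name and flags are made up, assuming all .c files live in the current directory):

```make
CC     ?= cc
CFLAGS ?= -O2 -Wall -Wextra
SRCS   := $(wildcard *.c)
OBJS   := $(SRCS:.c=.o)

hello: $(OBJS)
	$(CC) $(CFLAGS) -o $@ $(OBJS)

%.o: %.c
	$(CC) $(CFLAGS) -c -o $@ $<

clean:
	rm -f hello $(OBJS)
.PHONY: clean
```

Run it with `make -j"$(nproc)"` to get parallel builds for free.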


It's a bit of a balance once you get bigger dependencies. A generic autoconf setup is annoying to write, but rarely an issue when packaging for a distro. Most issues I've had to fix in nixpkgs were for custom builds, unfortunately.

But if you don't plan to distribute things widely (or have no deps)... whatever, just do what works for you.


Write your own configure? For an internal project, where much is under domain control, sure. But for the 1000s of projects trying to be multi-platform and/or support multiple flavours/versions: oh gosh.

It depends on how much platform-specific stuff you are trying to use. Also, in 2025 most packages are tailored for the operating system by packagers, not the original authors.

Autotools is going to check every config from the past 50 years.


>Also in 2025 most packages are tailored for the operating system by packagers - not the original authors.

No? Most operating systems don't have a separate packager. They have the developer package the application.


Yes? Each operating system is very different and almost every package has patches or separate install scripts.

To expand on sibling comments:

autoconf is in no way, shape or form an "official" build system associated with C. It is a GNU creation and certainly popular, but not to a "monopoly" degree, and its share is declining (plain make, Meson and CMake being popular alternatives).


I've stopped using autotools for new projects. Just a Makefile, and the -j flag for concurrency.

cmake ftw

Or Meson, which is a serious alternative to CMake (even better than CMake imho).

CMake also does sequential configuration AFAIK. Is there any work to improve on that somewhere?

Meson and CMake in my experience are both MUCH faster, though. It's much less of an issue with these systems than with autotools.

Just tried reconfiguring LLVM:

    27.24s user 8.71s system 99% cpu 36.218 total
Admittedly the LLVM build time dwarfs the configuration time, but still. If you're only building a smaller component, then the config time dominates:

    ninja libc  268.82s user 26.16s system 3246% cpu 9.086 total

You mean cargo build

... can cargo build things that aren't rust? If yes, that's really cool. If no, then it's not really in the same problem ___domain.

No, it can't.

It can build a Rust program (build.rs) that builds things that aren't Rust, but that's an entirely different use case (building a non-Rust library to use inside of Rust programs).


There's GprBuild (an Ada tool) that can build C (not sure about C++). It also has a more elaborate configuration structure, but I haven't used it extensively enough to say exactly what it does and how. In combination with Alire it can also manage dependencies Cargo-style.

Got it to build C++, CUDA and IIRC SYCL too.

CMake uses a configure step, or something configure-like, too!

Same concept, but completely different implementation.

Still slow, even if it uses multiple processes: lots of "discovery" of things that exist only on your machine, or whatever the CI machine happens to have. Instead it should say "you should expect this, that and that", and then, by the nature of building it, it simply fails if something is missing.

Discovery is the wrong thing to do nowadays.


I don't think I'd want a "fail late with a possibly unclear error" build system. There is also the problem that finding the path of dependencies happens at the same time as finding whether you have them, and removing one without the other doesn't seem very useful.

At best, I think you could have a system that defers some / most dependency discovery until after configure time, but still aborts the build with "required libfoo >= 0.81.0 not found" if necessary.
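As a sketch of what that deferred check could look like with pkg-config (keeping the hypothetical libfoo example; `require_pkg` is a made-up helper name):

```shell
# Sketch: abort with a clear message when a dependency is missing,
# without doing path discovery up front. "libfoo" and the version
# are the hypothetical example from the comment above.
require_pkg() {
    # $1 = pkg-config module name, $2 = minimum version
    if ! pkg-config --atleast-version="$2" "$1" 2>/dev/null; then
        echo "error: required $1 >= $2 not found" >&2
        return 1
    fi
}

# A build step would call this and stop on failure:
require_pkg libfoo 0.81.0 || echo "(build would abort here)"
```

The compiler/linker still find the actual paths later via `pkg-config --cflags --libs`, so the "do you have it" check and the "where is it" lookup stay consistent.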

And no, you are not going to be able to tell everyone exactly where everything needs to be installed unless it's an internal product.


Sure, to each their own. In my case I want reproducible builds, but to start I first want a build that at least passes the same compiler flags around and links the same things no matter what the computer is (it can still differ depending on the OS).


