Hacker News

GNU Make also embeds GNU Guile, a criminally underused feature:

https://www.gnu.org/software/make/manual/html_node/Guile-Int...




In practice, Guile is usually not compiled in. Whereas I've never seen a version of make without `load` and its supporting infrastructure.
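A quick way to tell which build you have (a sketch; works on GNU Make 4.0+): both capabilities advertise themselves in the `.FEATURES` variable, so a makefile can probe for them at parse time.

```make
# 'guile' appears in $(.FEATURES) only when make was linked against
# libguile; 'load' is listed in essentially every modern build.
ifeq ($(filter guile,$(.FEATURES)),)
$(info this make has no Guile support)
endif
ifneq ($(filter load,$(.FEATURES)),)
$(info the 'load' directive is available)
endif
```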


Debian gives you the option, with make and make-guile equivalent packages. IIRC Slackware simply compiles it in (guile already being there) and Fedora/RHEL leave it out entirely.


Yes. It really should be "make" and "make-noguile", but we have what we have. In practice, you actually want the "remake" package. This is a fork of make that's a drop-in replacement (since it is the same code), has Guile enabled, AND contains an interactive debugger.


I wish remake were in the standard repos for most distros. It's absurdly helpful.


It is criminally under-integrated!

The only interface you get into the make internals from Guile is a function to expand a make expression, and a function to eval a makefile fragment.

These interfaces only encourage the use of Guile for nothing more than make metaprogramming, an area where more power is not needed.
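Concretely, that metaprogramming surface looks like this (a sketch assuming a Guile-enabled make; `gmk-expand` and `gmk-eval` are the two documented hooks, and the variable names are made up):

```make
# $(guile ...) runs Scheme and splices the result back in as text.
upper = $(guile (string-upcase (gmk-expand "$(1)")))
NAME := $(call upper,widget)

# gmk-eval hands makefile syntax back to the parser -- still just macros.
# Assigned to a throwaway variable to discard any return value.
_ := $(guile (gmk-eval "CFLAGS += -O2"))

all:
	@echo $(NAME) $(CFLAGS)
```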

Imagine if Guile had access to the rules as data structures or something. And to the graph of targets to be updated, and so on.

Imagine if Guile could be used, say, to override the logic by which a target is considered out of date with respect to its prerequisites.



100% agree. With the current API, there's no real advantage to using Guile over a C extension (other than the choice of language). If the Guile interface could hook into Make's dependency graph, it would be a huge game-changer. But as it is, the Guile interface is basically a fancy macro processor.

The Guile interface can expand Makefile expressions into values, and can process/evaluate Makefile statements. But there's no way (that I've found) to do something like "remove this dependency from target X", ask "what dependencies does X currently have?", or ask "do you think X is currently up-to-date?".


I would say there is no advantage of using built-in Guile to call make's eval API over:

  $(eval $(shell <arbitrary-command>))
where arbitrary-command could run ssh to a virtual machine, running a docker container loaded with Common Lisp ...

The Guile approach stays in one process, but that has no value in the context of make, which launches external programs as its principal paradigm.
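For illustration (nothing here is specific to any real project), the pattern looks like this, with the caveat that `$(shell)` folds newlines into spaces, so the generated fragment has to survive being one logical line:

```make
# The external command can be anything that prints valid make syntax.
$(eval $(shell echo 'GENERATED := computed-elsewhere'))

all:
	@echo $(GENERATED)
```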


There is some benefit to Makefile processing speed, if that's a metric that affects your build.

With one or two $(shell) calls, it won't matter at all. If you start to approach dozens or hundreds of calls, the extra overhead of all those shells can start to be noticeable. Especially if your Makefile is automatically triggered on file changes or something.
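One common mitigation, sketched here with `uname` as a stand-in for whatever you actually probe: batch several `$(shell)` calls into one invocation and split the result with make's own text functions.

```make
# Three forks, one per variable:
OS   := $(shell uname -s)
REL  := $(shell uname -r)
ARCH := $(shell uname -m)

# One fork, split afterwards (uname prints in -s -r -m order here):
U    := $(shell uname -s -r -m)
OS   := $(word 1,$(U))
REL  := $(word 2,$(U))
ARCH := $(word 3,$(U))
```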


>Imagine if Guile could be used, say, to override the logic by which a target is considered out of date with respect to its prerequisites.

That is exactly what I was recently hoping for.

Make-as-library is such a compelling idea that I feel like it must have already been done, but I searched for something like this recently and the closest I found was Java's Ant, which gets the as-library part but sadly has no concept of "target is already up-to-date"...


TIL thank you!


I sometimes wonder if we would even have autotools or cmake if people just knew about this one simple trick


Autotools is designed to solve one very important problem: how do you build the GNU tools in the first place if all you have is some obscure Unix from the 1980s. If you already have gnu make, gnu bash, gnu binutils, gnu coreutils, etc. installed then autotools is pointless.

I have yet to find evidence of cmake solving a problem (or even having design), though I guess `ccmake` would be kind of cool if it weren't full of nonsense?


One of the other things that autotools does (and cmake does too, admittedly badly) is provide a "configure" step that gives a much more controllable interface into enabling or disabling features of a program.

The problem with autoconf in particular is that it spends a lot of time trying to paper over the compatibility issues of ancient Unixes, whereas modern portability tends to rely more on a concept of a portable abstract system layer. The latter means that most of the work a configure step needs to do isn't "does your system have $UNIX_SYSTEM_CALL" but instead "what OS is this."


I see it both ways.

On the one hand, sure Windows vs macOS vs "Linux/BSD" is so very different, you mainly need OS detection.

On the other, I don't need a list of the exact featuresets of Fedora, Debian, FreeBSD, NetBSD, OpenBSD, DragonFly BSD, MorphOS, etc. so I can write narcissism-of-small-differences shit like:

   #if defined(SUNOS_V4) || defined(_MSC_VER) || defined(GNU) || defined(FREEBSD)
   # include <strings.h>
   #else
   # include <string.h>
   #endif
I will _gladly_ run a feature test and write:

   #if HAVE_STRINGS_H
   # include <strings.h>
   #endif
   #if HAVE_STRING_H
   # include <string.h>
   #endif
That is soooooooo much cleaner
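In autoconf terms, those HAVE_* macros come from a one-line check; a minimal `configure.ac` sketch (project name and version are placeholders):

```m4
AC_INIT([demo], [1.0])
AC_CONFIG_HEADERS([config.h])
# Defines HAVE_STRING_H / HAVE_STRINGS_H in config.h when found.
AC_CHECK_HEADERS([string.h strings.h])
AC_OUTPUT
```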


Speaking of, I am using https://zolk3ri.name/cgit/m4conf/about/ so I do not have to use anything more bloated or complex.


This came up with musl, IIRC, and was problematic: the existence of a named header doesn't guarantee it contains what you think it does.
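That's an argument for probing the symbol rather than the header. Autoconf can run an actual compile test for a declaration (`strlcpy` here is just an illustrative symbol):

```m4
# Defines HAVE_DECL_STRLCPY to 1 or 0 based on a real compile test
# against the header's contents -- not its mere existence.
AC_CHECK_DECLS([strlcpy], [], [], [[#include <string.h>]])
```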


You can configure things with cmake! All you need to do is

1. Figure out what variables you want to change

2. Add a -DVAR_NAME=value parameter to the command-line!

...which sucks to do.
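A typical session, for concreteness (these are stock CMake cache variables; project-specific ones you have to discover yourself):

```shell
# Every knob is a -D cache variable on the configure step.
cmake -S . -B build \
  -DCMAKE_BUILD_TYPE=Release \
  -DCMAKE_INSTALL_PREFIX="$HOME/.local" \
  -DBUILD_SHARED_LIBS=OFF

# Finding out which knobs exist means dumping the cache with help text:
cmake -LH build
```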

Meson is a much better way of doing things, but even that falls into "Here are the supported flags and settings, and then here are variable names for everything else"


Even with all the GNU tools available there are still a lot of system-specific things that you may need to know: supported C version, supported C++ version, how to invoke the compiler, correct compiler flags for warnings / desired C or C++ version / etc, where to install things, how to install them and set the right owner and permissions and many many more. Autotools (and cmake) can figure all of that out for you. If you operate in a monoculture and, for example, only deal with a single Linux distribution on a single architecture most or all of this may not be relevant for you. But if you target a more diverse set of environments it can save you a lot of headaches.


I see Autotools as sort of cool if you're porting to a new platform - the tests find out what works and if that's not enough then you have to make some effort to get around it. If you're lucky, however, you put your autotooled code on some completely new OS/hardware and it just builds.

Nowadays the proportion of people who are porting the code is probably much smaller but it's still a way of keeping code working on unix with various compiler, architecture variations.

IMO cmake just includes windows more effectively - for autotools you'd probably be forced down the cygwin route. I find it a bit easier to work with but it's still a nightmare from hell sometimes.


Though there's also gnulib, which as part of the autotools process simply replaces system functions with their own stubs. It was a great idea, briefly, and then it became a fiasco.


Make's BIG problem (IMO of course) is that the commands are executed in the system shell.

If make supplied its own shell language, a simplified one, then everything would be fantastic.

For one thing, cross platform builds would be much easier to get working as there would be no issue about "is the default shell bash or ash or sh or dash or ksh or whatever" and on Windows there would be no need to use cygwin.

The other thing is there would not need to be such a huge clash between the way expansion works in shells versus make which is very confusing when you combine the two.
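The clash in question, as a small sketch: `$` means one thing to make and another to the shell, so recipes end up mixing `$(...)` and `$$`.

```make
FILES := a.txt b.txt

# In recipes, $(...) belongs to make and is expanded first;
# $$ escapes to a single $ so the shell can see its own variables.
show:
	@echo "make sees: $(FILES)"
	@for f in $(FILES); do echo "shell sees: $$f"; done
```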


> If make supplied it's own shell language, a simplified one, then everything would be fantastic.

We did exactly that in build2, a modern make re-thought. And we provide a bunch of standard utilities like sed, find, etc., that work the same everywhere, including Windows. Here is an example of a non-trivial recipe: https://github.com/build2/libbuild2-autoconf/blob/17f637c1ca...


There's nothing stopping you from specifying an explicit shell in your Makefile (see: https://www.gnu.org/software/make/manual/make.html#Choosing-...).

You could set it to Ash, Bash, Perl, Python, execline, Docker exec, or whatever you want really. You can also set that variable on a per-recipe basis, if you have one recipe that needs a custom shell.
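Target-specific variables are what make the per-recipe part work; a sketch (the interpreter paths are assumptions about the host):

```make
SHELL := /bin/sh

# This one recipe runs under Python instead. The default .SHELLFLAGS
# is already -c, but being explicit doesn't hurt.
report: SHELL := /usr/bin/python3
report: .SHELLFLAGS := -c
report:
	print("each recipe line becomes a python3 -c argument")
```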

(note to any GNU Make maintainers who might see this: it would be really helpful to be able to set .ONESHELL on a per-recipe basis as well!)


I do in fact always do that: SHELL := /usr/bin/bash or whatever.

The problem is that bash isn't there on Windows, and not even on all Linuxes, so if I supply you with a makefile there's no way to be sure you can use it.

If make included a language then I wouldn't need to worry about your dependencies, not even which version of bash you have. This would make a makefile FAR more useful.


Or even Emacs, as Amy Grinn showed last week at FOSDEM.

https://fosdem.org/2025/schedule/event/fosdem-2025-5139-org-...


Yes, but where do you stop? In make shell one would routinely call rm, sed, find... should they be included too? So instead of make including a shell, it would be simpler if busybox included a make.


> In make shell one would routinely call rm, sed, find... should they be included too?

If you want to use it as a build tool, yes. The most successful build tools build hermetically, with the build definition specifying the versions of anything and everything that is needed to perform the build. (With Maven you even specify e.g. what version of the Java compiler to use, and it will download and use that version).

> So instead of make including a shell, it would be simpler if busybox included a make.

Can busybox be used portably from e.g. a user's homedir? I've only ever seen it used as the main system in /bin etc..


Well, I believe make is already the most successful build tool, at least on every platform I care about (i.e. any variant of Unix, plus a few more).

What you are describing looks like packaging more than building. Pinning the versions of everything is not the build tool job.

I understand the standpoint of software publishers who want to limit the number of environments they have to support, but proprietary software is not the use case that every tool should be optimizing for.

When a Nix user or a Gentoo user decides that she wants this version of library X with this version of library Y, that's not make's job to override her decision, is it? We need some flexibility.


> When a Nix user or a Gentoo user decides that she wants this version of library X with this version of library Y, that's not make's job to override her decision, is it? We need some flexibility.

The user should absolutely be able to override it, but library X's build system should have some defaults and changing them should be a deliberate choice. "Build against whatever happens to currently be installed, and hope you get lucky and wind up with something that works" is not a great build experience.


Not for Windows



