>btw, your question assumes that Common Lisp is not evolving. This is not correct at all. What improvements does Clojure's evolution consist of?
Traction, for one.
>Transducers? CL got them too
Pointing to some random libs that implement the same concept doesn't mean it's part of a language's culture/ecosystem/common practice the way transducers are to Clojure.
E.g. I could say "Rails? Well, language X has a Rails-like framework too", but that wouldn't mean you get the same benefits from using that framework that you get from Rails, once you factor in language adoption, community vibrancy, availability of programmers to hire, tooling, books, etc.
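(For readers who haven't met the concept: a transducer is a transformation of a *reducing function*, written without any knowledge of the input source or the output sink, so the same pipeline can drive collections, channels, streams, etc. A rough sketch of the idea in Java, with hypothetical names like `TransducerSketch`, `mapping`, and `filtering`, since nothing here is a standard API:)

```java
import java.util.List;
import java.util.function.Function;
import java.util.function.Predicate;

public class TransducerSketch {
    // A reducing function: folds one input into an accumulator.
    interface Reducer<A, T> { A step(A acc, T t); }

    // map as a transducer: transforms a reducer, knowing nothing
    // about the collection being traversed.
    static <A, T, U> Function<Reducer<A, U>, Reducer<A, T>> mapping(Function<T, U> f) {
        return rf -> (acc, t) -> rf.step(acc, f.apply(t));
    }

    // filter as a transducer: skips the downstream step for rejected items.
    static <A, T> Function<Reducer<A, T>, Reducer<A, T>> filtering(Predicate<T> p) {
        return rf -> (acc, t) -> p.test(t) ? rf.step(acc, t) : acc;
    }

    // Drive any composed transducer over an Iterable with a concrete reducer.
    static <A, T, U> A transduce(Function<Reducer<A, U>, Reducer<A, T>> xf,
                                 Reducer<A, U> rf, A init, Iterable<T> xs) {
        Reducer<A, T> step = xf.apply(rf);
        A acc = init;
        for (T x : xs) acc = step.step(acc, x);
        return acc;
    }

    public static void main(String[] args) {
        // Keep evens, then increment, then sum: one pass, no
        // intermediate collections.
        Function<Reducer<Integer, Integer>, Reducer<Integer, Integer>> keepEvens =
                filtering(x -> x % 2 == 0);
        Function<Reducer<Integer, Integer>, Reducer<Integer, Integer>> inc =
                mapping(x -> x + 1);
        int sum = transduce(keepEvens.compose(inc),
                (acc, x) -> acc + x, 0, List.of(1, 2, 3, 4, 5));
        System.out.println(sum); // (2+1) + (4+1) = 8
    }
}
```

Because `mapping` and `filtering` only ever see the downstream reducer, the composed pipeline runs in a single pass with no intermediate collections, which is the property Clojure bakes into its standard library rather than leaving to a third-party lib.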
The whole point of Common Lisp, the reason it has survived since the beginning of time, is that the language was designed so that language innovation happens through people developing libraries for it. Complaining that Common Lisp hasn't evolved is like complaining that free-software POSIX systems haven't evolved. True, the POSIX standard hasn't evolved much, but it's not meant to; it's meant to be evolved by the userspace programs and libraries that people write.
Yes, this approach is not the best one for getting mass adoption, and there are benefits to mass adoption: people can help you, there's a bigger pool of people to hire, etc. (I think this matters much more to companies than to individuals). But mass adoption isn't always the goal.

I suppose you can consider Ruby analogous to Ubuntu, while Common Lisp is like a *BSD distribution (or Arch, Gentoo, Slackware...). Ubuntu is great: all Ubuntu installations start from the same starting point, and the potential for customization is limited (try changing Ubuntu's init system), so you can just ask a question on Stack Overflow and get an answer with instructions specific to how your system works. And companies are more likely to pick Ubuntu than Gentoo, because it's better supported and supposedly more stable. That doesn't mean people consider the distributions where the user picks and chooses exactly what their system does (as a Common Lisp programmer does by picking out libraries of macros) to be any worse. Quite the contrary: the mythos seems to consider these distributions better in some way, as a better tool for a more skilled practitioner, a sentiment which has also found its place around Common Lisp.
>The whole point of Common Lisp, the reason it has survived since the beginning of time, is that the language was designed so that language innovation happens through people developing libraries for it.
And that "whole point" hasn't worked very well as to the availability of a rich library ecosystem and/or adoption. So, while it's a nice trait to have, it's perhaps too much of a burden.
>But mass adoption isn't always the goal.
True, but it's not like CL was designed as some research or experimental project that doesn't care about mass adoption.
>E.g. if you have a good language without a community, community can eventually grow, but you can use the language anyway. If you have a bad language with great community - it doesn't reduce your suffering from the language itself.
My experience has been the inverse.
I'd rather use a bad, or more usually mediocre, language WITH a community, tooling, libs, books, programmers, etc., than a better one where I'll be burdened by the lack of all of these.
So, in a sense, community trumps inherent language qualities.
After all, that's the very basic message from Lisp and Smalltalk.
Agreed. Clojure isn't overall a bad language, but it has annoying quirks in a lot of places, presumably because it needs to be compatible with Java. The ones that annoy me the most are (0) `nil` and `false` being different, and (1) wonky arithmetic that uses Java's numeric types under the hood, unless you explicitly ask it not to.
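To make the arithmetic point concrete: on the JVM, fixed-width `long` arithmetic silently wraps on overflow, and Clojure's default `+` operates on longs (it throws an `ArithmeticException` on overflow instead of wrapping; only the primed `+'` auto-promotes to bignums). A minimal Java sketch of the underlying behavior (`WrapDemo` is just an illustrative name):

```java
import java.math.BigInteger;

public class WrapDemo {
    public static void main(String[] args) {
        // Fixed-width long arithmetic wraps around silently:
        long big = Long.MAX_VALUE;            // 9223372036854775807
        System.out.println(big + 1);          // -9223372036854775808

        // Arbitrary precision never happens automatically on the JVM;
        // you have to opt in with BigInteger:
        BigInteger promoted = BigInteger.valueOf(big).add(BigInteger.ONE);
        System.out.println(promoted);         // 9223372036854775808
    }
}
```

In Clojure terms, `(+ Long/MAX_VALUE 1)` throws, `(+' Long/MAX_VALUE 1)` yields `9223372036854775808N`, and `(unchecked-add Long/MAX_VALUE 1)` gives the wrapped Java result above.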
Racket is not Lisp, and I mean this as praise, not as in "boo hoo, not Lisp, ergo bad". Racket's macro system, in particular `syntax-parse` and `define-syntax-class`, makes it a joy to extend the language. When you use a Racket macro wrong, you get an error message in terms of the abstraction the macro provides, not in terms of what the macro expander compiles it to, which is an implementation detail you have no business knowing or caring about. `defmacro` doesn't even begin to approach the level of abstraction that Racket macros provide.
One reason is that there are projects/situations where language stability can outweigh the benefits of language evolution. While this isn't the coolest view out there, the fact that the ANSI standard hasn't needed revision since 2004 can actually be a feature, not a bug. For an extreme example of my point, look at the state of front-end JavaScript frameworks over the last couple of years.
There's evolution and there's running in circles. :-p
Few languages really evolve fast in a meaningful sense. The only examples that come to my mind are Haskell, Racket and pre-1.0 Rust. And the first two are research languages. Breaking changes (e.g. Python 3) to get rid of accumulated cruft can be a good thing, but nevertheless stability is a feature.
Thanks. Any suggestion about which one to start with? I don't think I'd use them for work or for many personal projects, but I've read about the advantages of learning FP. So I'd like to learn something that teaches me to think about programming in a different way, so I can apply those insights to the code I write in Java or the other procedural or OO languages used at work.