
Tcl is a bloody good language+ but seems to have languished in obscurity. I can't count the number of times I've heard people in the Python and Ruby (and Lua and Processing, and node.js, for that matter) communities announce some whizzy new thing that Tcl had 5 or 10 years earlier. I've never understood why that is. The problem with the computing community in general is that it can't see over its own shoulder. Every few years, we reinvent the wheel.

+ By this I mean you can get a lot done, in not very much code, and maintain it afterwards, and read and modify other people's code easily.




You're preaching to the choir, comrade. My first professional manager back in '99 taught me the same thing: programming is trend-driven, and the same concepts keep showing up in new languages and new packaging. Learn the concepts and you'll be set for as long as you stick with this career.

Even Alan Kay has been complaining about it since 1997 [1]. Looks like even our complaints get recycled.

1. http://video.google.com/videoplay?docid=-2950949730059754521


OK, so let's enumerate those concepts and stick 'em up as a guide for future generations (probably a good idea to work out what they are ourselves).

1. Code is data, data is code

   Implications: everything should be a first-class citizen of the language, able to be passed by reference and altered by functions elsewhere.

   Examples: s-expressions in Lisp? Subclassing built-in types in an OO language (Python new-style classes)?

2. Configs are applied to code, so that the same code can run unchanged on dev, beta, and live. (This is my favourite reinvented wheel du jour: devops. In fact, the whole 12-factor app piece recently on HN is a great example of rediscovered wheels - or, perhaps more fairly, a great example of writing this kind of guide for future generations too.)

3. iPads are no way to type into unresizeable text boxes

... Edits coming when I get a keyboard


Let's say you wrote that list, now in 2012. In 2015, people will look at the date on the blog posting and immediately disregard it because it's "old". That is the problem. The knowledge hasn't been lost, over the last 30, 40, however many years. It is being actively shunned by people who take all the complexity involved in a useful system utterly for granted and think they don't need experience or to study the experiences of others.

I find it fascinating, it's as if kids these days believe that their fathers had all the things they take for granted, like smartphones, tablets, AJAX, whatever, and then, because they were stupid, chose to use green screens and FORTRAN instead.


I think the reason we disregard old material is that we were bitten by it while we were learning programming in the first place. The web is full of old cruft that no longer applies, some of it flat-out dangerous, so we shun older information in favor of newer information.


Haha, you are proving my point. "The web", indeed - this problem has been going on since long before there was "the web", tho' the web has certainly made it worse. Everyone thinks they're using the latest, greatest thing; they don't even realize that it's just yesterday's leftovers with different buzzwords attached. That's the tragedy here. And who is to say the new information is any better? At its very best, it's just repeating what has gone before. Why not go to the original and learn for yourself?


Who is "people", and how do you know they do that? I don't think people do that. I feel like Hanlon's razor applies here.


The constant reinventing of the wheel is the proof. The state of the art hasn't really advanced in 25 years.


A programming language runtime is a whole package; just because some elements of previous system designs are reused does not mean that we are "reinventing the wheel."

To pick Lua (my favorite of the technologies you listed), it has the following things that Tcl does not have:

  - strict ANSI C implementation
  - small implementation (15k sloc vs. 110k-160k for Tcl, depending on how you count)
  - a fast implementation, with an even faster JIT

If you follow your argument to its logical conclusion then everything is just a reinvention of Lisp (including Tcl). But we have tools available to us today (like Lua) that are strictly better than Lisp or Tcl for some use cases.


And following the argument deeper, Lisp is just an implementation of lambda calculus.


No, it's not. It started off that way, but a modern lisp is rather far away from "just an implementation of the lambda calculus". Named functions, for one, are a significant departure.


From the article: "Use Lua for scripting. Use Tcl for command line interaction."

He has a point there. Lua is nice enough to write, but it is `func(arg, 'strarg')` where Tcl is apparently `func $arg strarg`. If I have to type that into an interactive prompt 200 times, I'd take the latter. If I want to write some complex program, I'd take the former. In other words, Tcl is optimized for writing, Lua is optimized for reading.


An honest question: for which use cases is Lua strictly better than Lisp?


Lua is designed specifically to provide a scripting/extension language for projects in C or C++, or projects where interop with C is key. This is where it really shines -- where most Lisps expect to be the primary language, but have FFIs to C for performance and integration reasons, Lua expects to be used for writing plugins, configuration, and the like. It handles those tasks well, and otherwise stays out of the way.

While it's good as a standalone language, it's an outstanding extension language. It's small, trivially portable, and has excellent interop with C. It's not competition for Lisp in general so much as Emacs Lisp or vimscript, designed for systems that have a core written in C but need a lot of customization layered on top. Configuration for window managers, scripting enemy behavior for game engines written in brutally optimized C++, that kind of thing. Systems with a "hard and soft layer" architecture (http://c2.com/cgi/wiki?AlternateHardAndSoftLayers). (Incidentally, Tcl was also designed with this use case in mind.)

That said, if you're using Lua as your primary language but are proficient in C, you can add a lot of power to it pretty easily. The emphasis on C interop goes both ways. (Also, the implementation is very intelligently engineered, and is worth study if you're interested in language design or virtual machines.)
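
For anyone who hasn't seen it, the two-way interop looks roughly like this - a minimal sketch assuming Lua 5.2 or later, with a made-up plugin.lua and a host_log() function invented purely for illustration:

  /* Sketch: embed Lua in a C host and expose one C function to scripts.
     "plugin.lua" and host_log are invented names for this example.
     Build roughly as: cc host.c -llua -lm */
  #include <stdio.h>
  #include <lua.h>
  #include <lualib.h>
  #include <lauxlib.h>

  /* callable from Lua as host_log("message") */
  static int host_log(lua_State *L) {
      const char *msg = luaL_checkstring(L, 1);
      fprintf(stderr, "[host] %s\n", msg);
      return 0;                              /* number of Lua return values */
  }

  int main(void) {
      lua_State *L = luaL_newstate();
      luaL_openlibs(L);                      /* standard libraries */
      lua_register(L, "host_log", host_log); /* soft layer can call back into C */

      /* the "soft layer": configuration / plugin logic lives in plugin.lua */
      if (luaL_dofile(L, "plugin.lua") != LUA_OK) {
          fprintf(stderr, "plugin error: %s\n", lua_tostring(L, -1));
      }

      lua_close(L);
      return 0;
  }

The Lua side just calls host_log like any other function, and the C side decides exactly what the scripts get to touch.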


Thanks for the comprehensive reply; however, a quick search turned up a couple of small, portable and embeddable Lisp dialects (Hedgehog, PicoLisp, ECL) that all seem to target the same niche Lua is apparently designed for. Admittedly, Lua has probably been developed for a longer period of time, but small Lisp interpreters shouldn't be very complex, so I guess they are pretty stable.

Lua is probably going to be faster, but one wouldn't exactly use a scripting language for math-heavy stuff, etc., either.


Lua has had a long time to slowly and deliberately evolve, informed by feedback from a gradually growing group of users.

It's kind of the opposite of Javascript, which went from 0 to release in a few weeks and had a bunch of design errors become permanent as a result. Lua and Javascript have a lot in common, and studying Lua may be a good way to see where Javascript is going in the long term.

Lua is significantly smaller than ECL, and a bit smaller than PicoLisp. According to sloccount, they weigh in at 432,818 (ECL), 15,018 (PicoLisp), and 14,309 (Lua).


Lua has the following advantages over Lisp-the-language:

  - only one dialect (sometimes changes between major versions,
    but there aren't multiple dialects evolving in parallel).

  - infix notation is preferred by many programmers, and is easier
    to use for non-programmers.

  - the language is small (more like Scheme than Lisp)

The Lua and LuaJIT implementations have the following advantages over SBCL (you didn't say which Lisp, so I'm just picking what seems to be the most popular one):

  - extremely portable
  - extremely small
  - easy to sandbox (see the sketch after this list)
  - very good integration with existing C and C++
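
On the sandboxing point, the usual approach from C is roughly the sketch below (assuming Lua 5.2+; this is not a complete sandbox): open only the libraries you consider safe instead of calling luaL_openlibs:

  /* Sketch: a restricted Lua state that opens only "safe" libraries,
     so untrusted scripts get no io, os, package or debug access.
     A real sandbox would also strip dofile/loadfile from the base library. */
  #include <lua.h>
  #include <lualib.h>
  #include <lauxlib.h>

  lua_State *new_sandboxed_state(void) {
      lua_State *L = luaL_newstate();

      /* base library ("_G") plus a few pure-computation libraries */
      luaL_requiref(L, "_G",     luaopen_base,   1); lua_pop(L, 1);
      luaL_requiref(L, "table",  luaopen_table,  1); lua_pop(L, 1);
      luaL_requiref(L, "string", luaopen_string, 1); lua_pop(L, 1);
      luaL_requiref(L, "math",   luaopen_math,   1); lua_pop(L, 1);

      /* deliberately not opened: io, os, package, debug */
      return L;
  }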


A personal opinion: Lua comes close to having predictable control flow, as more program-structure stuff is built into the language. One of the headaches I had with Tcl was maintaining heaps of scripts that implemented structural features (OO, object references, list comprehensions, to name a few that come to mind) in different ways _because you can_. Doing things the elegant way can sometimes be the enemy of doing things the predictable way.

Strictly speaking this is a limitation of languages like Lua, but in day-to-day use it sometimes becomes an advantage.


From that list of languages, how does Lua / LuaJIT performance compare, generally speaking?



The library/packaging system was a disaster - at least in the early versions.

Half the libs needed a source-code change to the Tcl interpreter. There were two main add-ons (I seem to remember one was called green eggs and ham???) which were incompatible, so if your lib needed A and another bit needed B, you were stuck.


While it was a disaster back then, the current Tcl package implementation surpasses that of many other languages. It's possible to have multiple versions installed and have different runtimes choose the version they want. The biggest advantage, though, is binary version independence through stubs (http://wiki.tcl.tk/stubs): compile a binary package against 8.x and it will work in 8.y, where y >= x.
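
From the C side that looks roughly like the sketch below - a hypothetical "mypkg" extension; compiling with -DUSE_TCL_STUBS and linking against the stub library is what buys the binary version independence:

  /* Sketch: init function for a hypothetical stubs-enabled Tcl extension.
     Compile with -DUSE_TCL_STUBS and link against libtclstub; the resulting
     binary then loads into any interpreter >= the version requested below. */
  #include <tcl.h>

  static int HelloObjCmd(ClientData cd, Tcl_Interp *interp,
                         int objc, Tcl_Obj *const objv[]) {
      Tcl_SetObjResult(interp, Tcl_NewStringObj("hello from C", -1));
      return TCL_OK;
  }

  int Mypkg_Init(Tcl_Interp *interp) {
      /* bind against the stub table rather than one specific libtcl version */
      if (Tcl_InitStubs(interp, "8.1", 0) == NULL) {
          return TCL_ERROR;
      }
      Tcl_CreateObjCommand(interp, "mypkg::hello", HelloObjCmd, NULL, NULL);
      return Tcl_PkgProvide(interp, "mypkg", "1.0");
  }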


Yes, this was back in the early '90s and I don't even remember what they were called. I don't know if it was really a deficiency of Tcl that you had to patch the interpreter to enable library features, or whether the writers were just lazy.

One thing it did have was very easy-to-write C plugins, because everything is a string.
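
From memory, that old string-based interface looked roughly like this - a sketch with an invented "sum" command; every argument arrives as a plain C string and the result goes back as one:

  /* Sketch: the old string-based Tcl command API.  "sum" is an invented
     example command; every argument is just a C string. */
  #include <stdio.h>
  #include <stdlib.h>
  #include <tcl.h>

  /* implements: sum a b   -> returns a+b as a string */
  static int SumCmd(ClientData cd, Tcl_Interp *interp,
                    int argc, const char *argv[]) {
      char buf[64];
      if (argc != 3) {
          Tcl_SetResult(interp, "usage: sum a b", TCL_STATIC);
          return TCL_ERROR;
      }
      snprintf(buf, sizeof(buf), "%g", atof(argv[1]) + atof(argv[2]));
      Tcl_SetResult(interp, buf, TCL_VOLATILE);   /* Tcl copies the string */
      return TCL_OK;
  }

  /* registered during init with:
     Tcl_CreateCommand(interp, "sum", SumCmd, NULL, NULL);  */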


Productivity is highly valued. It's easier to be highly productive at reinventing wheels than it is to figure out which existing wheel fits a given problem.


It seems to be trivial productivity. People like to say they've contributed, and it's easier to port trivial libraries to new frameworks. I'm amazed every time I see "XYZ ported to framework du jour" so celebrated.


Copying is the foundational intellectual activity. Arguably, copying enables natural language and culture. The worldwide human network is engaged in a large-scale genetic algorithm founded on copying. Porting successful features serves as an effective filter for bad ideas: good ideas thrive and are copied everywhere, bad ideas get few copies and eventually die in obscurity.



