About Tcl and the embedded world: a few years ago I wrote a single-file Tcl interpreter that is mostly compatible with current Tcl implementations (some things are missing, like namespaces; some things were added, like anonymous functions):
If you look at the EDA industry, you have to appreciate the breadth of applications Tcl is used in. Usually there is a Tcl interpreter with custom commands written in C for optimization.
The ease with which one can quickly create interfaces between different tools is, I think, one of the contributing factors to its success. That, and of course the legacy codebase.
You can get Lisp to interact more like a console language. Write a parser that accepts Python-ish whitespace syntax (while still allowing parens syntax) and transforms it into s-expressions you can pass to the interpreter.
So to express:
(define (fn a b)
  (list (car a) (car (car b))))
You could type:
define: fn a b
  list:
    car a
    car (car b)
As it's a console, in practice you'll mostly be typing commands like this:
load x y z
but you have good mechanisms for going deeper. And if you just want to write in s-expressions you still can.
The aside in the linked article about avoiding shifting and parens just to type a simple command is even more true once a command spans multiple lines. Say what you will about syntactic attempts to remove the parens from Lisp, I think it's a secondary issue for a command language.
There are some good observations about design pressures on command languages vs. programming languages in Olin Shivers's "A Scheme Shell", as well.
IMHO, nothing beats Tcl/Tk when you need to write a small distributable application with a GUI. The last ones I wrote were for purging files according to their date and for compiling Forms/Reports. With freewrap I can build an executable with no effort at all.
Another recommendation for your reading list: the Plan 9 kernel (it could be considered a modernized version of a classic code read that is also highly recommended: Lions' commentary on UNIX 6th Edition).
The Go compilers and stdlib are also a great read. It is no coincidence that Ken Thompson was involved in all those projects.
More on topic: I agree it is sad how underrated Tcl is. Perhaps it's not the best language ever created, but it deserves much more attention and credit than the latest JS flavour of the week.
Tk is also great and IMHO still the best portable GUI toolkit around.
Tk is great. What I don't get is why people are still making applications that don't use ttk and its modern themes. Its Motif look on Linux is the single biggest complaint I hear about it today, and that problem was solved what, a decade ago?
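For anyone who hasn't tried them, the themed widgets are nearly a drop-in change; a minimal sketch (the theme and widget names here are just my picks):

package require Tk
ttk::style theme use clam                     ;# one of the stock themes shipped with Tk
pack [ttk::button .b -text "Themed button"] -padx 10 -pady 10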
As an aside, LTK, which is basically just Lisp bindings for talking over a socket to wish, is the only Lisp GUI toolkit that I've had work on all platforms and all Lisp implementations.
I don't have a proper list; it is more in the back of my head. I did read some of each of these:
* Erlang VM
* sqlite
* Linux kernel
* Clojure functional data structures
That list may look very pretentious, but I don't try to fully understand the packages. What interests me is (1) the style and (2) the larger strokes: what kind of idioms are there, and how does it all fit together. I think it is pretty hard to get the idea of a software package from reading the code; for that, some higher-level description or book makes more sense. Generally I think reading source code is very helpful in becoming a better programmer. I don't think the above list is anywhere near a general recommendation. I rather picked a ___domain that was intimidating to me and tried to reduce my ignorance there.
I spend a lot of time using Tcl (I have to) and I spend almost as much time wishing I could be using Perl instead. This is partly because of weirdness, as described by the author of this article, and partly because I routinely run into things that seem to be far more difficult to do in Tcl than they ever would be in Perl.
Can I get one of the Tcl adherents in this thread to comment: have I got things completely wrong? Need I simply become better with Tcl? Or does it truly lack things like powerful string handling and hash tables?
Depending upon what you do, you could be using Perl with help from the Tkx (for GUIs) or Tcl (for about everything else) CPAN modules, and use Tcl liberally from within Perl.
I'm not as fond of the language as some, primarily because it is unnecessarily hard to debug errors and unnecessarily hard to read code and data.
For example, the interpreter can basically say "there's a syntax error somewhere in this giant block of yours" and not have any idea where.
A related issue is that a single delimiter (the brace) is bad for readability, even if it's easy to parse. Just try reading a gigantic dictionary; it isn't anywhere near as clear as Python/JSON-style can be (where there are obvious commas, colons, quoted strings and more to show you exactly what you're looking at).
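To make that concrete, here's a sketch with an invented config dict; every brace plays a different role, and nothing visually marks keys, values, or nesting levels:

set cfg {server {host localhost port 8080} users {alice {role admin} bob {role guest}}}
puts [dict get $cfg server port]    ;# -> 8080, but good luck eyeballing the structure above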
Another headache for debugging is that commands frequently accept the names of variables. In unfamiliar code you can't even answer simple questions like "where are all the places 'xyz' is used?" because a reference to 'xyz' could be hiding almost anywhere. You can change something and not fully understand the effects that your change could have. While I'm sure this allows for very clever code to be quickly written, it fails the more-important test of producing code that is easy to read.
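A small sketch of that name-passing style ("bump" is a hypothetical proc):

proc bump {varName} {
    upvar 1 $varName v    ;# bind v to the caller's variable of that name
    incr v
}
set xyz 41
bump xyz      ;# no $: the name itself is passed, so grepping for "$xyz" misses this write
puts $xyz     ;# -> 42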
It is even possible for commands to silently accept mistakes and seem correct, with major consequences. Suppose "x" just happens to be a one-element list containing a number, '{0}'. With this input, "lindex 0 $x" is legal, even though "lindex $x 0" is the correct argument order. Worse, the wrong form returns something that seems reasonable, without error: it treats the literal "0" on the command line as the list and $x as the index, and returns a "0", just not the "0" you'd think it does!
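A quick tclsh transcript of that trap, as I understand it:

set x {0}
puts [lindex $x 0]    ;# correct order: element 0 of the list {0} -> 0
puts [lindex 0 $x]    ;# swapped: the literal "0" becomes the list, $x the index -> also 0, no error
set x {7}
puts [lindex 0 $x]    ;# swapped with another value: index 7 is out of range -> empty string, still no error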
While one can argue that each language has "best practices" and intended usage, the examples above are largely pulled directly from built-in commands and data types. These are things people encounter all the time; they are not caused by any particular Tcl programming practice.
Circa 1999, I wrote a simple reminder script - which I still use - that brings a popup on the desktop, in like 3 lines of Tcl/Tk using the wish shell. I shed a tear at how easy it was compared to anything else at the time. I am not even sure if it's that easy with anything else today.
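Not the original script, but a sketch of roughly what that looks like under wish (names and text are made up):

# run as: wish remind.tcl "Stretch your legs"
wm title . Reminder
pack [label .m -text [lindex $argv 0] -padx 20 -pady 20]
pack [button .ok -text OK -command exit]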
Tcl is one heck of a Swiss-army-knife solution. The core team is of the first order. The best thing is that it's so easy to use pipes to integrate a C/C++ console-mode program and control it via stdin/stdout/sockets. That turns out to be a significant architectural plus: rather than using some MVC C++/Java-type library, you get complete separation of concerns. You don't even have to integrate it tightly with C code; just use stdin/stdout and wrap a Tcl package around the pipe.
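A minimal sketch of that pattern, assuming a hypothetical ./engine binary that answers one line per command:

set io [open "|./engine" r+]     ;# r+ hands us both its stdin and stdout
fconfigure $io -buffering line
puts $io "status"                ;# write a command to the program's stdin
gets $io reply                   ;# read its one-line answer from stdout
puts "engine says: $reply"
close $io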
Tcl might have been too easy. I also recall ESR dismissing it at one point in favor of Python.
Isn't that what Ousterhout and his students designed Tcl for? I fooled around some with AOLServer years ago, and have used it with Oracle's "Intelligent Agent".
But I didn't know how much I liked it until I Googled for "PHP upvar" and found a forum thread with a bunch of people telling some guy that he was a language fascist for wanting such a thing.
What's often not as well known is that Tcl was node.js before there was node.js (or, well, Twisted, POE, etc.). It has a pretty powerful event loop (moved out of Tk into the core language originally, if I remember correctly), and it's very easy to tie in scripts to add functionality. So it became quite popular for systems that had to communicate with lots of different devices, gathered information from a lot of sources, etc.: centralized information-processing hubs. Never mind that it was very easy to add GUIs to monitor them or enter information. Back in the day, quite a few companies had pretty huge Tcl-based infrastructure, and despite the lack of support for "programming in the large", they were surprisingly easy to decipher and extend.
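For flavor, a minimal sketch of that event-driven style in plain Tcl (host, port, and protocol are hypothetical):

proc onReadable {chan} {
    if {[eof $chan]} { close $chan; set ::done 1; return }
    if {[gets $chan line] >= 0} { puts "got: $line" }
}
set chan [socket localhost 12345]                  ;# some line-oriented service
fconfigure $chan -blocking 0 -buffering line
fileevent $chan readable [list onReadable $chan]
after 1000 {puts "timers run in the same loop"}
vwait ::done                                       ;# enter the event loop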
I still don't know why Tcl had problems entering the WWW age. It started out pretty well: AOLserver was/is a pretty capable and performant system, and they even had browser plugins (i.e. as a Java rival). But once the plethora of web services hit the landscape and we all went "Web 2.0", the lack of a proper CPAN equivalent and the slightly outdated Tk look caused an exodus to more hip scripting languages. The sad thing is that Tcl has since overcome all those troubles (modules, packages, native look and feel), but probably a bit too late.
I also wish I understood why Tcl was not more widely used in the early days of the Web. HTML is a string-based protocol, and Tcl is optimized for handling strings -- it's an absolutely natural fit.
Bugzilla, for example, was originally written in Tcl; but for reasons I have never been able to discover, it was completely redone in Perl. Anybody here know why?
Well, Tcl suffered from a bad reputation for a while, mainly due to the "everything's a string" mantra leading some to believe it's not much use beyond that, RMS's flame war against it (which is why we're all using Guile now), and mostly the simple but slightly odd syntax (expr, array and list functions).
Then you've got to remember that for a while, Sun owned Tcl. Pretty much the same period when they were hyping Java…
Also, the core Tcl distribution was still focused on scripting (embedded or not) and GUI programming with Tk. Modularity back in the day often involved separate interpreters with added features (TclX, for example).
Compare that to PHP: Basically the same situation regarding additional modules, but they had everything related to simple web development in one package - parsing requests, handling pictures… You could use PHP to do scripting and even GUIs, and you could use Tcl to do web development. But once you're written off as a niche language, it's hard to escape that trap.
We've got a whole bunch of interpreted languages out there, but only a few of them are considered all-purpose languages. Ruby and Python mostly, even Perl has to fight a bit against the Unix scripting preconceptions.
Tcl wouldn't be the first scripting language of that period to get "lost". Anyone remember Pike or Icon?
I used to really like Tcl, and it has one of the nicest communities, but I think it needs two things:
1. a CPAN
2. a killer app (and no, AOLserver and OpenACS don't count)
I work for a medium-large technology company supplying mission critical software (very large amounts of money are involved). We use Tcl for the vast majority of our software. We may head towards Java in the future, but not so much due to any failings of Tcl, more because it is easier to recruit Java programmers.
I think the general consensus at our company is that Tcl was the right choice, although our programmers do occasionally gripe at some Tcl shortcomings.
I love the Tcl foreach loops. Tcl is the only language I know of that lets you iterate through multiple data structures with the same loop. It is actually really useful. Just try this statement in any other language:
foreach animal { "dog" "cat" "bird" } color { "brown" "black" "red" } {
    puts "See the $color $animal?"
}
Common Lisp:
(loop :for animal :in '("dog" "cat" "bird")
      :for color :in '("brown" "black" "red")
      :do (format t "See the ~a ~a?~%" color animal))
See the brown dog?
See the black cat?
See the red bird?
How far can we take the example?
(loop :for animal :in '("dog" "cat" "bird")
      :for color :in '("brown" "black" "red")
      :for count :from 2
      :do (format t "See the ~a ~a ~as?~%" count color animal))
See the 2 brown dogs?
See the 3 black cats?
See the 4 red birds?
How about this?
(loop :for animal :in '("dog" "cat" "bird")
      :for color :in '("brown" "black" "red")
      :for count :from 2
      :sum count :into total
      :do (format t "See the ~a ~a ~as?~%" count color animal)
      :finally (format t "See all ~a?~%" total))
See the 2 brown dogs?
See the 3 black cats?
See the 4 red birds?
See all 9?
I never knew about this zip command. I guess it brings the same utility offered by the Tcl foreach loops to Python.
Still, I wish more languages directly imitated the Tcl foreach loop. I've found that when I code, I often want to do some simple task 3-4 times. I am always tempted to create a little function and call it repeatedly, but that seems like overkill. With Tcl, I just write a simple loop statement instead of the function. I've found that my code is more compact, and the number of functions I declare is reduced.
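Worth adding that Tcl's foreach does more than zip: it also takes several loop variables per list and strides through flat key/value lists:

foreach {key val} {a 1 b 2 c 3} {
    puts "$key=$val"      ;# prints a=1, b=2, c=3
}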
a:`dog`cat`bird
c:`brown`black`red
{`0:,//("See the ";$x;" ";$y;"?\n")}'[c;a]
The string formatting is a bit ugly because I'm not using a dedicated format function, but foreach with multiple data structures is just f'[list;of;arguments].
(doseq [[a c] (map vector ["dog" "cat" "bird"] ["brown" "black" "red"])]
  (println (str "See the " c " " a "?")))
Haskell:
mapM_ (\(a, c) -> putStrLn $ "See the " ++ c ++ " " ++ a ++ "?") (zip ["dog", "cat", "bird"] ["brown", "black", "red"])
edit: one can do way more complicated stuff with the list monad in Haskell and the very comprehensive core lib in Clojure; I'm sure even these examples can still be made a little bit shorter and more readable.
A Scheme REPL could borrow a leaf from Tcl's book by preprocessing everything you type in to let you (a) omit the outermost parens and (b) mix () and [] freely. It would be an interesting experiment to see whether it had a better feel for this sort of interactive scripting.
Tcl is a bloody good language+ but seems to have languished in obscurity. I can't count the number of times I've heard people in the Python and Ruby (and Lua and Processing, and node.js, for that matter) communities announce some whizzy new thing that Tcl had 5 or 10 years earlier. I've never understood why that is. The problem with the computing community in general is that it can't see over its own shoulder. Every few years, we reinvent the wheel.
+ By this I mean you can get a lot done, in not very much code, and maintain it afterwards, and read and modify other people's code easily.
You're preaching to the choir, comrade. My first professional manager back in '99 taught me the same thing: programming is trendy, and the same concepts keep showing up in new languages and packaging. Learn the concepts and you'll be set for as long as you stick with this career.
Even Alan Kay has been complaining about it since 1997 [1]. Looks like even our complaints get recycled.
OK, so let's enumerate those concepts and stick 'em up as a guide for future generations (probably a good idea to work out what they are ourselves):
1. Code is data, data is code
Implications: everything should be a first-class citizen of the language, able to be passed by reference and altered by functions elsewhere.
Examples: s-expressions in Lisp? Subclassing built-in types of an OO language (Python new-style objects)?
2. Configs are applied to code, so that the same code can run unchanged on dev beta and live.
(This is my favourite reinvented wheel du jour: devops. In fact, the whole 12-factor app thing recently on HN is a great example of rediscovered wheels - or, more fairly, a great example of writing exactly this kind of guide for future generations.)
3. iPads are no way to type into unresizeable text boxes
Let's say you wrote that list, now in 2012. In 2015, people will look at the date on the blog posting and immediately disregard it because it's "old". That is the problem. The knowledge hasn't been lost, over the last 30, 40, however many years. It is being actively shunned by people who take all the complexity involved in a useful system utterly for granted and think they don't need experience or to study the experiences of others.
I find it fascinating, it's as if kids these days believe that their fathers had all the things they take for granted, like smartphones, tablets, AJAX, whatever, and then, because they were stupid, chose to use green screens and FORTRAN instead.
I think the reason we disregard old material is that we were bitten by it while learning programming in the first place. The web is full of old cruft that no longer applies, and some of it is flat-out dangerous, so we shun older information in favor of newer information.
Haha, you are proving my point. "The web", indeed; this problem has been going on since long before there was a web, though the web has certainly made it worse. Everyone thinks they're using the latest, greatest thing; they don't even realize that it's just yesterday's leftovers with different buzzwords attached. That's the tragedy here. And who is to say the new information is any better? At its very best, it's just repeating what has gone before. Why not go to the original and learn for yourself?
A programming language runtime is a whole package; just because some elements of previous system designs are reused does not mean that we are "reinventing the wheel."
To pick Lua (my favorite of the technologies you listed), it has the following things that Tcl does not have:
- strict ANSI C implementation
- small implementation (15k sloc vs. 110k-160k for Tcl, depending on how you count)
- a fast implementation, with an even faster JIT
If you follow your argument to its logical conclusion then everything is just a reinvention of Lisp (including Tcl). But we have tools available to us today (like Lua) that are strictly better than Lisp or Tcl for some use cases.
No, it's not. It started off that way, but a modern Lisp is rather far from "just an implementation of the lambda calculus". Named functions, for one, are a significant departure.
From the article: Use Lua for scripting. Use Tcl for command line interaction.
He has a point there. Lua is nice enough to write, but it is `func(arg, 'strarg')` where Tcl is apparently `func $arg arg`. If I have to type that into an interactive prompt 200 times, I'd take the latter. If I want to write some complex program, I'd take the former. In other words, Tcl is optimized for writing, Lua is optimized for reading.
Lua is designed specifically to provide a scripting/extension language for projects in C or C++, or projects where interop with C is key. This is where it really shines -- where most Lisps expect to be the primary language, but have FFIs to C for performance and integration reasons, Lua expects to be used for writing plugins, configuration, and the like. It handles those tasks well, and otherwise stays out of the way.
While it's good as a standalone language, it's an outstanding extension language. It's small, trivially portable, and has excellent interop with C. It's not competition for Lisp in general so much as Emacs Lisp or vimscript, designed for systems that have a core written in C but need a lot of customization layered on top.
Configuration for window managers, scripting enemy behavior for game engines written in brutally optimized C++, that kind of thing. Systems with a "hard and soft layer" architecture (http://c2.com/cgi/wiki?AlternateHardAndSoftLayers). (Incidentally, Tcl was also designed with this use case in mind.)
That said, if you're using Lua as your primary language but are proficient in C, you can add a lot of power to it pretty easily. The emphasis on C interop goes both ways. (Also, the implementation is very intelligently engineered, and is worth study if you're interested in language design or virtual machines.)
Thanks for the comprehensive reply; however, a quick search turned up a couple of small, portable and embeddable Lisp dialects (Hedgehog, PicoLisp, ECL) that all seem to target the same niche Lua is apparently designed for. Admittedly, Lua has probably been developed for a longer period of time, but small Lisp interpreters shouldn't be very complex, so I guess they are pretty stable.
Lua is probably going to be faster, but one wouldn't exactly use a scripting language for math heavy stuff etc either.
Lua has had a long time to slowly and deliberately evolve, informed by feedback from a gradually growing group of users.
It's kind of the opposite of Javascript, which went from 0 to release in a few weeks and had a bunch of design errors become permanent as a result. Lua and Javascript have a lot in common, and studying Lua may be a good way to see where Javascript is going in the long term.
Lua is significantly smaller than ECL, and a bit smaller than PicoLisp. According to sloccount, they weigh in at 432,818 (ECL), 15,018 (PicoLisp), and 14,309 (Lua).
Lua has the following advantages over Lisp-the-language:
- only one dialect (sometimes changes between major versions,
but there aren't multiple dialects evolving in parallel).
- infix notation is preferred by many programmers, and is easier
to use for non-programmers.
- the language is small (more like Scheme than Lisp)
The Lua and LuaJIT implementations have the following advantages over SBCL (you didn't say which Lisp, so I'm just picking what seems to be the most popular one):
- extremely portable
- extremely small
- easy to sandbox
- very good integration with existing C and C++
A personal opinion: Lua comes closer to having predictable control flow, as more program structure is built into the language. One of the headaches I had with Tcl was maintaining heaps of scripts that implemented structural features (OO, object references, list comprehensions, to name a few that come to mind) in different ways _because you can_. Doing things the elegant way can sometimes be the enemy of doing things the predictable way.
Strictly speaking this is a limitation of languages like Lua, but in day-to-day use it sometimes becomes an advantage.
The library/packaging system was a disaster - at least in the early versions.
Half the libs needed a source-code change to the Tcl interpreter. There were two main add-ons (I seem to remember one was called "green eggs and ham"???) which were incompatible, so if one lib needed A and another bit needed B, you were stuck.
While it was a disaster then, the current Tcl package implementation surpasses what many other languages offer. It's possible to have multiple versions installed and have different runtimes choose the version they want. The biggest advantage, though, is binary version independence through stubs (http://wiki.tcl.tk/stubs): compile a binary package for 8.x and it will work in 8.y where y >= x.
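For example ("foo" here is a hypothetical package with versions 1.1 and 1.2 installed):

package require foo 1.1            ;# accept any 1.x with x >= 1; the runtime picks
# package require -exact foo 1.2   ;# or, alternatively, pin one exact version
puts [package present foo]         ;# shows which version this interpreter got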
Yes, this was back in the early 90s and I don't even remember what they were called. I don't know if it was really a deficiency of Tcl that you had to patch the interpreter to enable library features, or if the writers were just lazy.
One thing it did have going for it: C plugins were very easy to write, because everything is a string.
Productivity is highly valued. It's easier to be highly productive at reinventing wheels than it is to figure out which existing wheel fits a given problem.
It seems to be trivial productivity. People like to say they've contributed, and it's easier to port trivial libraries to new frameworks. I'm amazed every time I see "XYZ ported to the framework du jour" so celebrated.
Copying is the foundational intellectual activity. Arguably, copying enables natural language and culture. The worldwide human network is engaged in a large-scale genetic algorithm founded on copying. Porting successful features serves as an effective filter for bad ideas: good ideas thrive and are copied everywhere, bad ideas get few copies and eventually die in obscurity.
I did my first professional programming in Tcl/Tk, on a trading system. It was awesome, and even though most of the friends/clients I talk to have never heard of it, I'm glad to see this post.
You can do that in base Tcl; the libraries won't exactly help you, and unless they've fixed the dynamic scoping problem it had a decade ago when I last looked at it, closures would be obnoxious, but the language itself is really clean and simple, very nice. The implementation is horrible, and the original designers didn't really understand what they had, which is why you have really weird libraries like Tk that don't compose especially well. (Briefly: in Tk essentially every GUI object is global and named, and the naming reflects the hierarchy. It's possible to write GUI libraries for it, but it's not trivial.)
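A tiny illustration of that naming scheme (widget names are mine): the path name spells out the hierarchy, and any code anywhere can reach any widget by its global name:

frame .toolbar
button .toolbar.save -text Save -command {puts saved}   ;# a child of .toolbar, globally addressable
pack .toolbar
pack .toolbar.save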
http://jim.tcl.tk/index.html/doc/www/www/index.html (source code is here: https://github.com/msteveb/jimtcl)
Now maintained by Steve Bennett, and actively used by some embedded folks.