In my first job out of college, I worked for a company that built industrial test equipment (machines that snapped airplane wings in half, shook cars around like they were driving down a bumpy road, that sort of thing).
The tool we used to write the supervisor code and GUIs for machine operators was a language called Alltalk. It was inspired by Smalltalk and developed by an old graybeard at the company. Reading the description of Squeak/Smalltalk brought back some fun memories. A few interesting highlights:
- Alltalk ran in its own VM. You could create objects, change their state, and save the entire stack/image back to the original VM. Running the VM again would pick up execution right where it left off. This let someone do crazy things like email an Alltalk image to another engineer and say, "Here's the machine state halfway through an emergency shutdown. The actuators all came to rest in 10ms, but it's taking way too long to shut down the hydraulic pumps. Any ideas?" and the engineer could run the VM and debug away.
- The Alltalk VM had its own object inspector and terminal/interpreter. So you could walk up to an Alltalk instance connected to a live machine with motors spinning at 20krpm, for example, open up the inspector, and code away. Realize you need a low-pass filter on the motor speed feedback sensor? Create the object, tweak the parameters, and wire it into the signal chain live.
Another avenue for learning is to start from hard problems. A hard problem is a problem that is very hard without using a known technique or solution. Somebody who's unfamiliar with it has no clue on how to even start solving it. Solving such a problem will add new techniques to your tool-belt for solving a whole class of problems.
Problems that I've found worth studying:
- Parsing. This is hard if you don't know the standard techniques: mostly recursive descent parsing with operator precedence, or a parser generator (though that won't teach you much unless you write the generator yourself). Moreover, I've found that you quickly run into limitations with parser generators, and they don't make parsing easier in the long run; that's probably why most real-world parsers are hand-written (e.g. gcc, llvm). You can probably hack something together that sorta works, but knowing the right techniques makes writing an efficient parser that you have high confidence in a breeze. I've seen many ad-hoc parsers that would have benefited greatly from proper parsing techniques.
- Constraint solving. Examples are solving sudoku, the zebra puzzle, SAT solving, optimal register allocation, and many others. The three most important techniques are backtracking, constraint propagation, and conflict learning. Even professional programmers can't do this if they don't know these techniques (e.g. http://xprogramming.com/articles/oksudoku/).
- Interpretation/Compilation. In addition to being a hard problem, writing an interpreter and a compiler makes you understand how programming languages work. This is sometimes seen as a black art, but it is quite easy once you've got the pattern.
- Numerical algorithms. The speed and simplicity of numerical algorithms are astonishing. Newton's method is particularly amazing: in about 10 lines of code you can solve equations, do state-of-the-art numerical optimization, or solve differential equations. It can even handle problems that are usually considered to require specialized algorithms, like max flow and linear programming. It can also perform arithmetic operations, like division and square root.
- Machine learning. This gives you a different perspective on problem solving. Many people, when presented with the task of spam filtering or determining the language a snippet of text is written in, will reach for a hard-coded scheme based on a lot of rules. A much better approach is to take a data set of known text, analyze its statistics, and compare those with the statistics of the snippet in question. The same applies to many other problems.
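To make the parsing bullet concrete, here is a minimal sketch of recursive descent with precedence climbing. It evaluates as it parses rather than building a tree, purely for brevity; all names and the toy grammar are my own, not from any production parser:

```python
import operator
import re

# Binding power of each binary operator; higher binds tighter.
PREC = {"+": 1, "-": 1, "*": 2, "/": 2}
OPS = {"+": operator.add, "-": operator.sub,
       "*": operator.mul, "/": operator.truediv}

def tokenize(src):
    return re.findall(r"\d+|[-+*/()]", src)

def parse(tokens):
    """Recursive descent with precedence climbing, evaluating as it goes."""
    def atom():
        tok = tokens.pop(0)
        if tok == "(":
            val = expr(1)
            tokens.pop(0)  # consume the closing ")"
            return val
        return int(tok)

    def expr(min_prec):
        lhs = atom()
        while tokens and tokens[0] in PREC and PREC[tokens[0]] >= min_prec:
            op = tokens.pop(0)
            rhs = expr(PREC[op] + 1)  # the +1 makes operators left-associative
            lhs = OPS[op](lhs, rhs)
        return lhs

    return expr(1)

result = parse(tokenize("2 + 3 * (4 - 1)"))  # 11, because * binds tighter than +
```

The whole precedence table lives in one dict, which is the part ad-hoc parsers usually get tangled up in.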
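For the constraint-solving item, here is a sketch of plain backtracking on sudoku, with per-cell candidate filtering as a light form of constraint propagation (conflict learning is what full SAT solvers add on top, and is not shown). The grid representation is an assumption of mine: a 9x9 list of lists with 0 for empty:

```python
def solve(grid):
    """Backtracking sudoku solver; grid is a 9x9 list of lists, 0 = empty."""
    def candidates(r, c):
        # Propagation step: rule out digits already used in the row,
        # the column, and the 3x3 box before guessing anything.
        used = set(grid[r]) | {grid[i][c] for i in range(9)}
        br, bc = 3 * (r // 3), 3 * (c // 3)
        used |= {grid[i][j] for i in range(br, br + 3)
                            for j in range(bc, bc + 3)}
        return [d for d in range(1, 10) if d not in used]

    for r in range(9):
        for c in range(9):
            if grid[r][c] == 0:
                for d in candidates(r, c):
                    grid[r][c] = d        # guess...
                    if solve(grid):
                        return True
                grid[r][c] = 0            # ...and backtrack on conflict
                return False
    return True  # no empty cell left: solved
```

Without the `candidates` filter this would still terminate, but the search tree explodes; propagation is what keeps backtracking tractable.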
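The interpretation bullet is less mysterious than it sounds: the pattern is a recursive walk over a syntax tree. A minimal sketch for a made-up expression language (the tuple encoding and the `let` form are my own illustration, not a real language):

```python
import operator

# Programs are nested tuples such as ("+", ("*", 2, 3), 4).
OPS = {"+": operator.add, "-": operator.sub,
       "*": operator.mul, "/": operator.truediv}

def evaluate(node, env):
    if isinstance(node, (int, float)):
        return node                     # literal
    if isinstance(node, str):
        return env[node]                # variable lookup
    op, *args = node
    if op == "let":                     # ("let", name, value_expr, body)
        name, value, body = args
        return evaluate(body, {**env, name: evaluate(value, env)})
    return OPS[op](*(evaluate(a, env) for a in args))

result = evaluate(("let", "x", 5, ("+", ("*", "x", "x"), 1)), {})  # x*x + 1
```

Everything else in a real interpreter (closures, control flow, errors) is more cases in this same dispatch; a compiler follows the identical walk but emits code instead of computing values.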
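The Newton's method claim really is about 10 lines. A sketch, including the "division without dividing" trick the bullet alludes to (the starting guess for the reciprocal is an assumption; it must lie in (0, 2/a) to converge):

```python
def newton(f, df, x, tol=1e-12, max_iter=100):
    """Solve f(x) = 0 by Newton's method: repeatedly x <- x - f(x)/f'(x)."""
    for _ in range(max_iter):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            return x
    raise ArithmeticError("Newton's method did not converge")

# Square root of 2 as the positive root of x^2 - 2 = 0.
sqrt2 = newton(lambda x: x * x - 2, lambda x: 2 * x, x=1.0)

# Division without dividing: 1/a is the root of f(x) = 1/x - a, and
# Newton's update simplifies to the multiply-only iteration x*(2 - a*x).
def reciprocal(a, x=0.1):
    for _ in range(60):
        x = x * (2 - a * x)
    return x
```

The reciprocal iteration is essentially how hardware divide units bootstrap division out of multiplication.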
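And for the machine-learning bullet, a sketch of the statistics-over-rules approach applied to language identification. The corpora here are toy one-liners standing in for real training data (an assumption: in practice you would build each profile from large amounts of known text per language), and trigram overlap stands in for a proper probabilistic model:

```python
from collections import Counter

def trigram_profile(text):
    """Character-trigram counts: the 'statistics of the data set'."""
    text = text.lower()
    return Counter(text[i:i + 3] for i in range(len(text) - 2))

def overlap(snippet, profile):
    """How strongly the snippet's trigrams overlap a language's profile."""
    return sum(min(n, profile[g])
               for g, n in trigram_profile(snippet).items())

# Toy training corpora; real profiles would come from much more text.
profiles = {
    "en": trigram_profile("the quick brown fox jumps over the lazy dog "
                          "and the cat sat on the mat"),
    "es": trigram_profile("el rapido zorro marron salta sobre el perro "
                          "perezoso y el gato en la alfombra"),
}

def guess_language(snippet):
    return max(profiles, key=lambda lang: overlap(snippet, profiles[lang]))
```

No rules about accents, articles, or word lists: the data does the work, and adding a language is just adding a profile.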
I find this list quite striking. Actually I've not seen those particular topics all listed in one place before.
I have to admit I am slightly interested in the psychology of that list. Would this list be immediately on the mind of someone who teaches computer science for a living? Would it be made by a self-taught programmer who happens to have studied all of these topics? Or is it just received wisdom that these are complex programming tasks?
I think if I knew something about each of these, at least enough to say something non-trivial, I would explore them together in a blog or article. Somehow I find the list interesting because in some way it represents where computing is headed in the next few decades, and it should therefore inform the design of the languages of the future (yes, I accept that it has informed the design of some of the languages of the past).
Great list. I found this comment more interesting than the OP, personally.
The best addition to the list that I can think of is natural language processing. Extracting information from text, especially. There's a ton of interesting pieces, and most of them are useful for one kind of large-scale internet application or another.
That leaves numerical algorithms and machine learning, which I agree are useful to understand anyhow and different programming languages offer little leverage ;)
Haskell and parsing combinators takes an enormous bite out of the parsing problem, too. I won't call it "solved" but it brings it down to the point that writing a parser for a custom minilanguage is more like "a day's work" than "a month's work".
At the level they are supported in at least one language, these problems are not hard in many other languages either; e.g., `import something` solves many established problems in Python. http://xkcd.com/353/
Mercury (relative of Prolog) greatly simplifies constraint solving and parsing as well. Packrat parsers for PEGs (a kind of unambiguous grammar) are a natural consequence of Mercury's support for DCG notation and memoization. Mercury also has builtin support for user-defined constraint solvers.
Most languages are "worth checking out." Every language offers something unique, by definition. The question is: which languages are "worthier" than others? Or, rather: how should you prioritise your learning queue?
I don't think the author's list provides much variation. They're all important languages, to be sure, but I believe you can get better mileage for your time.
My ten recommendations, in recommended order, are:
- Racket (née Scheme; why did they have to change the name!?)
- Haskell
- Java (reading the GoF)
- C, and the POSIX libraries and system calls
- Go
- Javascript or Lua
- Smalltalk or Squeak
- Erlang
- Forth
- Prolog
Indeed it is an article I wrote in 2008! I was quite surprised to see it on Hacker News only now... oh well, maybe I should write another one at some point.
Things are changing, but not that fast I must say. Go is definitely the biggest omission on that list, but again, it didn't exist back then!
As far as I can tell, not too much has changed in the language landscape since then. Io and Factor seem to have stalled a bit. Scala and Clojure seem to have picked up a little steam. In the time since then, Go is the only language to make enough of a splash that I would give it any thought. For me at least, F# is the only language older than that which has become interesting enough to learn.
I'm using Io in a serious project: it is the scripting language I chose for my games. Io's expressiveness and reflection have allowed me to make lots of mini-DSLs to simplify game scripting tasks. I'm very happy with the performance of its garbage collector; it's pauseless, and I have profiled it at using only 10% of the CPU time while running a game. I think the VM code has been stable for some time now, and the move to CMake cleaned up the build process. The only disagreement I have is over coroutines: they do weird things to the stack, and they don't play nicely with C++ exceptions if you throw one and let it cross a coroutine boundary. I've started a project to port Io's C code to C++ and replace the coroutines implementation. Most of my projects on github are related to Io in some way: http://github.com/dennisferron
I think Io is one of the most inventive languages I've played with in years. Lots of interesting ideas, and as _why said, "it has a very clean mirror" that gives you special powers.
I wouldn't recommend it if you want to get something done. It is an interesting way to explore the set of features they have chosen for the language (prototype-based OO, message passing, code easily modifiable at runtime). The point of the language seems to be more art than tool:
Io's purpose is to refocus attention on expressiveness by exploring higher level dynamic programming features with greater levels of runtime flexibility and simplified programming syntax and semantics.
Regarding the Joy programming language on your list, the author (Manfred von Thun) has stated that he no longer plans to develop Joy and recommends looking at Factor for a superior alternative.
The thing I never understand about these articles: why aren't C, Java, C++, C#, Ruby, Python and other mundane languages "worth checking out?"
I don't think you should assume people have used all of these, and they're all worth "checking out". Millions of people use these languages every day to create 99% of the software you're using right now. If you don't know one of them, maybe it's worth learning one and perhaps even getting a job with it. There's a lot more opportunity to learn Ruby and get a job in it than there is for Scala.
I think it is more that many programmers are assumed to know, or at least be familiar with, those languages. I know Java and Python. I have some limited experience with C, C++, and C#. I've looked at Ruby. I would assume the majority of current programmers are in a similar situation.
The draw of these other languages, like Factor, Io, Erlang, and Haskell, is that they are different. Erlang focuses on concurrency in a way that is not seen in the common languages listed above. Haskell is a strongly typed functional language with lazy evaluation (I think I have that right). I don't know where Factor and Io fit in. By learning these other languages, I would hope to learn different ways to solve problems or to think about things. I would argue it isn't about learning them for more opportunities, but for a broader perspective in general.
The only reason to move from Java to C#, say, is if C# offers a library or tool Java doesn't, which is something external to the language. This is less true for Perl-to-Python moves, but Perl and Python aren't that different, either. You can bring the same conceptual toolkit to all of those mundane languages and get most of the same things done in the same amount of time.
Moving from any of those languages to Haskell or Erlang is going to turn your brain inside-out for a while and when the learning process is done you'll likely approach every other language a bit differently. That is a good reason to learn those languages.
There are a couple of languages I'd add to the list:
* Mozart/Oz - will (probably) change your thinking about concurrency.
* Clay - really pushes the idea of generic programming.
* Rust - typestate makes assertions part of the type system.
* Cilk - concurrency again.
BitC was looking quite interesting too, but I haven't heard anything about it for some time now; I hope the project hasn't died off.
fogus links to his list http://news.ycombinator.com/item?id=3083561 which I think is one of the better lists of this type since it covers the paradigm space more thoroughly than most such lists.
The only type of language I've rarely seen mentioned is dependently typed languages like Agda or Epigram. Maybe it is because they are not yet practical. They are an interesting direction things could take, though. Fortress is another interesting one.
Another interesting language is Aldor. It's unique in that it has a weak form of dependent types and is a statically typed computer algebra/general-purpose programming language. Going through the types of the language is an education in itself and a reasoning aid. While some take issue with the hierarchy it defines, it is the only one I know of that has tried and is useful. The types provide some scaffolding to support a novice math person trying things out.
Haskell at the top of the list was one that I really enjoyed. It has a pretty scary learning curve, but it's wonderful having a language that feels so consistent.
I disagree. I think a lot of uninitiated potential Haskell users are turned off by hearing about the "scary" learning curve. The key is to forget everything you know about imperative programming going in. There's a temptation to compare each Haskell idea with the similar idea in whatever language(s) you already know. Approaching with an open mind turns that "scary" learning curve into an easy and rewarding journey.
The scary learning curve isn't really a bad thing; I was never turned off by it. The main turn-off for any programming language, for me, is going to be resources, which is why I chose Python over Ruby (as Python has, in my opinion, much better documentation and community).
As a disclaimer, I've never really learned Haskell yet; I do go through tutorials and solve some problems with it every now and again, but I haven't fully grokked the language.
If you're interested in actually checking these out, 7 Languages in 7 Weeks (http://pragprog.com/book/btlang/seven-languages-in-seven-wee...) covers five of these (Haskell, Scala, Io, Clojure, Erlang) as well as Ruby and Prolog (Prolog is my favorite language from the book, oddly missing from this list). Apart from being a very good introduction to syntax, it teaches you the unique features of each language and why you should care about it. Highly recommended.
Try it. Io is really a joy to use. I don't use it for any professional work yet but I've used it to craft some little tools for my personal use. Really fun to use.
I skimmed the headings and was briefly interested in reading about this "Epilogue" language that I hadn't heard of before, but it turns out it was actually just an epilogue.
The article is OK, but I had to laugh at this opening quote:
“The most obvious common ‘personality’ characteristics of hackers are high intelligence, consuming curiosity, and facility with intellectual abstractions. Also, most hackers are ‘neophiles’, stimulated by and appreciative of novelty (especially intellectual novelty). Most are also relatively individualistic and anti-conformist.” – Eric S. Raymond, The Jargon File
Which reminds me of that immortal Reservoir Dogs line: "let's not start sucking each other's dicks quite yet"...
>Which reminds me of that immortal Reservoir Dogs line: "let's not start sucking each other's dicks quite yet"...
That's the thing about the word "hacker": in almost every single context where it is used, it's either used wrongly or comes across as pompous bullshit. Including on this website.
I agree, but only in that he limits the statement to hackers. All creatures are driven by novelty. Sometimes this makes it seem as if we are "pleasure driven," but really it is novelty we seek. I have been thinking about this a lot lately, and it answers the question of why things like Google, Twitter, and Flickr thrive: novelty aggregation.
Cowboy coding at its absolute finest.