Nice article (the Stanford Encyclopedia of Philosophy has many more, btw.), but I like the "Introduction to Lambda Calculus" by Henk Barendregt (one of the leading researchers in the lambda calculus; see for instance the citation of his book "The Lambda Calculus: Its Syntax and Semantics" in the references), which has the added benefit of being free:
The lambda calculus is more widely applicable than most people realize, too.
One of the dominant modeling paradigms in formal semantics (i.e., the study of the meaning of human language) is built around a typed lambda calculus: Montague Semantics (see http://plato.stanford.edu/entries/montague-semantics/ )
Montague Semantics is really interesting — and totally inadequate! Still, it's a very good approach, and all formal systems for natural language semantics we know of today are still totally inadequate. It's proving to be very difficult to model natural languages as formal languages.
The basic idea of MG (and CCG) is to treat lexical items ("words") as functions in the lambda calculus, and to use higher-order functions to compose these lexical functions, yielding truth conditions in a logical language (anything from predicate logic, through intensional or two-sorted type logic, to higher-order logic, dynamic logic, probabilistic logic, etc.)
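To make that concrete, here's a minimal sketch in Haskell (the two-entity model and the tiny lexicon are invented for illustration, not taken from any particular fragment): an intransitive verb denotes a one-place predicate of type e -> t, a quantified subject denotes a higher-order function of type (e -> t) -> t, and the sentence's truth conditions fall out of plain function application.

    -- A toy model with two entities (purely illustrative).
    data Entity = John | Mary deriving (Eq, Show)

    -- An intransitive verb: a one-place predicate, type e -> t.
    sleeps :: Entity -> Bool
    sleeps John = True
    sleeps Mary = False

    -- A quantifier: a higher-order function, type (e -> t) -> t.
    everyone :: (Entity -> Bool) -> Bool
    everyone p = all p [John, Mary]

    -- "Everyone sleeps" composes by function application.
    everyoneSleeps :: Bool
    everyoneSleeps = everyone sleeps   -- False in this model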
Some of the more interesting extensions of traditional MG are introduced by Combinatory Categorial Grammar, which, you guessed it, uses combinators to enable compositional analyses of lexical items even in complex syntactic constructions. Another very interesting result is that continuation-passing-style transformations on these combinators and lexical functions seem to be rather effective! Read Barker 2004 for an overview.
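The connection is tighter than it might look: CCG's type-raising combinator is, if I have the types right, exactly the CPS lift that turns a value into a function over its own continuation. A sketch, reusing the toy lexicon from above:

    -- Type raising: an entity becomes a function over predicates,
    -- i.e. the CPS lift \x k -> k x.
    typeRaise :: a -> ((a -> r) -> r)
    typeRaise x = \k -> k x

    -- Lifting John lets the subject take the verb as its argument:
    -- typeRaise John sleeps == sleeps John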
It's a very interesting field of study, but almost entirely academic. There is very little commercial interest nowadays, as everyone has moved on to statistical NLP.
Absolutely, it is inadequate for modeling language, but then everything is! (My assertion is that so much contextual information is required to converse with humans that one needs to build a proper AI in order to have unrestricted conversation. That or you have to reliably constrain conversation to a ___domain)
The importance of systems like Montague Semantics is to scope out what sort of machinery we need in order to model language in the abstract.
Statistical NLP is definitely way better for building engineering systems that are deployed to accomplish particular tasks.
Yes, just because something is incredibly hard doesn't mean you have to stop pursuing it. We thought in the fifties and sixties that, with rapidly increasing computing power, modelling natural language would soon be within reach.
Oh, how wrong we were. Natural language (and human thought for that matter, because ultimately, AI and NLP might be two facets of the same problem) is so much more complicated than we imagined.
Researching human language and thought gives us insight not only sufficient to engineer interactive systems, but also to understand the human condition as a whole. Just take the entire discussion about rigid designators and naming across possible worlds in intensional logics (read "Naming and Necessity" by Saul Kripke). It is but one of the ways in which the need for a good formal approach to language resulted in an amazing philosophical discussion that isn't just about models, but about our understanding of the world. Ultimately, natural language semantics can quickly shade into deep philosophy. It sometimes takes me completely by surprise, actually :-)
If Church numerals’ unary representation bothers you, there is an efficient embedding of naturals in λ-calculus as linked lists of Booleans. Well, more efficient than the alternative…
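For the curious, here's a minimal sketch of that idea in Haskell: naturals as little-endian lists of Booleans, so a number n takes O(log n) cells instead of the Church numeral's O(n) applications. (Lists and Booleans themselves have standard Church encodings, so this really does live inside the pure λ-calculus; the Int conversions below are just for inspection.)

    -- Little-endian binary naturals: least significant bit first.
    type Binary = [Bool]

    toBinary :: Int -> Binary
    toBinary 0 = []
    toBinary n = odd n : toBinary (n `div` 2)

    fromBinary :: Binary -> Int
    fromBinary = foldr (\b rest -> fromEnum b + 2 * rest) 0

    -- Successor walks the list, rippling the carry upward.
    suc :: Binary -> Binary
    suc []           = [True]
    suc (False : bs) = True  : bs
    suc (True  : bs) = False : suc bs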
Very nice article. The HTML version is free; the .pdf version, which will be available after the 21st, will cost you $10 a year. All things considered, a useful service at a very reasonable cost.
Very nice introduction indeed. But there are tons of good introductions to the lambda calculus available for free in any language that computer science degrees are taught in. And many of them go into a lot more detail, in particular on typed lambda calculi.
That $10 also gets you nice pdfs of some really excellent articles on a very large range of philosophical topics (I mean, it is the SEP, after all). A worthwhile use of ten bucks, IMO. (I used to work for Zalta, but I'd have, and express, the same high opinion of the SEP regardless.)
I had never noticed that intension, as used here in "intensional definition", is a different word from intention. I wonder what percentage of people who encountered this stuff outside a formal classroom haven't been confused by this.
The lambda calculus forms the backbone of most theories of programming languages. Functional programmers use enriched forms of it directly, but all other languages, at least the useful ones, make use of lambda-calculus techniques too.
The LC demarcates the border between high- and low-level languages. High-level programming languages have reducible expressions; low-level ones are flat. HLPLs have a notion of hierarchical variable scope, within which some variables are "bound" while others are "free" and escape to a parent environment; LLPLs often have only a flat dictionary of variables, if any at all.
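As a minimal sketch (an invented three-constructor AST, not any particular implementation), here's how "free" falls out of "bound" in Haskell:

    import Data.List (nub)

    -- Untyped lambda terms: variables, abstractions, applications.
    data Term = Var String | Lam String Term | App Term Term

    -- A variable is free unless some enclosing Lam binds it.
    freeVars :: Term -> [String]
    freeVars (Var x)   = [x]
    freeVars (Lam x b) = filter (/= x) (freeVars b)
    freeVars (App f a) = nub (freeVars f ++ freeVars a)

    -- freeVars (Lam "x" (App (Var "x") (Var "y"))) == ["y"]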
The lambda calculus can also be higher-level than mainstream languages. Most PLs treat functions with identical bodies as different, due to name/pointer inequality. In the LC, two terms are equal if they reduce to the same normal form, and the names of bound variables are insignificant (alpha-equivalence).
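That last point is easy to make precise. Reusing the Term type from the sketch above, one way to check alpha-equivalence is to compare bound variables by the depth of their binder (their de Bruijn index) rather than by name:

    import Data.List (elemIndex)

    -- (Term as defined in the previous sketch.)
    -- Corresponding bound variables must be bound at the same depth;
    -- free variables must match by name.
    alphaEq :: Term -> Term -> Bool
    alphaEq = go [] []
      where
        go bs cs (Var x)   (Var y)   = index x bs == index y cs
        go bs cs (Lam x b) (Lam y c) = go (x:bs) (y:cs) b c
        go bs cs (App f a) (App g b) = go bs cs f g && go bs cs a b
        go _  _  _         _         = False
        index x bs = maybe (Left x) Right (elemIndex x bs)

    -- alphaEq (Lam "x" (Var "x")) (Lam "y" (Var "y")) == True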
http://www.cse.chalmers.se/research/group/logic/TypesSS05/Ex... (also covers typed lambda calculus AFAIR)