Programming by poking: why MIT stopped teaching SICP (posteriorscience.net)
478 points by brunoc on May 4, 2016 | 232 comments



Reading this made me so sad, even though I agree with the reasoning.

I learned to program in a course that followed SICP, and I spent all my college years learning how to program from first principles, building all the pieces from scratch: compilers, soft-thread implementations, graph parsing algorithms... and I was happy with that way of programming!

Today I'm an iOS developer; I spend most of my day 'poking' at the Cocoa Touch framework, whose source I can't even read because it's closed. The startup I work for moves so fast that I'm forced to use other people's open source projects without even having the time to read the source. I feel miserable doing the kind of programming I do nowadays! I wish I could go back to simpler times, where I actually had to reason about algorithms, data structures, hardware, etc...


I might get flamed for this here, but I find that enterprises have a lot more work of the fundamental, simpler sort than startups do. At startups, I worked with hundreds of libraries without understanding them, and without the time to even look back at the things we did. At enterprises, there is a lot of low-hanging fruit which, when you apply some magic CS pixie dust, becomes much more performant (to the delight of a lot of people).

This is one of the reasons why I moved away from product-side programming to IT. My brain screams in pain when I do a lot of context switching, and I figured this was not going to make me a better programmer. The fun stuff is in handling data, not in learning layers of APIs (for me).

Now, working in enterprise is not for everyone (the politics, etc.), but for me it still beats the pain of working in a smelly, loud, fast-context-switching, agile-kanban startup.


    I feel miserable doing the kind of programming I do 
    nowadays! I wish I could go back to simpler times, where 
    I actually had to reason about algorithms, data 
    structures, hardware, etc...
I don't subscribe to the school of thought that values engineers lower on the stack more than those higher up, especially since there seem to be a lot more new jobs of the latter sort and we all need to make a living. And there are plenty of cool problems in those spaces (algorithms, data structures, performance... it's all there).

But I think the lucky ones are the people who get to work low enough relative to their knowledge that it doesn't feel like they are dealing with endless abstractions and layer upon layer of magic.

    The startup I work for moves so fast that I'm forced to 
    use other people's open source projects without even 
    having the time to read the source.
:( It sounds to me like this might be the bigger problem. And worse, I suspect it's common.


    I don't subscribe to the school of thought that 
    values engineers lower on the stack more than 
    those higher up
Meaning developers who understand low-level details vs. developers who just wire up high level libraries?

I've done the full range, from entire games written in assembly language and embedded C code, through high level full stack development with NodeJS, Python, and other languages.

The low-level coding is far more skilled work than the high-level coding. Instead of most of the work being "how do these plug in to each other?", it's "how exactly does my code behave with respect to all of the potential events and additional threads, how do all of the edge cases interact, and what is causing this obscure bug?"

While that may not seem intrinsically harder, none of these are typically something you can Google or search StackOverflow for the answers on. So you're really just on your own. And developers who have grown up on "Google all the answers!" often hit a wall when they need to apply real problem solving skills.

Luckily I can find enjoyment at many levels, since a lot of the jobs I've found recently have been more in the "full stack" or "mobile development" category. It's easy and fun work.

I also have little problem piercing the magic and understanding how things fit together, but that means that I end up with opinions on many topics divergent from the crowd's. For instance, I avoid writing Android/Java or iOS/Swift, and instead use cross-platform development stacks exclusively. Yes, it means an entire extra layer of complexity, but it also means I write the app once for all target platforms. Far too big a win to ignore.


> The low-level coding is far more skilled work than the high-level coding. Instead of most of the work being "how do these plug in to each other?", it's "how exactly does my code behave with respect to all of the potential events and additional threads, how do all of the edge cases interact, and what is causing this obscure bug?"

I've done the full range too, and I don't agree that low-level involves more skill. I think it involves different skills. When you're working high-level, you don't have the above questions so much, to be sure. Instead you have questions like, what are the relevant business rules? Which ones are stable and which ones are going to change? Does this user interface suck? Is the program giving answers that actually correspond to reality? Does anyone want this product in the first place? Different questions, but not easier to answer.


> While that may not seem intrinsically harder, none of these are typically something you can Google or search StackOverflow for the answers on.

From my multi-threaded background: the problem is that the code looks simple, and the code actually doesn't get more complicated. Usually, the code you need to write 90% of the time actually becomes easier for most people to write, because they can work within rigidly defined boundaries.

However, explaining why this part of the code base is safely threaded and scales vertically across multiple cores, how we avoided multiple non-obvious bottlenecks, and why the optimization made to the software three services ago still matters... that's going to take hours. And at that point, we haven't even touched lock-free operations yet.

And after that, understanding how to design an architecture from scratch, how to interact with the network and the kernel properly, how to enable distribution of the application in an elastic manner... yeah. Let's get a room.

And these kinds of systems are exactly the systems that benefit from the SICP approach. Thread pools, message queues, databases, the network, and lock-free data structures are all well-understood components that we can combine in structured ways.
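
To make that concrete, here is a minimal sketch (my own illustration in Python, not the poster's code) of composing two of those well-understood components, a bounded message queue and a pool of worker threads, with the one piece of shared state guarded by a lock:

    # Minimal sketch: a bounded queue feeding a fixed pool of worker threads.
    # The boundaries are rigid: producers only put(), workers only get().
    import queue
    import threading

    tasks = queue.Queue(maxsize=100)   # bounded message queue
    results = []
    results_lock = threading.Lock()    # protects the shared results list

    def worker():
        while True:
            item = tasks.get()
            if item is None:           # sentinel: no more work for this worker
                tasks.task_done()
                return
            with results_lock:
                results.append(item * item)
            tasks.task_done()

    pool = [threading.Thread(target=worker) for _ in range(4)]
    for t in pool:
        t.start()
    for n in range(10):
        tasks.put(n)                   # producers stay on their side of the boundary
    for _ in pool:
        tasks.put(None)                # one sentinel per worker
    tasks.join()
    for t in pool:
        t.join()
    print(sorted(results))             # [0, 1, 4, ..., 81]

Each piece is individually well understood, which is exactly what makes the combination reasonable to explain.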


Sorry, that way you do not write an app for all of the platforms. You write for none of them; it just happens to run on one of them.


I test on iPad, iPod Touch, a half dozen Android devices, and on emulators. Works well everywhere, even with platform-specific differences in design to match different platform paradigms. And I have happy clients.

I've run into this attitude before, though. It seems to be the last bastion of denial of folks who have spent a lot of time learning one platform and who don't want to learn something new. Sorry, but there are better ways.


Err, what do you mean? I once wrote the embryo of a game with Qt, and it ran on 3 environments (GNU/Linux, Windows, Android). What do I care that I write for "none" of those platforms if it "happens" to run on all 3?


I think he doesn't necessarily imply anything about being high or low in the chain, but rather something about beauty and understanding.

SICP is one example, but I bet it's even possible to be happy coding GUIs or any other rather mundane things if one does it with an elegant toolset. For example, Oz comes to mind.

Some modern tools are not pleasing to use, often because architectures are a mess. Things move forward too quickly.


What's Oz?



A worse problem would be simulating Linux in your head as you methodically review the source.


I was trained on Scheme 20 years ago. Scheme was the moment when I felt I "got" programming right (before that I did lots of Pascal, BASIC, assembler, etc.).

Now I do Python. And with Python I "got" something too: writing tools, coding, etc. is not like mathematics (like Scheme) anymore. It became a social activity. I'm in the field of ERP right now (arguably not the most theory-oriented stuff). That's because I spend most of my time using APIs made by others and making APIs in the hope that they will be used by others (none of those APIs being worth much in terms of computer "science", i.e. stuff like algorithms). I'm also building tools to augment the productivity of other people, which is also quite social.

So from abstract Scheme programming, I've moved to social Python programming. That keeps me happy (though I must confess that studying Voronoi diagrams or 3D structure packing remains what really makes me tick :-) )


And to make this really annoying, you still need to know low-level algorithms and data structures to make it through most interview processes. What they should really do is give you a crappy API doc and have you make it do something useful.


You're just applying at the wrong companies.


Please take advantage of your job mobility. Changing jobs in our line of work isn't as simple as flipping a switch, but you are still in high demand. There's no need for you to do engineering work you're dissatisfied with.


> I feel miserable doing the kind of programming I do nowadays! I wish I could go back to simpler times, where I actually had to reason about algorithms, data structures, hardware, etc.

Join the TXR project:

http://www.nongnu.org/txr

http://www.kylheku.com/cgit/txr/tree/

I have made some 3,000 commits all by myself since 2009. The main reason this exists is the above: a project where I can stuff ideas, algorithms, whatever, into a coherent whole that has regular releases and is well documented externally. And this coherent whole is useful to me in many ways.


TL;DR but it reminds me of Awk...


Awk also has an input scanning sublanguage which implicitly marches through data, and that language can evaluate expressions and functions in another sublanguage. That's where the analogies end.


>> moves so fast that I'm forced to use other people's open source projects without even having the time to read the source

This.

Moving fast, sprinting, only makes it look like you are making progress. At some point the debt becomes too much.


In a startup context, the trick is to make sure the technical debt comes due after the point at which you find out whether you've got a product people actually want in the first place.


Have you considered getting into embedded development? You tend to be dealing with much lower level code and closer to the hardware. It's fun :)


Thanks for this comment. Actually, I have considered embedded a lot. I love C, especially freestanding C.

I will probably move to embedded soon. The problem is I have a lot of experience with iOS but not much with embedded. I'm willing to take a pay cut just to get out of iOS, but there's a limit to how low I can go.

There are also a lot more iOS jobs than embedded jobs!

edit: Other than C, I'm very curious about Rust and Nim (nim-lang). I do not know C++ though. Hope that won't be a problem.


Yes, making career moves from one technical field to another can be difficult - I hope you manage to find somewhere that'll take you on without putting you over a barrel.

I've found that it entirely depends on what level of embedded development you want to enter. For instance, I mainly work with C++ and MSQL databases on our platform; because our main device runs Linux, this makes sense. But other related products that we sell run on bare metal and are written in pure C.

Having a strong knowledge of C will most certainly get you far in this field, but perhaps learning the ins and outs of C++ wouldn't be such a bad idea if you have the spare time - the more niches you can fit in the easier it will be to pick up that job you really want.


Serious question: What is stopping you from going back?

I'm naive but curious and I do not want to make guesses.

I wonder how we are going to preserve knowledge about programming from first principles if, under pressure from corporations and lazy peers, no one does it anymore.


It's mostly that I got 'stuck' with iOS. Once you specialise, it's hard to get out. If I change jobs now for anything other than iOS, I have to take a pay cut, and I need to find someone who will hire me without any previous experience in... embedded, for example.

There are also a lot more of these 'poking' jobs compared to first-principles coding jobs, especially for someone like me who has never worked for huge corporations.


Not _that_ hard. A gradual transition should work.

If you like video games, learn some OpenGL ES and apply for a mobile game development position; if you like the area, you'll be able to continue your career developing games for other platforms. If you'd like to move to the MS stack, learn some .NET and then look for Xamarin iOS jobs; you'll be able to continue your career developing .NET for Windows. If you enjoy hacking, seek a job at a company that makes UI test automation tools for iOS, or iOS-related security software.

You already have experience dealing with all that iOS weirdness, so you might be able to minimize or even avoid that pay cut.


I think nowadays the most important business in a company is to build things atop fundamental things. For example, we build apps atop Cocoa and web apps atop RoR. That's the business that stops us from going back.


Well, to be honest, that's why there are no iOS apps worth a god-damn. We got a bunch of junk as apps (programs now). So much wasted effort and time.


Closed source? Decompile the fker.


Sure. That makes it easy to reason about the code, see the abstractions clearly (at least as clearly as the author intended), make minor changes and submit patches upstream, recompile it with different options, ...

Oh, wait.


If you don't have time to read the open stuff, you probably don't have time to pick apart the closed.


I'm sure the lawyers in your company will be really happy to hear that.


And why exactly are you wasting your skills on something as pitiful as this? There is a lot of work out there for people who can build complex solutions from first principles.


Even piecing together systems out of existing components significantly benefits from knowledge of first principles.


Maybe. But he obviously feels miserable doing this. What's the point of being miserable when there are so many more fun things to do?


I'm surprised and a bit dismayed to read Sussman's reasoning:

"...Sussman said that in the 80s and 90s, engineers built complex systems by combining simple and well-understood parts. The goal of SICP was to provide the abstraction language for reasoning about such systems.

Today, this is no longer the case. Sussman pointed out that engineers now routinely write code for complicated hardware that they don’t fully understand (and often can’t understand because of trade secrecy.) The same is true at the software level, since programming environments consist of gigantic libraries with enormous functionality. According to Sussman, his students spend most of their time reading manuals for these libraries to figure out how to stitch them together to get a job done. He said that programming today is 'More like science. You grab this piece of library and you poke at it. You write programs that poke it and see what it does. And you say, ‘Can I tweak it to do the thing I want?''. ... "

Surely graduates of MIT, of all places, would be the ones building the libraries that ordinary programmers are stitching together and would still need the ability to build complex systems from fundamental CS abstractions? I'm absolutely baffled.


>Surely graduates of MIT, of all places, would be the ones building the libraries that ordinary programmers are stitching together and would still need the ability to build complex systems from fundamental CS abstractions? What's going on over there?

Ordinary programmers everywhere are building those libraries, just like your assumed wunderkind are building programs out of other people's libraries. The nature of programming has changed for >everyone<.


Yes, but the reason it has changed for everyone is that no one understands the fundamentals any more, because they aren't being taught to anyone. The ecosystem is thus becoming infested by horrible hacks which kinda-sorta work, and which everyone uses, because they kinda sorta work, and there is nothing else. The idea that programmers need to understand how to "program by poking" thus becomes a self-fulfilling prophecy.

[UPDATE:] One of the symptoms of no one understanding the fundamentals is how excited people get about things like XML and JSON, both of which are just (bad) re-inventions of S-expressions.


The ecosystem is thus becoming infested by horrible hacks which kinda-sorta work, and which everyone uses, because they kinda sorta work, and there is nothing else.

Yes, but the reason it has changed for everyone is that no one understands the fundamentals any more, because they aren't being taught to anyone.

I mostly agree with both of these statements, but with a slight twist. To some extent, I feel like the reason that "no one understands the fundamentals any more" is simply because the stack has gotten too deep (or the field has gotten too large if you'd rather visualize it that way). That is, no one has time to learn everything all the way from NAND and NOR gates, up to 7400 series IC's, to microprocessors, to assembly language, to C (portable assembly), to operating systems internals, to TCP/IP, HTTP internals, to HTML, and ultimately to Javascript, and also including a side order of databases, filesystems, machine learning/AI, etc.

At some point people just have to start treating some lower layers as black-box abstractions so they can actually get work done. Of course it is always beneficial to try to learn some of the lower-level fundamentals, but I just don't see any way for everybody to have full knowledge of an entire application stack from end to end.

To me, the best you can do is include fundamentals in an "always keep learning" mindset. That is why, for example, I am still working on learning assembly language (x86/x64) in idle bits of spare time here and there, at the age of 42 and after 20+ years of programming. And why I still go to the hackerspace and build circuits from discrete components and low-level ICs for fun. But as it happens, for most of my career, not knowing assembly or how to assemble a microcomputer from individual ICs has not prevented me from getting useful stuff done.


> no one has time to learn everything

Nonsense. The fundamentals don't take a long time to learn. And once you know them, everything else becomes much easier to learn. That's the reason that learning the fundamentals matters: it's a huge lever.

Here's a single, small, very accessible book that takes you all the way from switches to CPUs:

http://www.amazon.com/Code-Language-Computer-Hardware-Softwa...

SICP gets you from CPUs to Scheme (well, it goes in the other direction, but the end result is the same). That's two books to get from switches to compilers. Anyone who thinks they don't have time for that needs to learn to manage their time better.


Yes, learning the fundamentals is a huge lever. I absolutely agree. But I still stand by the assertion that "no one has time to learn everything" - especially at the beginning of their career.

As the old saying goes "if I had 3 days to cut down a tree, I'd spend the first 2.5 days sharpening my axe". Sure, but at some point you have to actually start chopping. By analogy, at some point you have to quit worrying about fundamentals and learn the "stuff" you need to actually build systems in the real world.

By and large I'm in favor of spending a lot of time on fundamentals, and being able to reason things out from first principles. And when I was younger, I thought that was enough. But the longer I do this stuff, and the larger the field grows, the more I have to concede that, for some people, some of the time, it's a smart tradeoff to spend more of their time on the "get stuff done" stuff.


especially at the beginning of their career.

That's why you spend four years in a university before you start your career. If you're not going to learn all the fundamentals, you might as well go to an 8 week code camp and save your time.


But that's exactly the point... these days, even 4 whole years isn't enough to learn "all the fundamentals", at least not while balancing things so that you learn enough practical skills to also do something useful. So what do we do, have people do a 4 year degree, and then go spend 8 weeks, or 16 weeks, or a year, learning to actually build systems?

I don't know, maybe that is the answer. But I suspect the MIT guys have a point in terms of making some small concessions in favor of pragmatism. Of course, one acknowledges that college isn't meant to be trade school... Hmmm... maybe there is no perfect answer.


> even 4 whole years isn't enough to learn "all the fundamentals"

No. You are missing the point. The fundamentals are very simple and easy to learn. That's what makes them "fundamental." It's all the random shit invented by people who didn't understand the fundamentals (or chose not to apply them) that takes a long time to learn.


I disagree. The field has exploded. It's becoming more and more difficult to take vertical slices of every sub-field. What should we consider fundamental?

Programming languages, automata theory, set theory, compilers, assembly language programming, microprocessors and system architecture, algorithms, graph theory, category theory, artificial intelligence, machine learning, operating systems, parallel and distributed programming, natural language processing, web design, network architecture, databases.

There are plenty of core fields that I've missed. Which are fundamental? Which do we teach? It simply isn't possible to adequately teach everyone the core fundamentals of all of these fields during the course of an undergraduate degree while also conveying the basic design fundamentals that a software developer needs to know. There is just too much in the field to expect every software developer to have a complete understanding of all of the fundamentals in every sub-field. Our field is getting a lot wider and a lot deeper, and with that, people's expertise is forced to narrow and deepen.


There is the actual complexity, and then there is the accidental complexity lamented by the poster you responded to. I would claim both are a thing. Especially in projects where the true complexity is not that great and the theoretical basis of the solution is not that well documented, people have a tendency to create these onion-layered monstrosities (the mudball effect).

If we look at just traditional CRUD-kind programs (i.e. database, view, ___domain logic), these seem to be the most in danger of becoming mudballs. Which is really strange, given that they are the oldest types of programs out there and closely mimic the second use computers were put to (accounting), just after military applications.


> If we look at just traditional CRUD-kind programs (i.e. database, view, ___domain logic), these seem to be the most in danger of becoming mudballs. Which is really strange, given that they are the oldest types of programs out there and closely mimic the second use computers were put to (accounting), just after military applications.

Perhaps because this type of program is so old, it had so much time to stick lots of mud on it. :-)


> What should we consider fundamental?

A fair question, and a full answer would be too long for a comment (though it would fit in a blog post, which I'll go ahead and write now since this seems to be an issue). But I'll take a whack at the TL;DR version here.

AI, ML, and NLP and web design are application areas, not fundamentals. (You didn't list computer graphics, computer vision, robotics, embedded systems -- all applications, not fundamentals.)

You can cover all the set theory and graph theory you need in a day. Most people get this in high school. The stuff you need is just not that complicated. You can safely skip category theory.

What you do need is some amount of time spent on the idea that computer programs are mathematical objects which can be reasoned about mathematically. This is the part that the vast majority of people are missing nowadays, and it can be a little tricky to wrap your brain around at first. You need to understand what a fixed point is and why it matters.

You need automata theory, but again, the basics are really not that complicated. You need to know about Turing-completeness, and that in addition to Turing machines there are PDAs and FSAs. You need to know that TMs can do things that PDAs can't (like parse context-sensitive grammars), that PDAs can do things that FSAs can't (like parse context-free grammars), and that FSAs can match regular expressions, and that's all they can do.
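
As a concrete (and hedged, my own) illustration of what the FSA end of that hierarchy looks like, here is a finite-state recognizer in a few lines of Python; note that it has no memory beyond its current state, which is exactly why it cannot handle nested structure:

    # A finite-state automaton over {'0','1'} that accepts strings containing
    # an even number of 1s. Two states, a transition table, no stack, no other
    # memory -- which is the whole point, and the whole limitation.
    TRANSITIONS = {
        ('even', '0'): 'even', ('even', '1'): 'odd',
        ('odd',  '0'): 'odd',  ('odd',  '1'): 'even',
    }

    def accepts(s):
        state = 'even'                        # start state
        for ch in s:
            state = TRANSITIONS[(state, ch)]  # one table lookup per symbol
        return state == 'even'                # accepting state

    assert accepts("1010")      # two 1s -> accepted
    assert not accepts("100")   # one 1  -> rejected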

You need some programming language theory. You need to know what a binding is, and that there are two types of bindings that matter: lexical and dynamic. You need to know what an environment is (a mapping between names and bindings) and how environments are chained. You need to know how evaluation and compilation are related, and the role that environments play in both processes. You need to know the difference between static and dynamic typing. You need to know how to compile a high-level language down to an RTL.
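
To pin down the environment idea, here is a small sketch (mine, in Python rather than the Scheme of SICP's metacircular evaluator) of an environment as a frame of bindings chained to an enclosing frame, with lookup walking outward along the chain:

    # A frame of name->value bindings plus a link to the lexically enclosing
    # frame. Lookup walks outward along that chain; a dynamically scoped
    # language would walk the caller's chain instead.
    class Env:
        def __init__(self, bindings, enclosing=None):
            self.bindings = bindings      # this frame's bindings
            self.enclosing = enclosing    # the enclosing environment, or None

        def lookup(self, name):
            if name in self.bindings:
                return self.bindings[name]
            if self.enclosing is None:
                raise NameError(name)
            return self.enclosing.lookup(name)

    global_env = Env({'x': 1})
    inner_env = Env({'y': 2}, enclosing=global_env)
    print(inner_env.lookup('y'))  # 2, found in the inner frame
    print(inner_env.lookup('x'))  # 1, found by following the chain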

For operating systems, you need to know what a process is, what a thread is, some of the ways in which parallel processes lead to problems, and some of the mechanisms for dealing with those problems, including the fact that some of those mechanisms require hardware support (e.g. atomic test-and-set instructions).
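
A throwaway Python sketch (my illustration, not the parent's) of the "parallel processes lead to problems" point: an unsynchronized read-modify-write can lose updates, and a lock, one of the mechanisms mentioned above, restores correctness:

    # "counter += 1" is a read-modify-write, so concurrent threads can
    # interleave and drop increments. A lock (ultimately backed by hardware
    # atomic instructions) serializes the critical section.
    import threading

    counter = 0
    lock = threading.Lock()

    def increment(n, use_lock):
        global counter
        for _ in range(n):
            if use_lock:
                with lock:
                    counter += 1   # safe: mutual exclusion
            else:
                counter += 1       # unsafe: may lose updates under contention

    threads = [threading.Thread(target=increment, args=(100000, True))
               for _ in range(4)]
    for t in threads: t.start()
    for t in threads: t.join()
    print(counter)                 # 400000 with the lock; often less without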

You need a few basic data structures. Mainly what you need is to understand that what data structures are really all about is making associative maps that are more efficient for certain operations under certain circumstances.
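
For instance (a toy example of my own, not the parent's), the same association stored two ways has very different cost profiles, which is really all "choosing a data structure" means here:

    # The same name->value association as a list of pairs and as a hash map.
    # Lookup is O(n) in the former, O(1) expected in the latter; a sorted
    # structure would instead favor range queries. Different structures,
    # different operations favored.
    pairs = [("ada", 1), ("bob", 2), ("cam", 3)]
    table = dict(pairs)

    def assoc(name):               # linear scan
        for key, value in pairs:
            if key == name:
                return value
        return None

    print(assoc("cam"), table["cam"])   # same answer, different costs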

You need a little bit of database knowledge. Again, what you really need to know is that what databases are really all about is dealing with the architectural differences between RAM and non-volatile storage, and that a lot of these are going away now that these architectural differences are starting to disappear.

That's really about it.


> You need automata theory... Turing-completeness... PDAs and FSAs...

Why? I know that stuff inside and out, and across multiple jobs I have used none of it ever. What chance to use this favourite bit of my education did I miss? (Or rather, might I have missed, so that you might speak to the general case)

> You need to know how to compile a high-level language down to an RTL.

Why? Same comment as above.

> You need to understand what a fixed point is and why it matters.

Well, I don't, and I don't. I request a pointer to suitable study material, noting that googling this mostly points me to a mathematical definition which I suspect is related to, but distinct from, the definition you had in mind.

Otherwise... as I read down this thread I was all ready to disagree with you, but it turns out I'd jumped to a false conclusion, based on the SICP context of this thread. Literacy, what a pain in the butt.

In particular, the idea that ((the notion of an environment) is a fundamental/foundational concept) is new to me, and immediately compelling. I did not learn this in my academic training, learned it since, and have found it to be very fruitful. Likewise with lexical vs dynamic binding, actually.


>> You need automata theory... Turing-completeness... PDAs and FSAs...

> Why?

So you can write parsers. To give a real-world example, so you can know immediately why trying to parse HTML with a regular expression can't possibly work.
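
To make the parsing point concrete (my sketch, not the parent's): checking that tags nest properly needs a record of what is still open, i.e. a stack, which is exactly the memory a PDA has and a finite-state regex matcher does not. Something like:

    # Regexes are fine for *tokenizing* tags, but deciding whether the tags
    # nest correctly needs a stack of open tags. (Real HTML is messier; this
    # only illustrates the nesting part.)
    import re

    TAG = re.compile(r"<(/?)(\w+)>")

    def well_nested(html):
        stack = []
        for closing, name in TAG.findall(html):
            if not closing:
                stack.append(name)                  # open tag: push
            elif not stack or stack.pop() != name:  # close tag: must match top
                return False
        return not stack                            # everything closed

    print(well_nested("<a><b></b></a>"))   # True
    print(well_nested("<a><b></a></b>"))   # False; no regex alone can decide this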

>> You need to know how to compile a high-level language down to an RTL.

>Why?

So that when your compiler screws up you can look at its output, figure out what it did wrong, and fix it.

>> You need to understand what a fixed point is and why it matters.

> Well, I don't, and I don't. I request a pointer to suitable study material

http://ocw.mit.edu/courses/electrical-engineering-and-comput...

Particularly lectures 2A and 7A.


I'm of two minds about this. Everything you mention is great background to have. (Though non-trivial programs can't be reasoned about mathematically much more than biological systems can).

I think this deep background is a great goal. But in another way programming is a craft. You can learn as you go. There are millions of bright people who could do useful, quality work without an MIT-level CS education. They just need some guidance, structure, and encouragement.


> (Though non-trivial programs can't be reasoned about mathematically much more than biological systems can).

I wouldn't be so sure. I once applied to a company whose business was proving various safety properties of control and signalling applications (for trains). Sizeable applications. They run into the halting problem and combinatorial explosions, but they get around those and do it anyway.


A very vaguely related question: are bindings lexical or dynamic in R? Or would it be fair to say that it's actually both at the same time? Or do we need a new term altogether?

For those unfamiliar with it, in R, environments are first-class objects, and identifier lookup in them is dynamic. But the way those environments are chained for lookup purposes is defined by the lexical structure of the program (unless modified at runtime - which is possible, but unusual) - i.e. if a function's environment doesn't have a binding with a given name, the next environment that is inspected is the one from the enclosing function, and not from the caller. So R functions have closure semantics, despite dynamic nature of bindings.

It would appear that this arrangement satisfies the requirements for both dynamic scoping and for lexical scoping.


> AI, ML, and NLP and webdesign are application areas

On first thought, I do agree. However, they are fundamental applications. Category Theory is categorizing the fundamentals. It uses a lot of the fundamentals on itself, I guess, but that doesn't mean to me that it's inaccessible or useless.


Don't confuse "important" with "fundamental". He probably meant foundational to begin with.

The web for instance is an unholy mess. We can do better for mass electronic publishing. We don't because that huge pile of crap is already there, and it's so damn useful.


Web design IMHO is an extreme example of formatted output. I/O is a fundamental concept.


>What you do need is some amount of time spent on the idea that computer programs are mathematical objects which can be reasoned about mathematically.

Yes please.


I don't know about your university, but mine had at least some coverage of all those categories.

At a minimum an education should give you a strong enough base that you can teach yourself those other things should you so desire.


> Programming languages, automata theory, set theory, compilers, assembly language programming, microprocessors and system architecture, algorithms, graph theory, category theory, artificial intelligence, machine learning, operating system, parallel and distributed programming, natural language processing, web design, network architecture, databases.

That was exactly the curriculum of my CS degree, minus the web design, and I didn't even go to a first-rate CS program like MIT (PennStateU, 20 yrs ago; no idea what the curriculum is now).


" It's all the random shit invented by people who didn't understand the fundamentals (or chose not to apply them) that takes a long time to learn."

Thanks, you succeeded in succinctly verbalizing the agony of so many software projects. However, I would claim it's not just ignorance of the fundamentals. It's simply more fun for a particular kind of mind to invent new abstractions and constructs than to solve actual problems. I call it cargo-cult progress.


> It's simply more fun for a particular kind of mind to invent new abstractions and constructs than to solve actual problems.

Spolsky calls these 'architecture astronauts' (http://www.joelonsoftware.com/articles/fog0000000018.html). (As a mathematician, I have a certain leaning to this mindset, so I clarify that I am not criticising it, just reporting on existing terminology.)


It is absolutely essential to have a theory of the system while implementing it. The software system itself, however, should exhibit only a simple subset of the whole theory, in as concise a form as possible, because usually the full theory becomes obvious only while writing the software, and in practice one uses only a tiny part of it for the implementation at hand.

I think one facet of this is that people have an ambition to build huge systems that encompass the entire universe, while in fact most software systems only need to utilize the already existing capabilities in the ecosystem.

It's as if, because people are eager tinkerers, they approach software projects with the mindset of miniature-railroad engineers, while in fact the proper way to attack the task should be as brutally simple as possible.

The reason huge theoretical systems are a problem in software engineering is that a) they tend to be isomorphic with things that already exist, b) while not implementing anything new, and c) they obfuscate the software system through the introduction of system-specific interfaces (e.g. they invent a new algebraic notation that is isomorphic in all aspects to the well-known one). And the worst sin is that this method destroys profitability and value.


Amen.

How could they ever understand "simple and easy"? Their concept of simple is not based in reality.

There seems to be this idea (I wish it had a name) that one can just apply a new abstraction layer and the lower layers magically disappear or are rendered insignificant.

And of course they produce hundreds or thousands of pages of "documentation". This is disrespectful of the reader's time. The best documentation is and always will be the source code. If I cannot read that, then for my purposes there is no documentation.

This is not to say some of these old-new-thing, higher-level-of-abstraction solutions do not work well, are not easy to use, or do not look great. Many people love them, I'm sure. But until I see more common sense being applied, it is just not worth my time to learn them. I would rather focus on more basic skills that I know will always be useful.


No. You are missing the point. The fundamentals are very simple and easy to learn. That's what makes them "fundamental."

I think what we actually disagree about is just the exact definition of "fundamentals". I personally would not say "the fundamentals are very simple and easy to learn". At least not if you're learning them to the level of depth that I would want to learn them.

Now if we're just saying that people only need a very high level overview of "the fundamentals", then that might swing the equation back the other way. But that, to me, is exactly the knob that the MIT guys were twiddling... moving away from SICP and using Python doesn't mean they aren't still teaching fundamentals, it just sounds like they aren't going quite as deep.

Anyway, it's an analog continuum, not a binary / discrete thing. We (the collective we) will probably always be arguing about exactly where on that scale to be.


> I think what we actually disagree about is just the exact definition of "fundamentals".

That may well be, but as I am the one who first introduced the word into this conversation (https://news.ycombinator.com/item?id=11630205), my definition (https://news.ycombinator.com/item?id=11632368) is the one that applies here.


Fair enough... from that point of view, I think we agree more than we disagree.

And please don't take anything I'm saying here as an argument against learning fundamentals... all I'm really saying is that I can understand, even appreciate, the decision MIT made. Whether or not I entirely agree with it is a slightly different issue. But I will say that I don't think they should be excoriated over the decision.


> But that's exactly the point... these days, even 4 whole years isn't enough to learn "all the fundamentals", at least not while balancing things so that you learn enough practical skills to also do something useful

A minor point to make here, college isn't about learning practical skills; that's what trade schools and on-the-job training are for. College is about learning the fundamentals. However, computer science is not software engineering, and you don't need to learn computer science to be a good software engineer, because software engineers don't do much in the way of computer science anyway.


A minor point to make here, college isn't about learning practical skills;

Yes, on paper that is true. And I alluded to that point in my answer above (or at least I meant to). But from a more pragmatic, real-world point of view, colleges are at least partially about teaching practical skills. I don't know the exact percentage, but I'm pretty sure a large percentage of college students go in expecting to learn something that will directly enable them to get a job. Of course one can argue whether that's a good idea or not, and I don't really have a position on that. But I don't think we can ignore the phenomenon.


>> "no one has time to learn everything" - especially at the beginning of their career.

I wish I had had this book at the beginning of my career: http://www.amazon.com/Elements-Computing-Systems-Building-Pr.... It makes you design the hardware, then the software for that hardware.

It should not take more than 8-12 weeks alongside school work or a day job.


"no one has time to learn everything"

Right. No one has time to learn endless JS frameworks, injections, and reinventions of the wheel. So: (1) read the fucking SICP book, (2) learn about the business problem you are trying to solve, then put 1 and 2 together and "get stuff done".


This is what co-op programs address: a 5-year degree, learning all the fundamentals from silicon to applications, with 6 co-op placements of 4 months each interspersed throughout. That was the Waterloo formula when I went through their CS program, and it works tremendously well. Sure, it's a lot to learn in 5 years, and there's always more to learn, but it does give you a very solid foundation to build on.


> SICP gets you from CPUs to Scheme

I don't recall anything about CPUs in SICP. It's more about data-driven programming and the writing of interpreters.

What I liked about SICP and Scheme programming was that it is a pretty good environment for tinkering: the REPL makes it easy to combine functions and to work in a bottom-up manner. (BTW, you had less of that in Common Lisp, and most other environments teach you to work in a top-down way; however, you can still work with the Python REPL.)

Maybe this bottom-up approach is what Sussman really has in mind when he is talking about first principles, because SICP is really not about working up from the atoms to the scheme interpreter/compiler.


> I don't recall anything about CPUs in SICP.

https://mitpress.mit.edu/sicp/full-text/book/book-Z-H-30.htm...


yes, the compiler target. I stand corrected.


I bought this book for my son who will be starting a CS program in the fall. He seems to have enjoyed it, and I'm hopeful it will give him a good grasp of the fundamentals that you might miss by starting out with Java.


> learn everything all the way from NAND and NOR gates, up to 7400 series IC's, to microprocessors, to assembly language, to C (portable assembly), to operating systems internals, to TCP/IP, HTTP internals, to HTML, and ultimately to Javascript, and also including a side order of databases, filesystems, machine learning/AI, etc.

It's called a Computer Engineering degree, sometimes called EECS (as at MIT). I did it and you can too. The Javascript and HTML were self-taught, admittedly, because they're the easiest parts of that list.


It's called a Computer Engineering degree, sometimes called EECS (as at MIT). I did it and you can too.

And by your own admission you didn't learn all the stack elements I listed above. And that was not in any way, shape, form, or fashion an exhaustive list!

Sure, if you finish a four-year degree in CS/EE/EECS you learn a lot of stuff... and if you spend a big chunk of those four years on the really low-level stuff, you have to trade off time spent on higher levels of the stack. You can only pour so much water into a 1-gallon bucket.

And even then, you only get the fundamentals at a certain level of depth. At some point, one has to ask "how important is it that I be able to go out, buy an 8088 chip, a memory controller chip, a floppy drive controller chip, etc., solder a motherboard together, code an OS in assembly, write my own text editor, etc., etc., etc."

Don't get me wrong. I'm not against teaching the fundamentals, and I'm not even sure I agree with MIT's decision on this. But I will say I can understand it and cede that it has some merit.

And all of that said, I'll go back to what I said earlier... to me, the important thing is to continue learning even after college, including going back to fundamentals like building circuits from individual transistors and whatnot. That stuff has value, but there's no reason to think you can't be productive even without it.

I mean, if you think about it, every field eventually segments into layers where certain practitioners treat some things as a black box. Does an engineer building a car also need to be a metallurgist or materials scientist? No, he/she just grabs a book, and looks up the parameters for the correct material for the application at hand. Etc.


For those who want to learn that information in a structured way, but don't want to accumulate debt doing it, there are some online resources that are meant to be pretty good. I've heard good things about NAND2Tetris, for example; I'd be interested to hear if anyone here has given it a go.

http://nand2tetris.org/


It's not because no one understands the fundamentals; it's that we require so much functionality to be built in such a short space of time, and can afford so much processor time for it, that there's no time to build everything from fundamentals. Thirty years ago you'd build a text editor or modem control interface into your program. Today you need to embed an entire web browser, SQL database, AAA-level game engine, etc. Most 'trivial' stitch-some-libraries-together software built today would take decades to create from scratch.


People were excited about JSON because there was a desperate need for a data format that wasn't overly complex and/or unsafe to parse.

S-expressions just weren't up to snuff because there's no standard way to do key-value mapping.


Sigh. This is exactly the kind of lack of understanding that I'm talking about.

> S-expressions just weren't up to snuff because there's no standard way to do key-value mapping.

That's not true. There are two standard ways to do this: a-lists and p-lists.
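
For anyone who hasn't seen them, here's roughly how those two Lisp conventions line up with the JSON-style mapping, sketched in Python terms (my illustration; the names a-list and p-list come from the Lisp tradition):

    # The same mapping three ways: as a dict (what JSON decodes to), as an
    # association list (a list of key/value pairs), and as a property list
    # (a flat list of alternating keys and values).
    as_dict = {"name": "Ada", "born": 1815}
    a_list  = [("name", "Ada"), ("born", 1815)]      # a-list
    p_list  = ["name", "Ada", "born", 1815]          # p-list
    assert dict(a_list) == as_dict
    assert dict(zip(p_list[::2], p_list[1::2])) == as_dict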


"Two standards" isn't much different from "no standard" in this context. JSON was successful because it had a simple spec that almost anyone could implement for any language with virtually no ambiguity. It's not ubiquitous because it's good, it's good because it's ubiquitous. It's another episode in the long history of "worse-is-better".


Formally, I think this is a good list of fundamentals: programming paradigms, algorithms, data structures, compilers, operating systems, networking, math for CS.

What else would you recommend adding to the fundamentals list?


Digital logic and computer architecture.


Heh, it is also a consequence of the web's limitations (you can't easily replace JavaScript, or anything else, without breaking backwards compatibility) and its associated technologies. You can only poke at the browser, not understand it.

I find it funny that only last year, with MVC scaffolding generators, could I build forms to access a database as easily as I was doing in FoxPro 20 years ago.

Yes, now it is client-server, safe against common hacking attacks, responsive, etc. But it's been 20 years!

And the productivity valley between both points is abysmal.


I'm not sure that it's because no one understands the fundamentals any more (which, BTW, I agree is the case). I think it's because everything's gotten too large. You need to use too many frameworks and too many libraries just to get anything done today, and there's simply too much code there (assuming you're even allowed to look at the source!) to comprehend it.

I don't like it, but I think it's an unavoidable consequence of computing's evolution into ubiquity.


That's because at some point the industry lost the plot.

The fundamentals are fundamental. They don't change very fast.

Applications change all the time. So do frameworks. But there's very little genuinely new in most frameworks, or in most languages for that matter. They're mostly repackagings of the same few ideas.

Which is why it's a lot easier to pick up applications and do a good job with them if you know the fundamentals than if you're hacking away without any contextual or historical understanding.

Meanwhile someone is going to have to do the next generation of pure research, and it's a lot harder to do something creative and interesting in CS if all you've ever known is js, Python, and Ruby.

The reality is that software quality is decreasing. Never mind maintainability or even documentation - applications are becoming increasingly buggy and unreliable.

It's common in the UK now for bank systems to crash. Ten years ago it was incredibly rare, and twenty years ago it was practically unthinkable.

Software is too important to be left to improvisation and busking. So "just learn to make applications from other applications" is not a welcome move.


I wonder how much of this is due to two ideas: 1) non-technical users need to be able to use our software (how much code today is about preventing users from doing something they should know better than to do?); and 2) we're less and less able to say "no, that goes way beyond the project's scope" to our bosses, who will quickly reply "yeah, well, Google's ______ does it, so why can't we?"


This is due to the idea that modern software developers do not care what they do, but care a lot about how they do it. They pay no attention to the business problem at hand (in my experience they get very upset when I try to draw their attention to the business requirements) but spend all their time discussing which *-pad framework is the best candidate.


A few years ago, when MIT switched from SICP/Scheme to Python, Sussman had this to say:

"I asked him whether he thought that the shift in the nature of a typical programmer’s world minimizes the relevancy of the themes and principles embodied in scheme. His response was an emphatic ‘no’; in the general case, those core ideas and principles that scheme and SICP have helped to spread for so many years are just as important as they ever were"

From: https://cemerick.com/2009/03/24/why-mit-now-uses-python-inst...

If anything, I would think Sussman is more practical, and understands what the world needs/expects (now).

Literally any programmer who hasn't read SICP before will benefit from it. I think the principles still apply.


In the context of the class (The Structure and Interpretation of Computer Programs), I feel like this is not such an outlandish view. It sounds to me like the field of software engineering has simply evolved since the 80s.

Don't get me wrong, if you are going for post-graduate studies such a course will always be relevant, but it sounds like he is talking within the context of undergraduates. And in the context of undergraduates, I too would be circumspect about how useful it would be for preparing you for your first job as a Software Engineer.

Their choice to go toward a Python-based course at the undergraduate level would also seem to reaffirm this view from afar...


> It sounds to me like the field of software engineering has simply evolved since the 80s.

What is ridiculous in the face of this "programming by experimentation" fantasy is that programming has evolved since the 1980s... to be even more about composable abstractions with provable semantics. Hindley-Milner-Damas types and monads are now everywhere.


> Hindley-Milner-Damas types and monads are now everywhere.

Haven't run into those. Perhaps I know them by a different name?


Most likely you have. Or simply something very heavily inspired by either.


Can you expand on the last sentence? I'm not sure I understand what you were trying to express. (Not trolling, genuinely curious.)


The application of mathematical type theory (https://en.wikipedia.org/wiki/System_F) to popular programming languages goes back to 1998 when Philip Wadler designed generics for Java.

Local type inference is now used in Visual Basic, Scala, Rust, probably a lot of other new languages I am missing. Gradual types are coming to Clojure and probably Python and Lua.

Erik Meijer did a lot of work on bringing monads and FRP as patterns to .NET programmers. Java 8 has monads (Optional and Stream interfaces). Bartosz Milewski has been getting a lot of attention in C++ circles (see his blog: https://bartoszmilewski.com/)


Its application to unpopular languages goes back farther, of course.


Great, thanks for the clarification.


MIT grad here: Nope. I've never had a workday that didn't involve reading through the docs or source of a library I didn't write. I am genuinely grateful to have taken 6.01, the course that replaced 6.001.

I am also grateful to have taken the condensed version of 6.001. You do need the ability to understand those abstractions in order to be an informed shopper though.


Your assumption about who writes libraries and who uses whose code is broken.


I think he meant to say "should".


The point still stands if you add a couple of shoulds to my response as well.


>Surely graduates of MIT, of all places, would be the ones building the libraries that ordinary programmers are stitching together and would still need the ability to build complex systems from fundamental CS abstractions?

"MIT" doesn't necessarily mean "super-elite programmer". I work in an office that's half MIT grads, and the non-MIT half are pretty much equivalently good (though with much worse connections around Cambridge). That's not to say MIT sucks or anything, but more to say that with luck, a really solid CS or EECS degree gets the student up to being able to build important components from scratch at all, which isn't necessarily the level needed to build those components from scratch for public release or for profit. That latter goal requires a good education followed by professional training and experience.


>libraries that ordinary programmers are stitching together

I watched the SICP videos, and I remember Abelson specifically endorsing just that.


Calling it his "reasoning" with all the connotations that come with that word goes way too far.

This was his polite implicit criticism of the new core, which among other things also teaches much less in the way of EE fundamentals, a topic he's cared about very much since at least the late '70s (i.e. 6.002 is no longer a required course for all EECS majors).

The bottom line is that in the post-dot-com-crash panic, which saw MIT EECS enrollment drop by more than half after it had been steady at 40% of the undergraduates for decades, one result was "MIT" deciding that what an MIT EECS degree meant would be substantially different.


It may have been that it just took 7 years to actually get a new course in place, but it wasn't until fall 2007 that MIT officially got rid of 6.001 as a required course, well after the dot-com crash.

There were a TON of changes that happened with the MIT EECS curriculum at that time, so perhaps it was a holistic response to the dot-com crash that went beyond just 6.001.


I think "panicked spasm" is more accurate than "holistic response", at least in connotation, but as I recall the only real changes were in what was required, terminating the use of Scheme in the entire required curriculum with extreme prejudice, and adding, what, 6.005? Where they claimed to teach most of what was in 6.001/SICP, but using Java, a language which is "not even wrong" for that purpose.

Yeah, MIT has become a Javaschool, plus Python....

And just when the failure of Dennard scaling was making functional programming a lot more important.


No, there were many changes to the curriculum, and for a variety of reasons.

They created 6.01, which serves as an "intro to EECS", so it involves both EE and CS, as opposed to the old intro, 6.001, which was really an introduction to CS. Some argue that this course is easier (I never took it, so I can't say), which could definitely be read as making EECS a little more gentle and open in response to the dot-com crash.

They also broke up two very difficult courses, 6.046 (introduction to algorithms) and 6.170 (lab for computer science), and put at least some of their coursework into new courses. Again, this could be seen as making the entire major a little gentler.

They also changed requirements. In the past there had been a lot of CS-focused students who were uninterested in doing any EE and were choosing to major in 18.C (math with computer science) to avoid an EE course load. The department rightly thought it was a little crazy that people were leaving the CS department in order to focus more on CS, so they lightened the required EE courses for 6.3 (the major focused on CS) and vice versa.

This is all my recollection from the 07-09 era and from talking to some students since. There could be some errors in details.


> And just when the failure of Dennard scaling was making functional programming a lot more important.

Yep, there's the real irony. Having functional programming skills and experience is a real asset in today's job market, I've found.


I took 6.002x when MITx was first launched. I don't know how similar it is to 6.002, but if it's still available, I highly recommend it.


Eh. Many students will have the rest of their lives to perfect the art of poking at a library. Getting the chance to play with the more sublime CS stuff is much harder outside of university.


Students also have the rest of their lives to learn the sublime CS stuff. Getting as many students motivated to learn CS, with Python, is a perfectly reasonable goal for a college.


This is how CMU does it. First course: Python. Most people who already have a CS background (usually AP CS) and are ready to dive in skip it. People who aren't sure take it and typically have a phenomenal experience. Final project is "make something using a Python graphics library and spend at least 20 hours on it". Then they have a little demo day where they show off the best of the projects. Second course: data structures and algorithms and C.


That's a perfectly reasonable goal for a high school or someone performing self study. The whole point of a university is to impart knowledge of subtle things.


The majority of CS students (including MIT) never studied CS before coming to college. Unless you propose MIT/CMU/Stanford/Berkeley/etc. only admit people to their intro CS classes who have studied CS before (chicken and egg problem, anyone?).


Well, MIT's intro calculus classes are overwhelmingly for people who have studied calculus before. Making it an actual requirement wouldn't change much.


1) MIT definitely offers calculus classes for people who haven't taken calculus, although obviously these people are in the minority.

2) Most high schools don't offer CS. 99% of high schools offer some form of calculus.


I'm just pointing out that there's no "chicken-and-egg problem" with MIT requiring CS exposure for its intro CS classes.


Errr, when did this happen?

As of 1979, the 18.01 I took matched what I understand is the AP Calculus BC sequence, and while having previous exposure to calculus certainly helped, it wasn't assumed.


How many freshmen come to MIT having never studied calculus?


Exactly. By all means force your students to use different libraries and program in modern paradigms that may be centred around different libraries, but I wouldn't do a CS or SE degree where that was the focus. I didn't go to uni to learn any particular programming languages; that was just a side effect of everything else I learnt about computers.


The only thing that makes it harder is lack of time, but that problem exists in university too. On your own you can work on your own schedule during limited free time; at university you can devote a lot of time to it, but it's on someone else's timeline.


I really wish SICP had been a 2nd year course (with a requisite increase in difficulty) instead of my very first course in the EECS department. Not having had a ton of background in programming beforehand, I feel that a lot of what SICP has to offer was lost on me to some degree, due to my not appreciating it at the time.

I suppose the same could probably be said for any intro course or just college in general...


I took 6.001 in the late nineties and hated it/did very poorly. I found it waaaaaay too difficult as a freshman. Oddly, doing a bunch of C++ and Java in high school made it even more difficult. Biggest problems:

1. The programming environment (an Emacs clone in Scheme) had an extraordinarily steep learning curve.

2. S-expressions were hugely difficult to visually parse and edit vs. languages with more familiar syntax.

3. Very little exposure to practical projects in the class - it felt like constantly working on toy projects or impractical abstract constructions.

I got a lot more out of 6.170 (software engineering lab) and the other computer science classes.

I have a much greater appreciation for the class now after 15+ years and recently worked through SICP again. It's much easier with more programming experience, not to mention Emacs paredit mode.

I always thought 6.001 should have been a 2nd or 3rd year course. I would have gotten a lot more out of it.


I first read SICP 10 years into my programming career. I felt like that was a good point to read the book. Any earlier and much of it would have been lost on me. But maybe this is the kind of book that you read and re-read and always find new things.


My university used to have an Engineering Fundamentals class that all Engineering students were required to take their first year. Among several other things, it taught how to use Excel and more importantly, programming using Fortran. It wasn't in-depth, but you learned what programming was in a pretty easy environment.


We had a similar freshman pan-engineering course at Texas A&M that taught both excel and fortran among other things. Probably the most practically useful course in the entire program.


My university offered "intro to unix". It wasn't required for anyone, but it did fill a general education requirement.

It was also one of the most practical, useful classes they had. A CS student who didn't take it (probably most of them) would still pick up most of the material by osmosis, but I think that class was a great idea.


When 6.001 was introduced, a surprisingly large number, perhaps a majority, of MIT students arrived without having learned how to program. SICP was their first exposure to programming.

That's inconceivable to me now.


I think Berkeley's EECS curriculum starts with functional programming instead of imperative. Need someone to confirm.


At my university it was a 3rd or 4th year course (depending on how quickly you were able to knock out the prereqs), and I think that was a great approach. At that point you've used programming languages enough that creating your own interpreter is a fascinating experience.


I was in the opposite situation. I'd been programming (mostly self-taught) for so long in imperative languages that I really struggled with 6.001, ultimately dropping the course. (I was course 2 taking .001 for "fun".)


In Math, you take Calculus. It's four semesters. Then you take Real Analysis which is the exact same material all over again, but you actually learn it and prove the theorems instead of just memorizing a few formulas and blindly applying them.

It's the same in Economics where you take Micro and Macro and then junior or senior year you do the same all over again but this time you take it seriously with logical reasoning instead of just memorizing a couple stock phrase explanations.

SICP wants to be both. It's a great book so maybe it can do both. But it's hard for a class to do both.


Today I learned why I hated calculus. Real analysis sounds like fun!


This is a really good way to learn it, best class I ever took: http://www.jiblm.org/downloads/dlitem.php?id=66&category=jib...


All that said, it is still worthwhile to work through all of SICP, if you want a deeper understanding of how certain tools work. Writing your own interpreter is a very rewarding experience.


Having to write my own shell in C for my operating systems class absolutely blew my mind. And then after that my professor made us implement a quine in the shell we just wrote. Yes, he was insane.


Which class/university was it? Any chance the materials are online?

I'm actually writing my own shell now; I'd be interested to compare :)

The fork/exec/pipe/dup calls are definitely an unusual and powerful paradigm. I think too many programmers don't know this because we were brainwashed to make "portable" programs, and that's the least portable part of Unix.
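For anyone who hasn't poked at these calls: glossing over all error handling, the shape of `ls | wc -l` is roughly the following. This is a Python sketch using the os wrappers around the same syscalls, assuming a Unix system; the `pipeline` helper is just my name for illustration.

    import os, sys

    def pipeline(left, right):
        # Run `left | right`, e.g. pipeline(["ls"], ["wc", "-l"]).
        r, w = os.pipe()                      # one pipe: a read end and a write end
        if os.fork() == 0:                    # child 1: the producer
            os.dup2(w, sys.stdout.fileno())   # its stdout now feeds the pipe
            os.close(r); os.close(w)
            os.execvp(left[0], left)          # replace this child with `left`
        if os.fork() == 0:                    # child 2: the consumer
            os.dup2(r, sys.stdin.fileno())    # its stdin now reads from the pipe
            os.close(r); os.close(w)
            os.execvp(right[0], right)
        os.close(r); os.close(w)              # parent keeps no ends open
        os.wait(); os.wait()                  # reap both children

    pipeline(["ls"], ["wc", "-l"])

The nice part is that the shell never touches the data; it just wires file descriptors together and gets out of the way.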


Operating Systems at Bard College, a decidedly not top tier CS school but I still learned a fair bit :)

The materials, as far as I know, are not available online. But we used Tanenbaum's book, which you can read for free here - http://stst.elia.pub.ro/news/SO/Modern%20Operating%20System%...

Enjoy!


CS61 at Harvard covers that stuff, building your own shell (and memory management, etc) in C and it's an introductory class. There is a lot of hand holding of course but I really loved that class.


Thanks! Found it: http://cs61.seas.harvard.edu/wiki/2015/Shell

This looks pretty comprehensive for an undergrad project actually... they have a grammar and everything, with pipes and redirects, which are definitely the most Unix-y concepts. And they talk about interrupts, although no user-defined signal handlers or anything.

FWIW I ported the POSIX shell grammar to ANTLR, though I'm writing it in C++ (and prototyping the architecture in Python). ANTLR is actually incapable of expressing the lexer AFAICT...

http://pubs.opengroup.org/onlinepubs/9699919799/utilities/V3...


Harvard? Never heard of it. Good school?


Does not have an ABET accredited CS major, for what that's worth.


Genuinely curious: what are some of the interesting things in writing your own shell? Was it the programming language interpretation aspect, or having to learn the kernel's API for forking tasks and redirecting, etc.?


Well, it was my first low-level experience, so a lot of it was new to me - I hadn't even used C before, so pointers alone were a revelation of sorts. I was blown away by the idea of interpretation, and then once we delved into forking and processes (and zombie processes), well, I truly went down the rabbit hole. It was really cool implementing your own system hooks with Minix - being able to press F12 to get a formatted table of all running system calls was supremely satisfying. I fully appreciated how complicated and yet in a way beautiful OSes could be, and it forced me to learn how to sweat the small stuff and build good software from the ground up.


Not OP but also had to write my own shell, as well as the kernel APIs for forking and whatnot. The kernel APIs are definitely interesting, but I think the biggest dopamine rush was once the way to parse and structure commands "clicked" and suddenly shells made a whole lot of sense.


Also, being able to look at the source code for a kernel and say "ah ha! so that's where the process table is created" is a lot of fun... kind of like uncovering ancient artifacts and learning to read the glyphs.


This is going to sound crazy, but in eighth grade I took a CS class where they had us create an interpreter in Lisp. It was amazing, and I haven't done anything quite like it since (though Dan Boneh's crypto class on Coursera was close).


That sounds fun! Lisp is great for that because of its code-as-data principle (I use clojure in my day job and it makes dealing with structured data like JSON an absolute breeze). Did you go to an advanced school or something?


Well, I started coding in 1st grade (Logo, then BASIC), but this was a summer program hosted by Johns Hopkins. It was highly theoretical - we learned about DFAs, NFAs, epsilon-NFAs, and Turing machines.


Wow, I had completely forgotten about writing that interpreter for the PURPLE language [1] in that class. Thanks for the memories!

[1] http://www.mattababy.org/~belmonte/Teaching/CS1/PURPLE_proje...


Were you in that class with me? Drop a line. Would love to hear from someone else in that class!


Did you take the class in Los Angeles in 1997?


Awesome! I started in the 5th grade with Perl (lol) and went to a bunch of summer programs but they were more hands-on and programming based. Very cool. I learned about state machines when I was 16 and would draw my own really complicated ones for fun haha.


Yes, he was insane.

More like a pedantic jerk.

It's good to know about the existence of quines. But people are (presumably) paying good money for (and more important: investing their perfectly valuable time in) their education, and there's only so much time in a systems programming class, and so much more fundamental stuff to cover (or cover more robustly).

The people who really need to figure out how to write quines will no doubt find time to do so, in the dead of night, no matter what you may try to do to stop them. (And try to make them 3 bytes shorter than the shortest known version in that particular language). The rest -- they just need to know that they're out there.


Chill out, Franz. It was like a one day lab assignment and to be fair, I had no idea what quines were so I did learn something interesting and it led to an incredible discussion about Ken Thompson's "Trusting Trust".
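For anyone else who hadn't met them: a quine is a program that prints its own source, without cheating by reading its file. The standard trick is tiny; in Python, for example, it looks something like this (shell versions exist too, they just need more quoting gymnastics):

    s = 's = %r\nprint(s %% s)'
    print(s % s)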


I think going through SICP and most of all, writing my own interpreter, made me an all around better programmer. I noticed my systems thinking clarified considerably. Not in an aloof way but by providing actual actionable insights on how to implement and structure things.


Wow.

I took 6.001 as an undergrad at MIT. It changed my view of software forever.

Years later, I now spend most of my time training programmers at hi-tech companies in Python. The ideas that I got from 6.001 are pervasive in my training, and are fundamental (I think) for programmers to understand if they want to do their jobs well.

Given that I teach tons of Python, you might think that I'm happy that they switched to Python as the language of instruction. That's not the case; I think that Lisp offers more options for thinking about programming in different ways.

One of the great things about 6.001 was that it didn't try to teach you how to program for a job. It taught you how to think like a programmer, so that you could easily learn any language or technology they threw at you.

Oh, well.


Poke all you like. I'll just be over here writing software that isn't a broken pile of hacks. SICP is one of the most important books I haven't read. It's actually on my shelf right now. I still haven't finished it.

The fact is, as a self-taught programmer, programming is intimidating. I can reason about code, and write it, and understand it if I squint at it long enough, but I still choke on any production code I read, and have trouble solving complex problems. SICP is a book all about this. It's about building better abstractions, and despite my not yet having finished it, it is probably the most helpful book on programming I've read. Many books can teach you how to program. SICP teaches you how to program /well/, and why.

Along with The Lambda Papers, it taught me just how powerful Scheme could be. And I maintain that Lambda the Ultimate Declarative contains the only good explanation of what a continuation DOES.
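If it helps anyone, the rough intuition, in Python rather than Scheme and in continuation-passing style rather than first-class call/cc (so only a flavour of the real thing): a continuation is "the rest of the computation", made explicit as a function you hand your result to instead of returning.

    # Direct style: the caller implicitly decides what happens next.
    def add(a, b):
        return a + b

    # Continuation-passing style: k *is* what happens next.
    def add_cps(a, b, k):
        k(a + b)

    # "(1 + 2) + 3, then print it" -- the nesting spells out the control flow.
    add_cps(1, 2, lambda r: add_cps(r, 3, print))   # prints 6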

It was the book that made me want to go to MIT. I don't know if I'll ever go there (I'm still in high school), but if the people there are advocating "programming by poking," it probably wouldn't be worth my time.

This book changed my life, and I haven't even finished it yet. It should be Required Reading™, and the thought of doing it in Java makes me sick. And not just because I despise Java. Java is probably the worst language for this sort of book. SICP is an exploration of programming paradigms and concepts. Java is married to one set of paradigms and concepts, and won't budge an inch against even a hydrogen bomb.

Besides, imagine a metacircular evaluator in Java. Yuck.
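For what it's worth, the heart of that evaluator is just a small eval/apply loop over expressions represented as data. Here's a drastically stripped-down sketch in Python (only numbers, variables, lambda, and application; nowhere near the real chapter 4 evaluator, just to show the shape):

    def evaluate(expr, env):
        # Numbers are self-evaluating; strings are treated as symbols and looked up.
        if isinstance(expr, (int, float)):
            return expr
        if isinstance(expr, str):
            return env[expr]
        op, *args = expr
        if op == 'lambda':                    # ('lambda', ['x'], body)
            params, body = args
            return lambda *vals: evaluate(body, {**env, **dict(zip(params, vals))})
        # Anything else is an application: evaluate operator and operands, then apply.
        fn = evaluate(op, env)
        return fn(*[evaluate(a, env) for a in args])

    env = {'+': lambda a, b: a + b}
    prog = (('lambda', ['x'], ('+', 'x', 1)), 41)
    print(evaluate(prog, env))   # 42

The equivalent in Java is certainly possible, it just takes a lot more ceremony.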


Some day we will recognize that some areas of "programming" are very different and require different skill sets, and eventually different titles.

We tend to call everything "software engineering" so that everybody can feel proud of such a title ("I'm an engineer"), but engineering is certainly not about figuring out how to vertically center divs with CSS (and it's also not about proving algebra theorems either -- even if it can be essential when it comes to specific problems that require it).

I can't imagine Linux and PostgreSQL being built without "science"; they use a lot of it, and I'm pretty sure the authors have all read SICP and those theoretical books. Poking at things proved to be efficient for building things quickly, but it's just not how one builds critical systems/software that are robust, efficient and maintainable.


Engineering (no matter whether mechanical, electrical, software...) is the process of designing an artifact to be constructed out of available materials, which meets a set of requirements while minimizing cost.

In mechanical engineering you design your artifact using off-the-shelf bearings, motors, pumps, etc.

In electrical engineering you design your artifact using off-the-shelf cables, contactors, relays, VSDs etc.

In electronic engineering you design your artifact using off-the-shelf ICs, resistors, capacitors, resonators etc.

In IC engineering you design your artifact using off-the-shelf silicon wafers, etching chemicals, core/logic designs etc.

It's turtles all the way down, and software is no different.


Most engineers are working under some quantifiable or standard set of requirements, rather than ad hoc "make it work good and look pretty" requirements. Most engineering disciplines also have processes to ensure that its adherents take the proper precautions to avoid poor and unsafe designs, delivered in standardized sets of guidelines and recommendations.

And many programmers aren't engineers, they're just interested tinkerers; people who play around in their free time enough to know how to make something work. Not unlike if you went to the store, bought some wires and batteries and tools, and then played with them until you got hired as an electrician.

Sharing culture is instinctive. People will do it. You might as well try to tell people they can't have sex without your permission unless they pay first. Oh wait, that's the porn industry. Everyone pays for porn, right?


> I can't imagine Linux and PostgreSQL being built without "science", they use a lot of it, and I'm pretty sure the authors all have read SICP and those theoretical books.

Unless you restrict "authors" to the people that worked on the original Postgres95 (and maybe not even then), I'm certain that that's not generally the case (being a postgres committer and not having read SICP).


Software Engineering is more about methodically solving software problems than it is about which problems are being solved. A web developer who writes rigorous formal tests for a new page is engineering just as hard as an embedded developer writing rigorous acceptance tests for a board-support package. The engineering comes from the rigor and the fact that there is a controlled process for how software features get implemented.


I agree with that. The engineering process can be applied on every type of problem.

Yet, for that specific web-dev problem, I haven't seen any way of formally testing the rendering of web pages that would make it consistent on every browser. The testing process (almost) always leaves that up to the developers themselves, and refreshing pages is the norm.


There's a quote I love from the book "The Idea Factory: Learning to Think at MIT":

Freshman double E's take six double oh one (a.k.a. six double no fun) to learn to program in LISP... This is also where they begin to leave the rest of the world behind them in their ability to solve problems and make things work.

He goes on to describe how the course instills the virtues and limits of abstraction.


Would you recommend it for people thinking about institutional culture/school design?


The MIT book? No, it's just a fun book about MIT culture, the analysis doesn't run very deep IIRC.


I love that book.


I read about half of SICP and thought it was OK. Not great, but okay. The programmers I've met fall largely into two groups, those who like systems level programming, knowing how the OS works, how it interfaces with the hardware, what the memory layout is like, etc. and those who like abstraction and the things that SICP values. I'm definitely in the former group (but I certainly appreciate people who prefer the SICP approach).

There's room for both of course, but for people like me SICP was really a slog. Some of the exercises were hard sure, but more than that the material just wasn't that appealing to me. I don't have any comment on whether MIT's decision was the correct one, but liking SICP or working through SICP is by no means a prerequisite of being a good programmer.


Hmm, I didn't get your dichotomy... By "half" did you mean the first half? IMHO the main dish of SICP is chapters 4 and 5 (chapters 1-3 are just laying foundations), and the content of the latter chapters is strongly connected to system-level things like compilers, run-times, processor design, etc. It's not as immediately applicable as, say, the Dragon Book, but SICP gives you perspective, in the sense that it shows the broader design space within which you can locate current OS / hardware / language technology.


I meant roughly half by page count. I stopped after that point because I was bored.


Yeah I agree that the first 3 chapters are somewhat archaic, and especially if the reader already has programming experience it looks like it is reinventing some mundane features.


The problem is that most programmers today aren't part of either of those groups but part of a third, totally separate group: programmers who pick and prod and then glue lines of code together.


SICP is programming for the theoretically inclined. It seems analogous to calculus in math versus calculus for physics: you can study it more formally with all the proper proofs and derivations (and bizarre cases), or pick up just the applied bits (such as chain rule and dot notation) that you need for doing AP physics. This analogy suggests the existence of two approaches, with different implications and consequences for programming culture.


I think that is true to some extent, but Knuth's approach (which I'm more drawn to) is also theoretically inclined, albeit different from SICP's. Knuth (and myself) see programming as fundamentally being about computers, and Knuth starts with what a computer can do and builds from there. Abelson and Sussman see programming as more about computation, so they start with a model of computation (based on Scheme, lambda calculus, etc.). These two approaches are quite different and I don't think you can reconcile them easily. Not that either one is all that much better than the other, though Knuth's is certainly more efficient. It seems to me that a large part of which you favor comes down to how you're wired.


Are programs meant to tell the computer what to do? (Knuth)

Or are computers meant to execute our programs? (SICP)

Personally I lean towards the second view, for a simple reason: we design programs much more often than we design computers. Computer design doesn't take much of humanity's time, compared to programming them. So I'd rather have the hardware (and compiler suite) bend over backwards to execute our pretty programs efficiently, than having our programs bend over backwards to exploit our hardware efficiently.


>The programmers I've met fall largely into two groups, those who like systems level programming, knowing how the OS works, how it interfaces with the hardware, what the memory layout is like, etc. and those who like abstraction and the things that SICP values. I'm definitely in the former group (but I certainly appreciate people who prefer the SICP approach).

Many of us are in both groups, which is how stuff like Rust came about.


True, but the number of people who are good at both is very small :)


Really? Well, more money for us then.


This is the way of real life. We breathe, eat, excrete all without understanding or knowing. We poke at our bodies with junk food, exercise, video games, and other stimuli. Sometimes we achieve the desired effect, most times not. We still do not really know how we work. Poking at complexity is ALWAYS how science was done.

And the high priests of computer science strove to build a corpus that was intimidating, complex and beautiful. They tried to be gods and birth a new form of life. That computer scientists applied Socratic and Aristotelian principles to their framework is hubris. Human individuals really cannot be gods to machines that are useful. We have constructed masses of spaghetti code into libraries that no one has the time to read or understand and it is our own fault. These are tools and should have been kept simple and open and easy. Perhaps AI will evolve to save its parents.

I have a dream of the day when I can talk to Alexa, and write code for her in English just by speaking. Her built in projector would show me what I said. My words would be translated into correct Python, c++, or JavaScript. She would highlight errors that would lead to build failures. Point out race conditions. And tell jokes along the way.


> Sussman admitted that the SICP curriculum was “more coherent” than what they have now and that they still don’t know what the right curriculum should be.

The article should have led with this because I believe it would have framed the rest of it more accurately.


Did SICP at Berkeley. Really awesome course. Thanks MIT for not teaching it anymore. Makes me a better programmer than your new grads now. Hah!


It doesn't, is the point that the folks who came up with SICP are trying to make.


Well the real issue is defining what's better. And the worry some of us have is this new trend of grappling with increased complexity seems short-sighted; we don't like the direction it's going. As much as Sussman shows understanding of the current situation, what he's saying is also a criticism--doing this sort of ad hoc "science by poking but not really science" is the heart of the issue.

And what's interesting is, if you look at actual scientific research around programming, say dealing with concurrency and advanced tools like model checking, and so on: all of that is very theoretical stuff that assumes you know SICP or have equivalent foundations. So it's not really an argument that theory is dying; in the face of this new level of complexity in practice, perhaps we could benefit from theoretical research now more than ever.


It's hard to argue with productive results, however. If this "poking" results in valuable engineering more than SICP does, then "poking" needs to at least be seriously looked at and understood, if not outright taught to the next generation of software engineers.

The real problem, I think we can all agree, is the lack of widespread specialization of degree programs. Computer Science is still the blanket degree everyone gets, when in reality, some kind of trade program is likely sufficient to train many of today's "developers" (the "pokers").


My point is that it DOES. My single-statement assertion was making that claim implicitly...


Why do you think you know more than they do about this?


I am simply expressing my preference and opinion.


Yes, and I'm asking what reasons you have for holding that opinion.


Why don't you take a look at the state of npm


"Programming environments consist of gigantic libraries with enormous functionality. Some of which works. This leads to what I call ritual-taboo programming. People follow the rituals and avoid things they're told to avoid. This is reflected in programming books. Programming books were once reference manuals, which listed each function and what it did. Now, they're enormous compendiums of examples.

It's sad, but MIT made the right decision. MIT used to produce people who designed elegant, perfect little jewels of code. There just aren't many jobs for those people. Algorithm design just isn't that big a deal any more. Most of the useful ones have been written and are in libraries. Who reads Knuth any more?


The article is based on this video segment, from GJS: https://vimeo.com/151465912#t=59m36s


If you haven't watched the SICP video lectures [1], they are really great.

[1]: https://www.youtube.com/watch?v=2Op3QLzMgSY


Looks like another face of a centuries-old debate between theoretical and experimental science.

The theoretical scientists play with abstractions they fully understand. Experimental scientists poke at things they don’t fully understand. They tend not to do well with each other.

When computers were in their infancy, they were viewed primarily as scientific tools to crunch numbers and play with abstractions. That's the theory-oriented view that I think is shared by the SICP authors.

Then computers became complex, fast, and capable of doing much more than crunching numbers and evaluating those S-expressions. While you can view a modern web browser, or MS Word, or a videogame, as a program for a Turing-complete machine, this does not mean you should. There's too little value in that: sure, there are instructions and state, but now what? A more experiment-oriented mindset usually works better with those kinds of problems.


> “More like science.”

Ummm, I hardly see anyone - and by anyone I mean even people with > 20 years of experience - doing empirical research while solving problems. Maybe the _like_ needs more emphasis? We don't really teach science or the design of experiments. This is arguably the most important intellectual tool we have ever developed, and it should be the core of all education.

The programming by poking that I see folks doing isn't science. It is stabbing in the dark until something appears to work. There is more to computational thinking than throwing some APIs together in a blender and making a cool demo.

As a longtime Python programmer, I wish more people would start with Scheme. Most Python is written like some dynamically typed Java; people are using maybe 30% of the power of the language.
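To make that concrete, here is my own toy contrast (not anything from a course): the same idea written the "dynamically typed Java" way versus leaning on the closures and higher-order functions Python shares with Scheme.

    # Java-ish Python: a class holding state where a closure would do.
    class Adder:
        def __init__(self, n):
            self.n = n
        def add(self, x):
            return self.n + x

    # Scheme-flavoured Python: a function that returns a function.
    def make_adder(n):
        return lambda x: n + x

    add5 = make_adder(5)
    print(Adder(5).add(10))                     # 15
    print(add5(10))                             # 15
    print(list(map(make_adder(1), [1, 2, 3])))  # [2, 3, 4]

Neither is wrong, but people who met these ideas in Scheme first tend to reach for the second version without thinking about it.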

I really wish I could have experienced those SICP lectures in 86.


> The “analysis-by-synthesis” view of SICP — where you build a larger system out of smaller, simple parts — became irrelevant. Nowadays, we do programming by poking.

I imagine many of us do programming by poking. But the skill set referred to here is essential if you're going to build (create/construct) a complex system or architecture.

To say the skill set is irrelevant is to say we as an industry are doing things more or less right. But when I look around, I see that we're making a huge mess of things everywhere or are at least sorely inefficient when it comes to building and growing systems. That is to say, I believe (I know) that we could be doing much much better than we are now -- and I believe the skill set being dismissed here is a large key to doing better. So I'm sad to see it dismissed like this.


That's a shame because the theoretical grounding that comes from SICP is enough of a clue to tell you why black box library X is crashing, runs too slow, lacks feature Y, etc. and maybe even enough of a clue to help you modify/hack it to get feature Z as a workaround.


This isn't entirely accurate- SICP is still taught as a 4-week intensive during MIT's IAP (January) term. It doesn't fulfill any requirements, but the student evaluations are excellent (averaged a 6.7/7 over the last two years).

http://web.mit.edu/alexmv/6.037/

http://student.mit.edu/catalog/search.cgi?search=6.037


I think it's also very natural for this time that "[the past solution] was “more coherent” than what they have now and that they still don't know what the right [current solution] should be." If you ask around, that's how most people feel about their jobs and their families; some more advanced people even feel that way about their gender, political position, life goals, philosophy.


I was wondering what happened to those HP employees from the SICP lecture videos. Where are they now, what careers have they had, etc...


I took the course in Fort Collins in the 1980s. (I think it may have been before 1986? I left HP in 1986 to stay home and raise my daughter, and went back to software development at other companies in the 1990s.) I have taught college-level intro programming part-time for 25 years, and high-school level computing (including Java and Python programming) for 13 years.

I was greatly impressed then with 6.001 and still am today. I have worked through most of TECS [0] within the last year, and I want to go back through SICP next year (after I retire :-)

[0] The Elements of Computer Systems http://nand2tetris.org


Personally, I think CS students need to understand both programming by abstraction and programming by science (poking). The abstraction stuff is important for designing maintainable code. The science stuff is important for maintaining code.


If you're learning Clojure there's a site/book that incorporates much of the original SICP text: http://www.sicpdistilled.com/


I'm not a techie, so I'd like to ask if my interpretation is OK. Is the switch from Scheme to Python part of a tidal change in conceptual thinking? Previously, giants & pioneers approached AI systematically, with consistent logic and clean structure (like the language of maths).

However, for now, the narrative shifts to "poking around, hoping to poke the right thing"?

I use Racket & its way of thinking for research. I think that structured thinking is still intellectually important, so it'll still survive, as there'll still be people sticking with it.


As someone who took the replacement 6.01 Intro to EECS, I would have much rather taken 6.001 SICP.

I'm sorry to say, but I feel very strongly about 6.01 being the least useful course I took at MIT.


As a college student who followed along with Berkeley's SICP lectures online in 2015 (taught by John DeNero), I am kinda confused by the Scheme vs. Python debate. Did I miss out on something by having used Python for the entire first half of the course, vs., say, CS61A taught completely in Scheme (as when it was taught by Brian Harvey)?


Yes, I believe so. I think Brian Harvey puts it aptly, "The big ideas in the book — the ones that alumni in the real world tell us they’re using in their work — express themselves best in Scheme."

[1] https://www.cs.berkeley.edu/~bh/proglang.html


Ahh. Do you think it will really add value to re-take the course under Scheme exclusively?




Hal Abelson on the end of SICP at MIT (2011)

http://codequarterly.com/2011/hal-abelson/


"in the 80s and 90s, engineers built complex systems by combining simple and well-understood parts.... Today, this is no longer the case."

yep:(


Great concept! Soon monkeys will do all the poking. Humans can just watch.


I wish the original course were offered on an alternative platform like edX.


The course material from 2005 (original 6.001) is available on OCW at MIT, including videos of the lectures.

http://ocw.mit.edu/courses/electrical-engineering-and-comput...

That's an older and less polished online courseware system than edX, but it's perfectly workable. There are also VERY old videos of Sussman himself giving the .001 lectures. Those were easy to find, but I'd recommend the OCW site over the original lectures.


Yes yes.

Old-school, understanding-based components, composable via interfaces, gave us things like R4RS, Plan9, git, LuaJIT, nginx, Erlang, you name it.

That packing-or-poking crappy mindset produced so-called enterprise software, with all its OLE, JDBC, Java EE, CORBA, NodeJS, and all the other meaningless bloatware.

BTW, this is a human-universal law. Lack of understanding of fundamental principles grounded in reality will inevitably produce piles of abstract nonsense, be it philosophy (of Hegel), [Java] programming, theoretical physics or even math (500-page Coq "proofs" of abstract nonsense).

There is the law of cancer (in 4chan terms) - any branch of human endeavor will be ruined with enough cosplaying idiots.

BTW2: teaching of principles is too difficult because, like a good philosophy, which is a study of principles, it requires an extraordinarily trained and lucid mind habitually trying to separate underlying principles from human-made "higher" or "pure" nonsense.

This habitual mental hygiene, which guards the principles from contamination by popular memes and waves of mass hysteria, is a very rare trait, impossible to sell or monetize. To teach it requires a similar mindset in students. Packers would be unable to grasp what it is I am talking about.

The Motorcycle Maintenance book, and that part of Atlas about modern philosophers, would be a much better illustration.


So painfully true. This is one of the very first things I try to determine when I'm interviewing someone: do they have the capacity to actually figure out how things work, or are they doing something rote that they picked up along the way and will just blindly change code until "it works" when they hit a serious obstacle?

And I've also worked with plenty of very senior engineers who will blow smoke up your ass when you're really down in the weeds and you want to do "root cause" analysis.


I still swear by SICP and TAOCP for anyone who wants to learn CS seriously.


For working as a professional programmer, SICP, sure. But TAOCP? Why?


I wrote that for anyone interested in learning CS seriously, I swear by those two books.

I would recommend an entirely different set of books for someone who wants to work as a professional programmer.

That being said, TAOCP really gave me a clear understanding of algorithmic analysis. It's a hard book, but after you finish it, you won't look at programming the same way again. Especially when it comes to design decisions for systems where efficiency is a must.


What books would you recommend for someone who wants to be a professional programmer?


So, did you read the back flap and send the email to Bill Gates as he suggested ;).. jk.. jk

How do you go about working with such a hefty tome? I only use it as a (light) reference. Any tips on how you went about it?


I got up early every day and read a chapter before work, and tried many of the exercises. Now I use the whole set as an almost daily reference where I work. The prelim math you can get from MIT's 6.042, either on OCW or the regular school site https://courses.csail.mit.edu/6.042/spring16/class-material.... if you just want to understand on an applied level what's going on. I did it in MMIX, using the book 'The MMIX Supplement' by Ruckert to check answers.

Re: SICP, I noticed Harvard and other schools typically have an intro to CS course and then a follow-on course on abstraction using OCaml or another functional language. The syllabuses I've found for these second-semester CS intro courses look almost identical to the SICP ToC, except with no hand-rolled compiler, which is probably the single most useful chapter I've ever read in any CS book.


One reason I went with different algorithm books is that they were for languages I use. Did the use of a fake ISA hamper your understanding at any point?


I think it works for the same reason _A Modern Method for Guitar_ works... by not using a familiar context (tunes/languages), you have another vantage point to help internalize the lesson.


That could be. Someone redoing Knuth's stuff with his lessons and modern asm/HLL might create an interesting effect.


Just curious, what is your definition of "finish"? Read it cover-to-cover? Solve all the problems?


Thanks for clarifying! Please post a list of books you will recommend for professional programmers.


TAOCP is also great because the implementations are in assembly language, which encourages you to think about efficiency in terms of the operations your machine performs, not statements in some high-level language. Not only is the writing excellent, but it is clearly written by a programmer with much practical experience, unlike some other algorithm texts. I deeply wish there were more modern books in the same style, with the same spanning focus of high and low level concerns.



