This discussion seems similar to the one English departments have about whether you should still read the classics or not. Isn't the point of _higher_ education that it's "higher"? If a high school student can write a blockbuster iPhone app then maybe universities don't need to teach that.
I totally agree with the part about making the curriculum fun and showing the applications of the "arcane" theories of CS, but I don't think we should abandon the teaching of classic CS.
Colleges transitioned from being a place of higher learning to a place "where dreams come true" several years ago. Preparing your students to build the next Angry Birds or Facebook instead of reimplementing a search algorithm that has been implemented a million times before is much more in line with the goals of both parties. A strong foundation in CS is not necessary, or even important, when your goal is financial success.
I do agree that the fundamentals are very important for those who are interested in pure academic pursuits. It is very unfortunate that college has become the go-to place to get a job, not a place to learn. But the truth of the matter is that the vast majority are only in class because they are looking for future wealth. Colleges, being businesses, will naturally tend toward catering to their customers.
The good news is that with the proliferation of the internet, academics can now learn about CS fundamentals even if formal CS programs go into decline. Of course, it is not too late to fix the education system; we just have to get past the idea that college equals job and return school to its roots as a place to research and study.
No one is going to get past the idea that "college equals job" until it's not true anymore.
The conundrum we're in now didn't start with higher education catering to customers who wanted future wealth. It started with employers offering high-paying jobs realizing that higher education was a sign of all the qualities they wanted, so they started mandating it. Universities adjusted accordingly.
I'm pointing out the obvious, of course, but it's because I've seen a lot of people say things like "we need to get over this idea." That's not going to happen until there is a ready supply of high-paying jobs that don't require college degrees, or until someone finds a method for achieving a high-paying job that doesn't require college but is just as straightforward and successful.
I'm not sure that it is true. I haven't been able to find any data supporting the claim, and there have been several articles on HN lately that strongly support the opposite view.
The best I have been able to find on the matter is one study showing a loose correlation between having a formal education and earning a higher income. Which, of course, says nothing about whether the education itself caused the better job.
The effects of this vocationalization of the university extend beyond school, too. It used to be that just having a college degree meant a good chance of getting a job in a wide range of fields. Now, though, if you don't specialize or target your degree at a specific field, you're out-competed by people who did.
(Yes, I'm whining about my Comparative Literature degree again, but I pursued that field seriously and rigorously, unlike many of my colleagues, and I feel like I'm being judged unfairly because of it.)
> Colleges, being businesses, will naturally tend towards catering to their customers.
There was a time when people thought about universities as not being a business. Of course, in the present age everything is a business so I shouldn't be surprised at all.
Agreed -- there are still lots of hard theory problems waiting to be solved, and we also still need people to write operating systems and design strong cryptographic protocols. Not everything is as flashy or "cool" as iPhone games, but it's still interesting to some, and those people need to be exposed to these topics so they can find their niches.
(Also, it really bugs me the way he slams his own field -- discrete math is not exactly "arcane".)
I don't think a general CS degree makes sense for most people. What you need are more specialized degrees such as game development, social software, computer graphics, information retrieval, etc.
I think you might be getting too specific, but I'm sad you got downvoted. I think it would be very useful to have two degree tracks: a "computer science" track with more academic and algorithmic training, and a "software development" track focused on industry practice and team projects.
We need the people who can advance the field to get the training to do that, and the people who just want to create the next great company should be able to learn those skills.
>We need the people who can advance the field to get the training to do that- and the people who just want to create the next great company should be able to learn those skills.
The people in the second group should be going to business school or starting their own startup. Either would give them more benefit than a CS degree. A CS degree, even if it's "development" focused, won't give them the skills necessary to deal with management or the business side of things. Ideally, it should give the student the technical and theory background to be effective at any coding job they take. Interpersonal skills are better taught on the job or in non-class-related side projects.
I honestly disagree that there should be a separate track. A good developer should have a solid theory background. It should at least cover data structures, computer architecture, discrete math, and algorithm design and analysis. OS design should also be covered, because that takes everything else and combines it into a real-world application. Good developers are aware of all of that, even self-taught ones. In other words, a "software development" degree really requires a superset of what the current good CS curriculums have, because it also includes software engineering paradigms and team project skills.
The real problem with programming degrees is that the industry, as a whole, is still very young. It doesn't have a standard set of best practices to draw on, especially compared to traditional engineering disciplines. I graduated from UIUC in 2007, and the SE course there focused primarily on extreme programming and RUP. Agile wasn't covered at all (and is basically impossible to do in a college environment). So while it was a good course to take for team project experience, the actual topics covered weren't that useful. The other problem is that every development group has a different set of practices, and these almost never match the ones taught in school. So while a school might teach a certain set of development practices, the ones the students actually end up using might be completely different.
I think both make sense, but the specialized degrees could be taught by more application-oriented institutions or offered as part of a different degree (engineering vs. science).
The term "vocational school" sounds a bit derogatory, but that might be more suitable for a lot of students who are less interested in theory.
Because I think that higher education is a pretty awful place for vocational training, as people here constantly point out. Of course, if you want to do research or fundamentally new types of things, then a CS degree can be a great place to start, but that applies to a tiny minority of people.