This has been one of my rallying points for years: in school, you are not taught how to collaborate on a team, or even basic software engineering practices like source control. This is ridiculous because every software developer needs to know this. Even if you become an academic, you should be using source control at the very least!
As it is, you're thrust into the thick of it with your first job.
My only experience working on a team during my CS degree was interesting because of what it got right and what it got wrong.
It was a project to develop control software for a simulated hardware device. We had a team of 4 collaborating with another team of 4 (working on a different part of the system).
So it was good because it was collaborative and you had to depend on another team. We also had to produce documentation.
But we weren't taught how to collaborate with CVS or anything like that. We sent around compiled binaries by email.
So I support any effort to teach collaboration and common tools as part of the curriculum.
I never liked any of the group work I did in school. Sure, the teachers would assign it, but it always ended up with a few people doing most of the work or a project with as many different parts as there were people. It was fine for me to talk about ideas with other students, but when I needed to learn it, I had to go off and work by myself.
it always ended up with a few people doing most of the work...
This is a common problem with group projects in school. One of the main problems is lack of leadership. The solution: assign a leader.
I had a class consisting of 4 projects, 4 people per group, and each project had a designated leader, a role that rotated. These projects were so much easier to manage, because the leader had some authority given by the teacher. He therefore had the ability to delegate responsibility, and people would actually do the work. In normal groups, even if a leader emerges, they will often be ignored by less motivated students, because their authority is much more tenuous.
My point: it helps to have an assigned (and verified by the teacher) leader for a group project.
True, this is often the case. I think what Joel is asking for is that students be forced to do at least one project that is long-term and collaborative by its very nature, so you learn the skill of working on a team.
I was in a French high school during my studies, and we had about 1 short project a week, 1 medium-term project a month, and 1 to 2 long projects a year for the first 3 years. For the next 2 years we had only medium-term projects and 1 big project (the end-of-studies project).
It was really interesting on a number of points:
- short projects were for 1 to 2 people
- other projects were for 3 to 8 people
- you had to use version control
- if your project didn't meet the expectations, your score was cut in half
- expectations and deadlines were fixed, with no way to change the dates
- you didn't always choose your team
- you sometimes worked with assholes and had to handle that
- there was always a "team leader" who took on more responsibility if the team failed
- some of the projects were graded by bots running hundreds of test cases, and grading stopped at the first failing test (seriously, it was a nightmare: the 1st test could fail while the next hundred would have passed, and you'd still get the worst score), so you started writing your own test suite, trying to test everything, even the unthinkable (see the sketch after this list)
...
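To make that last point concrete: the core of such a homegrown suite is simply "run every case and report all failures", the opposite of the bot's stop-at-first-failure behavior. A minimal Java sketch, where gcd is a hypothetical stand-in for whatever the assignment actually asked for:

    public class MiniSuite {
        // Unlike the grading bot, run every case and report all failures,
        // so one broken test doesn't hide the hundred behind it.
        public static void main(String[] args) {
            int[][] cases = {
                // {a, b, expected} -- include the "unthinkable" edge cases too
                {12, 18, 6}, {7, 13, 1}, {0, 5, 5}, {42, 0, 42}, {0, 0, 0},
            };
            int failures = 0;
            for (int[] c : cases) {
                int got = gcd(c[0], c[1]);
                if (got != c[2]) {
                    failures++;
                    System.out.printf("FAIL gcd(%d, %d): expected %d, got %d%n",
                            c[0], c[1], c[2], got);
                }
            }
            System.out.println(failures == 0 ? "all tests passed"
                                             : failures + " test(s) failed");
        }

        // Hypothetical function under test; stands in for the assignment code.
        static int gcd(int a, int b) { return b == 0 ? a : gcd(b, a % b); }
    }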
I'm not trying to do marketing here; I'm no longer part of the school and not involved in it at all. I'm just saying that not all schools do a bad job of preparing you for the real world.
Can I complain about this article continuing to blur any distinction between "computer science" and "software engineering", or would that just make me sound like a cranky old-timer?
You're not wrong, but I think you're missing the point. The fact is, a lot of computer science students intend to go out into the world and become software engineers. Likewise, many software engineering positions require computer science degrees. Because of this, you'd expect that a computer science degree would prepare you for a career in software engineering.
Many (if not most) university computer science programs do attempt to prepare you for a software engineering career, but the point is that they're failing in several areas. You could call it a software engineering degree and they'd still have the same issues.
That was true when Dijkstra said it, but just as mathematicians are increasingly using computers as they go into realms that mere humans can't completely explore without them (and this will only increase over time), computer scientists are increasingly being called upon to actually program more things.
Theorize about a P2P system to your heart's content, but until you've implemented it, you don't understand it well enough to write more than a preliminary (and nearly worthless) paper. And it would be nice if said system wasn't a steaming pile, because those actually take longer to make, you know. Theorize about a computer vision algorithm to your heart's content, but until you try it you're just imagining things. Theorize about how wonderful your robotic control theory is to your heart's content, but until you actually put all the pieces together and see if it actually works you're just dreaming.
Sure, in a world where "Dijkstra's Algorithm" is a genuine breakthrough, you can live without a computer, but that world is disappearing for *many* disciplines, leaving computer scientists who really do need to program.
(I italicized that because I am aware that there are computer science disciplines that can still get by without computers, like complexity theory. But there are other things that are quite thoroughly computer science that can't do it without computers anymore. Stuff that Dijkstra would never have been able to do in a world where a megahertz was precious.)
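For a sense of scale: the algorithm that was once a genuine breakthrough now fits in a few dozen lines of everyday code. A minimal Java sketch of Dijkstra's algorithm over an adjacency list, where the graph representation and names are just illustrative choices:

    import java.util.*;

    public class Dijkstra {
        // Single-source shortest paths; edges.get(u) holds {v, weight} pairs,
        // and weights must be non-negative for the algorithm to be correct.
        static int[] shortestPaths(List<List<int[]>> edges, int source) {
            int n = edges.size();
            int[] dist = new int[n];
            Arrays.fill(dist, Integer.MAX_VALUE);
            dist[source] = 0;
            // Priority queue of {distance, node}, ordered by distance.
            PriorityQueue<int[]> pq =
                    new PriorityQueue<>(Comparator.comparingInt((int[] a) -> a[0]));
            pq.add(new int[]{0, source});
            while (!pq.isEmpty()) {
                int[] top = pq.poll();
                int d = top[0], u = top[1];
                if (d > dist[u]) continue; // stale entry, node already settled
                for (int[] e : edges.get(u)) {
                    int v = e[0], w = e[1];
                    if (dist[u] + w < dist[v]) {
                        dist[v] = dist[u] + w;
                        pq.add(new int[]{dist[v], v});
                    }
                }
            }
            return dist;
        }

        public static void main(String[] args) {
            // Tiny graph: 0 -> 1 (cost 4), 0 -> 2 (cost 1), 2 -> 1 (cost 2).
            List<List<int[]>> edges = new ArrayList<>();
            for (int i = 0; i < 3; i++) edges.add(new ArrayList<>());
            edges.get(0).add(new int[]{1, 4});
            edges.get(0).add(new int[]{2, 1});
            edges.get(2).add(new int[]{1, 2});
            System.out.println(Arrays.toString(shortestPaths(edges, 0))); // [0, 3, 1]
        }
    }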
My undergrad program has been doing this for about 10 years, and I'd say it's an extremely valuable way to bring together the prior 4 years and give you a taste of professional programming. My school certainly isn't Ivy League, but it is a "Tier 1" liberal arts college, according to US News.
Every CS major has to take a software engineering class his or her final year, which consists of learning real-world methods for software development and completing a semester-long group project. It sounds almost exactly like what Joel describes. My group produced a few thousand usable lines of Java code, on a schedule, with documentation, requirements, and the whole nine yards.
I had a similar experience in my undergraduate program (and I graduated over 8 years ago). In second year, everyone had to take a software engineering course that consisted of completing a full waterfall-model group project. It was complete with interviews with the clients, every scrap of planning documentation, and of course the code itself. A deliverable was due every 2 weeks, and everyone had to work their ass off.
Later courses covered agile methods, version control, bug tracking, database design, and so on.
I actually feel a bit sorry for people who end up going through a program like Joel describes -- but certainly good schools are out there and they're teaching things programmers really need for the real world.
Having said that, I enjoyed studying theoretical CS at a hardcore-theoretical institution far, far more than I would have had I studied software engineering and learnt how to use test suites and agile and so on. I learnt version control from coding while studying, especially from OSS. I studied usability and had hands-on experience during an internship. I led a project team. My final year project was certainly more than 20 lines long. And yet I also did stuff like ML and BCPL and algorithmic design and complexity and crazy fun stuff like red-black trees and O-notation and quicksort and formal logic and Markov chains and SVMs and Bayesian classification and god knows what else; it was 2003. If I'd done the software engineering approach, my answer to 'how to implement quicksort' would be 'google', and I'd long since have abandoned 'Subversion 101' for something that actually hurt my brain.
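For the record, that quicksort is short enough not to need google. A minimal Java sketch: a textbook in-place sort with a Lomuto partition and a deliberately naive last-element pivot:

    import java.util.Arrays;

    public class Quicksort {
        // In-place quicksort; fine for teaching, though a production sort
        // would pick pivots more carefully to dodge O(n^2) on sorted input.
        static void quicksort(int[] a, int lo, int hi) {
            if (lo >= hi) return;
            int p = partition(a, lo, hi);
            quicksort(a, lo, p - 1);
            quicksort(a, p + 1, hi);
        }

        // Lomuto partition: everything < pivot ends up left of the returned index.
        static int partition(int[] a, int lo, int hi) {
            int pivot = a[hi];
            int i = lo; // boundary of the "less than pivot" region
            for (int j = lo; j < hi; j++) {
                if (a[j] < pivot) {
                    int t = a[i]; a[i] = a[j]; a[j] = t;
                    i++;
                }
            }
            int t = a[i]; a[i] = a[hi]; a[hi] = t; // move pivot into place
            return i;
        }

        public static void main(String[] args) {
            int[] data = {5, 2, 9, 1, 5, 6};
            quicksort(data, 0, data.length - 1);
            System.out.println(Arrays.toString(data)); // [1, 2, 5, 5, 6, 9]
        }
    }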
College students in their final year have about 16 years of experience doing short projects and leaving everything until the last minute.
This is an interesting deficit of mainstream educational systems; I wonder what an educational system based on long-term projects would look like, and what its effects on students would be.