A number of these have to do with statistics and calculus, but they really aren't pertinent to overall software development. Yes, a math background helps with certain kinds of development, but the lack of one doesn't preclude general success as a software developer. I'd be more interested in questions about multitasking, threads, dynamic storage options, and so forth than in what percentage of spam hits my server. That's a basic math question.
Also, most of the math in this part of the quiz is not what I've found most relevant: it might be applicable to your ___domain, but not to programming in general. You can get quite far without dealing with any sort of numbers except maybe the naturals.
On the other hand, it didn't go into things that really are applicable to all programming, like formal logic, type theory and semantics.
For example, I think it is very easy to overstate the importance of calculus in programming. There are very many interesting applications that don't require any calculus at all. My primary interest is in programming languages, and we like to pretend that real numbers don't even exist most of the time :P.
On the other hand, formal logic and semantics are useful for reasoning about any kind of programming. You don't even have to be entirely formal about it: just having a basic idea of how you could formally reason about a program helps you reason clearly in entirely informal ways! For example, I've found that designing clear, easy-to-use libraries is far easier if you consider the denotational semantics of your constructs completely informally: it makes it much simpler to think about the meaning of your code divorced from its implementation, and it makes the API and documentation much clearer.
Similarly, knowing a bit about axiomatic semantics helps you deal with things like loop invariants. Sure, you'll probably never have to prove any properties about your code in great detail. However, just being aware of these ideas helps you write the code in clearer ways and maybe even write better tests.
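The loop-invariant point can be made concrete with a toy sketch (my own illustration, not from the quiz): a binary search whose boundary handling is easy to audit precisely because the invariant is written down, even if only as a comment.

```java
// Toy sketch: binary search with its loop invariant stated explicitly.
// You never "prove" anything here, but checking each branch against the
// invariant is what catches the classic off-by-one mistakes.
class InvariantDemo {
    // Invariant: if target occurs in a, its index lies in [lo, hi)
    static int search(int[] a, int target) {
        int lo = 0, hi = a.length;
        while (lo < hi) {
            int mid = lo + (hi - lo) / 2;
            if (a[mid] < target) lo = mid + 1;  // target, if present, is right of mid
            else if (a[mid] > target) hi = mid; // target, if present, is left of mid
            else return mid;
        }
        return -1; // [lo, hi) is empty, so the invariant implies target is absent
    }
}
```

Each branch either returns or shrinks `[lo, hi)` while preserving the invariant, which is exactly the kind of informal-but-rigorous check the comment above is advocating.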
Type theory is also broadly underrated. Even in a language like Java, you have the same considerations--you still have to worry about covariance and contravariance, for example. Being aware of the simple and elegant models that are usually used for type systems can help you navigate Java's jumbled mess of inheritance and generics.
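For instance, Java's "producer extends, consumer super" (PECS) idiom is variance in disguise. A small sketch (class and method names are my own) showing covariant reads and contravariant writes:

```java
import java.util.List;

// Use-site variance via wildcards: covariance lets you read, 
// contravariance lets you write. PECS is just this rule with a mnemonic.
class Variance {
    // Covariant position: safe to read Numbers out of any List<? extends Number>
    static double total(List<? extends Number> producer) {
        double t = 0;
        for (Number n : producer) t += n.doubleValue();
        // producer.add(1); // would not compile: the exact element type is unknown
        return t;
    }

    // Contravariant position: safe to write Integers into any List<? super Integer>
    static void fill(List<? super Integer> consumer) {
        consumer.add(1);
        consumer.add(2);
    }
}
```

Knowing the underlying model makes it obvious why the commented-out `add` can't compile, rather than it feeling like an arbitrary compiler rule.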
Anyhow: math is useful for all programming, just not necessarily the math in this quiz.
Colin, if we start with the hypothesis that "any developer really ought to be able to solve [that problem]", we get a testable prediction: If a person cannot solve that problem, that person is not "really" capable of being a developer. This prediction, however, doesn't seem to survive long when exposed to reality: There are lots of people who cannot solve that problem but are succeeding, right now, in jobs that most people would accept as software development.
Software development is a big space with lots of room for specialization. Maybe it would be better, then, to model what is essential to "development" not as a distribution over knowledge but as a joint distribution over knowledge and specialization.
We're very good at working around our deficits. The fact that someone can be a successful developer without knowing basic probability doesn't mean that a lack of knowledge of probability isn't a deficit; they might be a much better and more successful developer if they filled that gap.
> they might be a much better and more successful developer if they filled that gap.
That's certainly a valid hypothesis, but do we have any evidence to support it? I would be looking for causation, not correlation (e.g., many great developers are familiar with calculus simply because many great developers have CS degrees).
Let J be a random variable ranging over software jobs and distributed as in nature. Now let c(j,k) be the penalty for having to work around a lack of knowledge k while holding job j.
You seem to be saying that for certain values of k (e.g., k = “basic probability theory”), E[c(J,k)] is large enough that not having k qualifies as a “deficit.” But the empirical evidence suggests that there’s a sizable set of respectable jobs X for which these deficits don’t matter. That is, for those exact same values of k, E[c(J,k) | J in X] is so small that the market effectively doesn’t care enough to withhold those jobs from people lacking k.
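To put toy numbers on that argument (every figure here is invented purely for illustration): conditioning on a subset X of jobs where the gap is cheap to work around can make the expected penalty look very different from the unconditional one.

```java
// Toy numeric version of the E[c(J,k)] argument. All numbers are made up.
class PenaltyToy {
    // share[j]   = market share of job type j (sums to 1)
    // penalty[j] = cost c(j, k) of lacking knowledge k while holding job j
    // inX[j]     = whether job type j belongs to the subset X
    static double expectedPenalty(double[] share, double[] penalty,
                                  boolean[] inX, boolean conditionOnX) {
        double num = 0, mass = 0;
        for (int j = 0; j < share.length; j++) {
            if (conditionOnX && !inX[j]) continue;
            num += share[j] * penalty[j];
            mass += share[j];
        }
        return num / mass; // E[c | J in X] if conditioned, else E[c]
    }
}
```

With hypothetical job types weighted 0.5/0.3/0.2 and penalties 0.0/0.2/1.0, where only the last (high-penalty) type is outside X, the conditional expectation within X drops well below the overall expectation, which is the shape of the claim above.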
All of this is to say that when you say “software developers,” I think you’re imagining a set of people which is a lot narrower than the set the market actually maps to that job title. Today, “software development” admits a lot of jobs that don’t have a large dependence on mathematics or computer science. The guys hammering out HTML and CSS all day long probably aren’t suffering for their probability-theory deficits. And yet most people do consider them software developers.
I don't necessarily disagree in this case, though I can't think of many times in my career when I've had to explicitly calculate probabilities. Then again, I may be an outlier.
I think it's more a concern about a general attitude sometimes displayed on HN like "You don't know how X86 handles memory alignment, dude you even code?"
I wouldn't be so sure. For instance: There's this thing called A/B testing that gets mentioned, oh, every twenty nanoseconds or so on HN (more often when patio11 is awake), which (1) is very much the sort of thing people focused on front-end work need to understand and (2) is all about probability.
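The basic significance check behind an A/B test is the two-proportion z-test. Here's a minimal sketch (the method name and sample numbers are my own, and it assumes samples large enough for the normal approximation to hold):

```java
// Minimal two-proportion z-test, the textbook significance check behind
// a simple A/B test: did variant A really convert better than B, or is
// the difference plausibly noise?
class ABTest {
    static double zScore(int convA, int nA, int convB, int nB) {
        double pA = (double) convA / nA;             // conversion rate, variant A
        double pB = (double) convB / nB;             // conversion rate, variant B
        double pooled = (double) (convA + convB) / (nA + nB);
        double se = Math.sqrt(pooled * (1 - pooled) * (1.0 / nA + 1.0 / nB));
        return (pA - pB) / se; // |z| > 1.96 is significant at the 5% level
    }
}
```

Even a front-end developer running this test needs enough probability to know that 120/1000 vs. 100/1000 conversions yields z of roughly 1.43, i.e. not significant at the 5% level, however tempting the raw 12%-vs-10% difference looks.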
Yes, it's a basic math question... and one any developer really ought to be able to solve. But does it belong on a Software Development test? I argue that if you blew your math class, that should screw up your degree, but it's not specific to software development. In my experience, there are terrific developers, but only a handful are math nuts (and I mean that in the kindest way).
Question 3, for example, is obviously much higher-order math than the first spam question, but question 3 doesn't really have (IMHO) a damned thing to do with software development. The bigger question is whether calculus is a fundamental part of advanced software development. The article does make an effort to indicate that it is, but I generally find that in a working environment there are a few math-oriented individuals, while most are more logic-oriented (which doesn't preclude math, admittedly), and unless you're on a project where higher-order math is required, you get along just fine. For those projects requiring physics calculations or whatever, you bring in the math geek. That doesn't mean it's not a testable question, but I contend it's not a requirement to be a damned good developer.
For all I learned in university, I've applied less than 1%. Is it just me, or is that sad? I learned how to learn, but overall the skills taught were so disjointed as to be almost useless...