Hacker News new | past | comments | ask | show | jobs | submit login

Colin, if we start with the hypothesis that "any developer really ought to be able to solve [that problem]", we get a testable prediction: If a person cannot solve that problem, that person is not "really" capable of being a developer. This prediction, however, doesn't seem to survive long when exposed to reality: There are lots of people who cannot solve that problem but are succeeding, right now, in jobs that most people would accept as software development.

Software development is a big space with lots of room for specialization. Maybe it would be better, then, to model what is essential to "development" not as a distribution over knowledge but as a joint distribution over knowledge and specialization.




We're very good at working around our deficits. The fact that someone can be a successful developer without knowing basic probability doesn't mean that a lack of knowledge of probability isn't a deficit; they might be a much better and more successful developer if they filled that gap.


> they might be a much better and more successful developer if they filled that gap.

That's certainly a valid hypothesis, but do we have any evidence to support it? I would be looking for causation, not correlation (e.g., many great developers are familiar with calculus because many great developers have CS degrees).


Let J be a random variable ranging over software jobs and distributed as in nature. Now let c(j,k) be the penalty for having to work around a lack of knowledge k while holding job j.

You seem to be saying that for certain values of k (e.g., k = “basic probability theory”), E[c(J,k)] is large enough that not having k qualifies as a “deficit.” But the empirical evidence suggests that there’s a sizable set of respectable jobs X for which these deficits don’t matter. That is, for those exact same values of k, E[c(J,k) | J in X] is so small that the market effectively doesn’t care enough to withhold those jobs from people lacking k.

All of this is to say that when you say “software developers,” I think you’re imagining a set of people which is a lot narrower than the set the market actually maps to that job title. Today, “software development” admits a lot of jobs that don’t have a large dependence on mathematics or computer science. The guys hammering out HTML and CSS all day long probably aren’t suffering for their probability-theory deficits. And yet most people do consider them software developers.
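The gap between E[c(J,k)] and E[c(J,k) | J in X] can be sketched with a toy Monte Carlo simulation. Every job category, penalty value c(j,k), and prevalence weight below is invented purely for illustration; the point is only the shape of the argument, not the numbers:

```python
import random

random.seed(0)

# Hypothetical job categories with made-up penalties c(j, k) for lacking
# k = "basic probability theory", plus rough prevalence weights for J.
# All numbers are illustrative assumptions, not data.
jobs = [
    ("ml-engineer",  0.9, 0.10),  # (job, penalty c(j,k), prevalence)
    ("backend-dev",  0.3, 0.35),
    ("frontend-dev", 0.1, 0.45),
    ("embedded-dev", 0.2, 0.10),
]

def expected_penalty(sample):
    """Monte Carlo estimate of E[c(J, k)] over a sample of jobs."""
    return sum(c for _, c in sample) / len(sample)

# Draw J according to the prevalence weights (J "distributed as in nature").
population = random.choices(
    [(name, c) for name, c, _ in jobs],
    weights=[w for _, _, w in jobs],
    k=100_000,
)

overall = expected_penalty(population)  # estimates E[c(J, k)]
x_only = expected_penalty(              # estimates E[c(J, k) | J in X]
    [(n, c) for n, c in population if n == "frontend-dev"]
)

print(f"E[c(J,k)]            ~ {overall:.2f}")
print(f"E[c(J,k) | frontend] ~ {x_only:.2f}")
```

Under these made-up weights the conditional expectation for the front-end subset comes out well below the overall expectation, which is exactly the "the market doesn't care for jobs in X" claim.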


This is true, though arguably lack of any knowledge about anything is a deficit.

Since it's impossible to know everything about everything, it's really a question of "what is the probability that this will be useful?"


Right, which is why I was making the point that there are a great many situations where an understanding of probability will help.


I don't necessarily disagree in this case. Though I can't think of many times in my career where I've had to explicitly calculate probabilities; then again, I may be an outlier.

I think it's more a concern about a general attitude sometimes displayed on HN, like "You don't know how X86 handles memory alignment? Dude, do you even code?"


Being focused on a particular aspect of programming isn't a bad thing.

If you are focused on front-end work, knowing probability probably isn't going to change much.


I wouldn't be so sure. For instance: There's this thing called A/B testing that gets mentioned, oh, every twenty nanoseconds or so on HN (more often when patio11 is awake), which (1) is very much the sort of thing people focused on front-end work need to understand and (2) is all about probability.
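For what it's worth, the probability behind a basic A/B test fits in a few lines. Here's a minimal sketch of a two-sided two-proportion z-test using only the standard library; the conversion counts in the example are made up:

```python
from math import sqrt, erf

def ab_test(conv_a, n_a, conv_b, n_b):
    """Two-sided two-proportion z-test: does variant B's conversion
    rate differ from A's? Returns (z statistic, p-value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF, via erf.
    p = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p

# Hypothetical experiment: 200/4000 conversions on A, 260/4000 on B.
z, p = ab_test(200, 4000, 260, 4000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

Deciding whether that p-value means "ship B" (and how long to run the test before peeking) is precisely the kind of probability question a front-end-focused developer ends up facing.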



