
It seems there are a lot of gaps in his primers -- he includes topology and some intro to Python, but the actual heart of programming -- algorithms and algorithm techniques -- is mostly missing.

This is of course his right -- it's just a collection of blog posts after all -- but caveat lector.

I would probably go for textbooks instead. Textbooks usually do give a bigger picture in the topics mentioned and they usually have an acceptable learning curve, so no additional primer is required (though as always, googling additional info is a plus).

If you're learning CS, it's often super useful to mix and match two or three textbooks to get a nice picture of things. For example, when you're reading an Algorithms book about graph traversal, it's nice to read a book on graph theory as well. Conversely, several equalities in graph theory can be proven by a simple algorithm (greedy or DFS, for instance).
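To make that second direction concrete (a toy example of my own, not taken from any particular book): a DFS that records its tree edges visits every vertex of a connected graph and picks up exactly n - 1 edges along the way, which is the core of the proof that a connected graph on n vertices has at least n - 1 edges. A minimal Python sketch:

    # DFS from a start vertex, collecting the tree edges it uses.
    # In a connected graph this visits all n vertices and records
    # exactly n - 1 tree edges -- the heart of the proof that any
    # connected graph on n vertices has at least n - 1 edges.
    def dfs_tree_edges(graph, start):
        visited = {start}
        stack = [start]
        tree_edges = []
        while stack:
            u = stack.pop()
            for v in graph[u]:
                if v not in visited:
                    visited.add(v)
                    tree_edges.append((u, v))
                    stack.append(v)
        return tree_edges

    # toy graph: a triangle with a pendant vertex
    g = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}
    print(dfs_tree_edges(g, 0))  # 3 tree edges for 4 vertices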

My personal favourite is mixing up Dynamic Programming (from an Algorithms textbook) with recurrence solving in a discrete math book (which is what DP is actually about) and mixing all that with a more advanced chapter of discrete math -- generating functions.
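As a toy illustration of that pairing (again my own example, not from any specific textbook): counting tilings of a 1 x n strip by squares and dominoes is a few lines of bottom-up DP, and the recurrence it iterates, T(n) = T(n-1) + T(n-2), is exactly the kind of thing you'd hand to the generating-function machinery to get a closed form.

    # Tilings of a 1 x n strip with 1 x 1 and 1 x 2 pieces.
    # Bottom-up DP over the recurrence T(n) = T(n-1) + T(n-2),
    # T(0) = T(1) = 1 -- the same recurrence you'd solve in closed
    # form with a generating function (a shifted Fibonacci sequence).
    def tilings(n):
        prev, cur = 1, 1          # T(0), T(1)
        for _ in range(2, n + 1):
            prev, cur = cur, prev + cur
        return cur

    print([tilings(n) for n in range(8)])  # [1, 1, 2, 3, 5, 8, 13, 21]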




Blog author here (all of my friends tell me immediately when I get on HN). I appreciate the criticism.

I'm aware of the huge gaps, and I've had low-priority goals to fill them in. The reason is that my blog is not intended to be a replacement for a standard education, but rather a place for me to explore and write about the stuff I personally find cool. Usually this means I find an interesting application first, and then figure out what background information I should present (admittedly tersely) to support it.

Algorithmic techniques are fine and dandy (I do talk about dynamic programming a few times when it invariably comes up in my applications posts), but more often than not I'm simply bored by the standard algorithms lessons and I believe there's a bigger lack of mathematics background in my audience than algorithmic background. Everyone likely knows (at least vaguely) what a hash table is and what it's for, but asking about a group is more likely to receive blank stares, despite the awesome applications to cryptography.
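For anyone in the blank-stare camp, here's the flavor of what I mean, as a toy sketch rather than anything from a specific post: Diffie-Hellman key exchange lives entirely inside the group of nonzero integers mod a prime. The numbers below are made up and far too small to be secure.

    # Toy Diffie-Hellman in the group of nonzero integers mod p,
    # just to illustrate why groups matter in cryptography.
    # These parameters are tiny and insecure -- purely illustrative.
    p, g = 467, 2                 # a small prime and a public base
    a, b = 153, 197               # Alice's and Bob's secret exponents

    A = pow(g, a, p)              # Alice sends g^a mod p
    B = pow(g, b, p)              # Bob sends g^b mod p

    # Both arrive at the same group element g^(ab) without ever
    # exchanging a or b; recovering them is the discrete log problem.
    assert pow(B, a, p) == pow(A, b, p)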

Though I may be wrong about my audience. I have no way of measuring that except to see what sorts of news sites it shows up on; so far no professional mathematicians have seemed to care except to say the equivalent of "that's a cute hobby."

That being said, I do intend to focus more on graph theory in the future. My own research is taking me more into random graphs, randomized algorithm analysis, and related topics. I also want to start a data structure series, where I go through all of the standard examples: red-black trees, prefix trees, Fibonacci heaps, etc. I would absolutely love to see more applications of these ideas, and I already have quite a few in mind for graph algorithms.


Judging by the article on Bezier curves, I'd say you're doing fine. Very interesting stuff, and a great approach. It even made some linear algebra stuff click for me -- that I should've internalized long ago from my linear algebra class...


I think there are a lot of gaps within the primers too. Even with a degree in Mathematics and CS, I can only follow most primers about midway before needing to find additional sources of information.

That being said, I really like them and I am a big fan of Jeremy's writing.


>My personal favourite is mixing up Dynamic Programming (from an Algorithms textbook) with recurrence solving in a discrete math book (which is what DP is actually about) and mixing all that with a more advanced chapter of discrete math -- generating functions.

This is what makes a "simple" primer on algorithms difficult. In order to prove running time or correctness, you need to have a decent command of basically everything else in that list before you start.

It's relatively simple to understand how Dijkstra's algorithm works by iterating through the steps, but it's much more difficult to prove that it works for a general graph, or to prove its running time.
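To be clear about the "iterating through the steps" half (the easy half, not the proof), here's a minimal heap-based sketch of Dijkstra in Python; the toy graph and weights are made up for illustration.

    import heapq

    # Minimal Dijkstra: shortest distances from a source in a graph
    # given as {vertex: [(neighbor, weight), ...]} with non-negative
    # edge weights.
    def dijkstra(graph, source):
        dist = {source: 0}
        pq = [(0, source)]
        while pq:
            d, u = heapq.heappop(pq)
            if d > dist.get(u, float("inf")):
                continue          # stale queue entry
            for v, w in graph[u]:
                if d + w < dist.get(v, float("inf")):
                    dist[v] = d + w
                    heapq.heappush(pq, (d + w, v))
        return dist

    g = {"a": [("b", 2), ("c", 5)], "b": [("c", 1)], "c": []}
    print(dijkstra(g, "a"))       # {'a': 0, 'b': 2, 'c': 3}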

Algorithm theory is basically the end point of an undergrad CS education. It's the point where you take all the theory you've learned up until that point, and turn it into something that can really be applied to computing.


Why does proving algorithms involve understanding basically everything else? Almost every proof of correctness I've seen has been simpler and easier to understand than most of the proofs I saw or was asked to do in high school math. Proofs of running time tend to be trivial (in terms of getting a reasonable big-O or big-Omega bound; getting a big-Theta bound is often more difficult, but still relatively simple).


I agree it would be nice if the gaps in his primers were filled in, but it's great to have a resource like this that can help give an understanding of at least some of the topics related to math and programming.

I know from my experiences that trying to read a discrete math or algorithms and data structures textbook can be very daunting. I've also tried viewing the lectures of the algorithms class on Coursera and usually give up because of the difficulty of trying to learn these things after spending 8 hours coding at work. Not having a background in computer science makes this into an uphill battle. Any resource to make this struggle easier is greatly encouraging to me.


I have a similar problem. It's really hard to work 8 - 10 hours then come home and try to have the steam to dive into a mathematics or algorithms textbook and go through it in a way that really lets me internalize the material.

Could be that I just haven't found the right books or rhythm for this kind of learning, though. Doesn't help that I'm a pretty disorganized person with my time outside of work, heh.

I've read a few posts on this blog before, and remember enjoying them. I'll have to read a few more. For me, blog posts are less psychologically draining to think about than a book -- I think partly because of how much I hated formal education, and partly because a blog doesn't have the same physical presence as a textbook.


I'd recommend downloading some lectures. I struggled with Linear Algebra before downloading Strang's MIT lectures, and I didn't understand the Dragon Book before following Coursera's Automata Theory course.


I have Strang's calc and linear algebra books.

I had a really math-light college degree and never went through calc. It's always been pretty interesting and seems to show up in enough places to warrant learning it, but I'm awful with time management and bad about giving myself consistent blocks of time to work through things. I think I'll give Strang's calc videos another shot.


If you're serious about wanting to get better at math, I recommend going to sleep and getting up earlier. I've done this for a while and find it a great way to get some distraction-free focus time. In the end, you need to protect some of your time for your own advancement. It's your responsibility, not your employer's. Give it a shot.


Since it sounds like you've done it a few times, can you recommend some specific textbook combinations which you've enjoyed?


Some general hints: don't be afraid to skip. Books that complement each other will often still overlap, since the authors usually try to be self-contained.

Also, most of the books I mentioned dedicate about 50% to core ideas in the field and 50% to specific tools that may not be that useful to you. More often than not that second 50% is the latter part of the book, but in Concrete Mathematics, say, it's spread throughout the whole book. I know it's hard to find the balance between reading fast (and then realizing you don't remember any of the basic tools and can't apply them well) and reading slow (and realizing you're trying to memorize some obscure recurrence involving binomial coefficients). Reading is hard.

Now, for the tips for the CS/discrete math students among you. Feel free to add some of yours -- I'm hardly a veteran educator here.

Graham, Knuth, and Patashnik's Concrete Mathematics, after a few chapters, goes well with Flajolet and Sedgewick's Analytic Combinatorics. You should mix that up with some introductory discrete mathematics book, especially one dealing with counting objects, so you can apply the techniques almost immediately on simpler examples.

Dasgupta, Papadimitriou, and Vazirani's Algorithms is one of the better books on algorithms out there, in my opinion. As for the graph theory books I've read, I liked the style of Bondy and Murty's Graph Theory the most, so I'd probably go with that, though I haven't really read them in parallel yet.

For the discrete math students out there, after learning the basics of the probabilistic method (Alon and Spencer's The Probabilistic Method being the seminal textbook, I think), you could continue with some probabilistic algorithms book to apply what you've learned in CS (I learned that from a course, though, so I can't vouch for the best book out there), and you could also grab Tao and Vu's Additive Combinatorics to see how probability is applied in number theory (the probabilistic method is chapter one there).



