Hacker News new | past | comments | ask | show | jobs | submit login

Your first quip also applies to Mathematica. It's very easy to write quite inefficient code in Mathematica. One classic example is a newcomer not using N[] and evaluating every expression symbolically. That seems trivial, but it really isn't for someone new. I could still get speedups on linear-algebra code two years into grad school.
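The symbolic-versus-numeric cost can be imitated outside Mathematica with Python's standard-library `fractions` module (a hedged analogy, not Mathematica itself): exact rational arithmetic, like an unevaluated symbolic expression, carries ever-growing numerators and denominators, while floating point, like wrapping the expression in N[], stays constant-cost per operation. The harmonic-sum functions below are made-up illustrations.

```python
from fractions import Fraction

def harmonic_exact(n):
    # Exact rational sum 1/1 + 1/2 + ... + 1/n, analogous to
    # symbolic evaluation: every intermediate result is exact.
    total = Fraction(0)
    for k in range(1, n + 1):
        total += Fraction(1, k)
    return total

def harmonic_float(n):
    # Floating-point sum, analogous to evaluating with N[]:
    # each step costs one machine-word addition.
    total = 0.0
    for k in range(1, n + 1):
        total += 1.0 / k
    return total

# The exact result for n = 500 has a numerator hundreds of digits long,
# which is why the exact version gets slower and slower as n grows.
exact = harmonic_exact(500)
print(len(str(exact.numerator)), abs(float(exact) - harmonic_float(500)))
```

The two results agree to machine precision; the difference is purely in cost, which is the point of reaching for N[] early.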



It applies to most languages.

In practice you can't rely on the compiler, because you don't know what the compiler is doing.

You certainly can't assume it's going to make the smartest possible decision for all of your code. There's no standardisation for optimisations, and they're often a context-dependent trade-off anyway.

The only way to write fast code is to learn the quirks of the toolchain and profile production binaries to find the bottlenecks.
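A concrete version of "profile, don't guess", sketched in Python with the standard-library cProfile and pstats modules (the modules and calls are real; the two string-building functions are made-up stand-ins for a hot path and its fix):

```python
import cProfile
import io
import pstats

def slow_concat(n):
    # Repeated concatenation: may copy the buffer on each append.
    s = ""
    for _ in range(n):
        s += "x"
    return s

def fast_concat(n):
    # Build the pieces, then join once.
    return "".join("x" for _ in range(n))

profiler = cProfile.Profile()
profiler.enable()
slow_concat(50_000)
fast_concat(50_000)
profiler.disable()

# Print the top entries by cumulative time; the hot spot shows up
# in the table instead of being guessed at.
out = io.StringIO()
pstats.Stats(profiler, stream=out).sort_stats("cumulative").print_stats(10)
print(out.getvalue())
```

The same discipline applies to compiled languages with perf, VTune, or Instruments: measure the binary you actually ship, because the optimizer's decisions there may differ from a debug build's.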


Mathematica also has the double disadvantage that its syntax and semantics are pretty different from what most people are used to. I find it quite pleasant because it's quite consistent, but it's not the most straightforward thing to learn coming from e.g. [naive] MATLAB. When I show people how to use it, the first thing I warn them is "no For[] loops, and only use Table[] sparingly."
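The spirit of that advice translated into Python (a hedged analogy, since the thread's examples are Wolfram Language): prefer declaring the whole structure at once, the way Table[] does, over stepping an index and mutating state, the way For[] does.

```python
# For[]-style: explicit index loop with mutation.
squares_loop = []
for i in range(1, 11):
    squares_loop.append(i * i)

# Table[]-style: declare the whole list in one expression,
# roughly like Table[i^2, {i, 1, 10}] in Mathematica.
squares_table = [i * i for i in range(1, 11)]

print(squares_loop == squares_table)  # True
```

In Mathematica the gap is larger than in Python, because For[] fights the evaluator's preference for whole-expression rewriting, which is why it is the first habit worth unlearning.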




