
... What part of linear algebra isn't computational?



Like... most of it? You can't do linear algebra "safely" without doing error analysis. So lots of decompositions and operations are very useful for proofs and for finding bounds, but can't be used directly to compute values. It's why a good fraction of what people do with linear algebra is numerically garbage.
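
To make that concrete, here's a quick NumPy sketch (the test matrix and its conditioning are just ones I picked for illustration): classical Gram-Schmidt is a perfectly good proof device, but in floating point it loses orthogonality on an ill-conditioned matrix, while the Householder QR behind np.linalg.qr does not.

  import numpy as np

  def classical_gram_schmidt(A):
      # Textbook Gram-Schmidt: orthonormalize the columns of A.
      Q = np.zeros_like(A, dtype=float)
      for j in range(A.shape[1]):
          v = A[:, j].astype(float)
          for i in range(j):
              v -= (Q[:, i] @ A[:, j]) * Q[:, i]  # subtract projections onto earlier columns
          Q[:, j] = v / np.linalg.norm(v)
      return Q

  # Ill-conditioned test matrix: random orthogonal factors, singular values 1 .. 1e-10.
  rng = np.random.default_rng(0)
  U, _ = np.linalg.qr(rng.standard_normal((100, 100)))
  V, _ = np.linalg.qr(rng.standard_normal((100, 100)))
  A = U @ np.diag(np.logspace(0, -10, 100)) @ V.T

  Q_cgs = classical_gram_schmidt(A)
  Q_house, _ = np.linalg.qr(A)  # LAPACK Householder QR

  I = np.eye(100)
  print(np.linalg.norm(Q_cgs.T @ Q_cgs - I))      # large: orthogonality is lost
  print(np.linalg.norm(Q_house.T @ Q_house - I))  # ~1e-14: still orthogonal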


Many theorems let you know that something exists but don't tell you how to compute it efficiently (an orthogonal basis, eigenvalues, inverses, etc.)
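
A small example of that gap (a sketch; the planted eigenvalue and iteration count are arbitrary choices of mine): the spectral theorem guarantees a symmetric matrix has an orthonormal eigenbasis, but the proof hands you no algorithm, so in practice you fall back on iterative schemes like power iteration (or the QR iteration LAPACK uses).

  import numpy as np

  def power_iteration(A, iters=500, seed=0):
      # Approximate the dominant (largest-magnitude) eigenpair of a
      # symmetric matrix A by repeated multiply-and-normalize.
      rng = np.random.default_rng(seed)
      v = rng.standard_normal(A.shape[0])
      v /= np.linalg.norm(v)
      for _ in range(iters):
          v = A @ v
          v /= np.linalg.norm(v)
      return v @ A @ v, v  # Rayleigh quotient, eigenvector estimate

  # Symmetric test matrix with a planted, well-separated top eigenvalue of 10.
  rng = np.random.default_rng(1)
  Q, _ = np.linalg.qr(rng.standard_normal((50, 50)))
  A = Q @ np.diag(np.concatenate(([10.0], rng.uniform(-1, 1, 49)))) @ Q.T

  lam, _ = power_iteration(A)
  print(lam, np.linalg.eigvalsh(A).max())  # both ~10.0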

Also, sometimes algorithms can be invented and empirically shown to have good complexity properties before there are proofs.


I found the part which uses Zorn's Lemma pretty un-computational :) It's equivalent to the axiom of choice and is used to show that every vector space has a basis.
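
For reference, the standard Zorn argument, sketched in LaTeX (this is just the textbook proof, nothing beyond what the comment says):

  \begin{itemize}
    \item Let $P$ be the set of linearly independent subsets of $V$, ordered by inclusion.
    \item Every chain $\mathcal{C} \subseteq P$ has the upper bound $\bigcup \mathcal{C}$: a finite dependence in the union would already lie in a single member of the chain.
    \item Zorn's lemma therefore gives a maximal independent set $B$.
    \item $B$ spans $V$: if some $v \notin \operatorname{span}(B)$, then $B \cup \{v\}$ would still be independent, contradicting maximality. Hence $B$ is a basis.
  \end{itemize}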


Depends on what you mean by 'computational'.

- Computational as in 'using programming and computers'

- Computational as in 'solving a math problem' (as opposed to 'theoretical': definitions/theorems/proofs/lemmas).

My guess is that the posted link means the first kind, hence almost all linear algebra texts are non-computational; i.e., you can become an expert in linear algebra without knowing how to program and without knowing a single programming language.

For the second kind, most of beginner and intermediate linear algebra is computational, but not all. There's plenty of theory in linear algebra, with connections to representation theory and abstract algebra, as well as to analysis, topology, and geometry. The study of infinite-dimensional vector spaces is purely non-computational.


One of my current favorite things: categorification of linear algebra via higher category theory: https://qchu.wordpress.com/2016/05/31/higher-linear-algebra/



