Using spaced repetition systems to see through a piece of mathematics (2019) (cognitivemedium.com)
160 points by sebg on Oct 17, 2023 | 74 comments



If you want to see how a very simple notation can greatly simplify some math, check out J. H. Conway's proof of Morley's theorem.

Background: Morley's theorem is a non-trivial theorem in planar Euclidean geometry, stated in 1899 (the first proof appeared 15 years later). The proofs are not easy. One can use complicated trigonometric identities to prove it. Even the "simple" proofs are sometimes quite involved.

Conway introduced some notation and almost trivialized it. The notation he introduced was just a* := a + 60, where a is an angle in degrees. No one would believe this notation could do anything useful, but with it (and some other insight) Conway could explain the proof in just a few sentences! (One might think anyone who understands that the interior angles of a triangle always sum to 180° could come up with this simple proof, but that just didn't happen for 100 years until Conway revealed it.)

See pages 3-6 here: http://thewe.net/math/conway.pdf
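To get a taste of why the notation pays off (this is only the angle-sum arithmetic, not Conway's full construction), write the triangle's angles as 3a, 3b, 3c. Then

    3a + 3b + 3c = 180^\circ \;\Rightarrow\; a + b + c = 60^\circ
    \;\Rightarrow\; a + b^* + c^* = 180^\circ \quad \text{and} \quad a^{**} + b + c = 180^\circ

so triples like (a, b*, c*) and (a**, b, c) are themselves valid angle triples of triangles, which is what lets the proof assemble the figure out of small triangles around a central equilateral.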


Michael Nielsen (with Andy Matuschak) also wrote about how Hindu-Arabic numerals enabled more powerful kinds of thinking (compared with Roman numerals).

https://numinous.productions/ttft/#how-to-invent-hindu-arabi...

> ...the Hindu-Arabic numerals aren’t just an extraordinary piece of design. They’re also an extraordinary mathematical insight. They involve many non-obvious ideas, if all you know is Roman numerals. Perhaps most remarkably, the meaning of a numeral actually changes, depending on its position within a number. Also remarkable, consider that when we add the numbers 72 and 83 we at some point will likely use 2+3=5; similarly, when we add 27 and 38 we will also use 2+3=5, despite the fact that the meaning of 2 and 3 in the second sum is completely different than in the first sum. In modern user interface terms, the numerals have the same affordances, despite their meaning being very different in the two cases. We take this for granted, but this similarity in behavior is a consequence of deep facts about the number system: commutativity, associativity, and distributivity


Yep. Imagine doing multiplication in Roman numerals :)


I think that is partly why LLMs are bad at math and often fail at counting subsequences. Play with the tokenizer and you'll see that long numbers are split into groups of 2 or 3 digits.

https://huggingface.co/spaces/Xenova/the-tokenizer-playgroun...
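A quick way to see it yourself, assuming the Hugging Face transformers library is installed (GPT-2's BPE tokenizer here is just a stand-in for whichever model you care about):

    from transformers import AutoTokenizer

    tok = AutoTokenizer.from_pretrained("gpt2")
    for s in ["7", "1234", "1234567890", "3.14159265"]:
        print(s, "->", tok.tokenize(s))
    # Long digit strings come back as several multi-digit chunks rather than
    # one symbol per digit, so the model never "sees" the number whole.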


Imagine designing and building an aqueduct with Roman numerals.



Yes, but they didn't use them to write down their plans.


You use a counting board rather than paper: https://www.youtube.com/watch?v=UVB27kKAcRs&t=111

The main advantage Arabic numerals on paper have is that operations are non-destructive and you can restart a calculation if you lose your place. The main disadvantages are memorising the times table and the amount of scratch paper you need.


There are also many assumptions in the initial m^2 != 2n^2 proof that many would have a hard time grasping instantly. The problem is stated in three different ways, and it takes quite a bit of knowledge to see immediately that they are the same problem.

The proof stops at 2n-m and m-n, subtly, because that is another, smaller instance of the question of whether two squares can fit exactly into a bigger square. If you expand 2(m-n)^2 and (2n-m)^2 to check the fit, it isn't apparent until you use the assumption m^2 = 2n^2; the proof does not need to deal with that, and to me the reason why is very subtle and wordy.
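For reference, the algebra behind that step, using the assumption m^2 = 2n^2:

    (2n - m)^2 = 4n^2 - 4mn + m^2
    2(m - n)^2 = 2m^2 - 4mn + 2n^2
    (2n - m)^2 - 2(m - n)^2 = 2n^2 - m^2 = 0

and since n < m < 2n, the pair (2n - m, m - n) is a strictly smaller pair of the same form, which is exactly what the descent needs.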

Graph theory proofs feel very similar.


It's a wonderful proof, thanks for sharing, but the notation is really just syntactic sugar, no? It would have only been slightly clumsier to have "+60" everywhere. The real insight seems more to have done the construction "backwards", starting with the equilateral.


Syntactic sugar is powerful.

There was a time when China accepted calculus but rejected Leibniz's notation.

It looks like this: https://i.imgur.com/foUem6w.png

Edit: Technically this is still Leibniz notation.


The phrase "syntactic sugar" implies ease, but it's much more than mere "syntactic sugar" when the notation unlocks further information gain and cognitive advantage.

Cheers!


I like to keep all of my proofs very, very, very simple: three big steps, where each big step should be intuitively true. Sure, sometimes you should internalize the properties of your objects before you go on (big and useful theorems are just properties of the objects), but that's it.

I believe every natural problem has such a simple proof; that's why I use BFS rather than DFS when I try to solve a problem.


A similar thing also happens in the modern formulation of Stokes' theorem. Once you set up the machinery of differential forms, it becomes almost a triviality, which is just amazing.
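For reference, the whole generalized statement fits on one line: for a compact oriented smooth n-manifold M with boundary and a smooth (n-1)-form ω,

    \int_{\partial M} \omega \;=\; \int_{M} d\omega

and the classical Kelvin-Stokes theorem, the divergence theorem, and the fundamental theorem of calculus are all special cases of it.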


I am a strong believer in spaced repetition. My current primary focus is building a spaced repetition platform for learning poker solver solutions, with the aim of winning money at live poker games (www.livepokertheory.com). The motivation for the project is to work on an indie hacker project that funds itself, since I use my own tool to make money. My app is not based on Anki since, besides wanting to independently monetize my work, I also wanted to build a poker-centric UX for it.

Now, within the poker community there is an idea that “GTO robots” (players who memorize game-theory equilibrium solutions produced by solver software) are inferior to intuitive players who exploit their opponents' mistakes rather than play an equilibrium strategy, which is suboptimal against a non-optimal opponent. For example, a GTO strategy will bluff very aggressively, which is not good against opponents who never or rarely fold. GTO play is especially criticized in live games, where there is other information such as opponent “tells” (e.g. shaking hands when making bets).

But it's a false dichotomy: I believe studying equilibrium gives you a strong baseline understanding of the mechanics, and in live play you can still listen to your gut and make adjustments.

The OP talks about math and I’m talking about poker but I’ve seen this concept discussed in a few other places. Basically, some education systems way overindex on rote memorization. And just rote memorization is a really bad way to learn and understand since you miss high level reasoning, problem solving, and abstraction. But some rote memorization can be good as a foundation for abstraction. I’ve seen some studies suggest people who can do arithmetic and basic algebra more quickly can do higher level proofs more effectively. They are less “bogged down” with low level mechanics when doing higher level thinking. Musicians who want to creatively and improvisationally “jam out” might still drill scales to build that low level technique to unlock the high level creativity on top of it.

And within learning and memorization, spaced repetition works really well. It matches something in our brain chemistry: review mostly new material but occasionally revisit old material that might be fading; it's the right mix of novelty and preventing forgetting. Anki is heavily associated with flashcards for language-learning vocab and MCAT test prep, so it's nice to see others exploring its possibilities outside of that.
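If you've never looked under the hood, here is a minimal Python sketch of the kind of interval scheduling these tools descend from. It's a simplified SM-2-style update, not Anki's or FSRS's exact algorithm; the constants are the classic SM-2 defaults.

    # Simplified SM-2-style scheduling: each successful review pushes the next
    # review further out; a failed review resets the interval.
    def review(interval_days, ease, quality):
        """quality: 0-5 self-rating; returns (next_interval_days, new_ease)."""
        if quality < 3:                       # lapse: start the card over
            return 1, ease
        ease = max(1.3, ease + 0.1 - (5 - quality) * (0.08 + (5 - quality) * 0.02))
        if interval_days == 0:
            return 1, ease                    # first successful review
        if interval_days == 1:
            return 6, ease                    # second successful review
        return round(interval_days * ease), ease

    # A card answered with quality 4 five times in a row:
    interval, ease = 0, 2.5
    for _ in range(5):
        interval, ease = review(interval, ease, quality=4)
        print(interval)                       # 1, 6, 15, 38, 95 days -- the gaps grow fast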


“But some rote memorization can be good as a foundation for abstraction. I’ve seen some studies suggest people who can do arithmetic and basic algebra more quickly can do higher level proofs more effectively. They are less “bogged down” with low level mechanics when doing higher level thinking. Musicians who want to creatively and improvisationally “jam out” might still drill scales to build that low level technique to unlock the high level creativity on top of it.”

I personally find this to be true. I became significantly better at higher math and physics after I memorized and drilled on lower level concepts and algebraic/trigonometric manipulations.


This seems intuitively true but much more generally - not just for mathematics, and not just for spaced repetition. Our understanding of fundamental principles in any field is deepened by exposure to them generally in different contexts. For me (as a biologist/computer science/systems design person) the quintessential examples are evolution and entropy. Both are trivial in their essence, but infinitely rich in their implications.


Slightly tangential, but are there any Anki decks/CSVs of definitions, theorems, axioms, terms, etc. that start from basic arithmetic and get more advanced?

I have an idea that being intimately familiar with mathematical operations and ideas, as one would be with vocabulary and grammar in a language, would help one internalize math better.


With math, the hard part is not to remember, but to understand it in the first place. Once you understand it, remembering is easy (because understanding is placing things within the whole context of your mind), but just being able to flawlessly recite the definition will not help you understand it at all.

I think a lot of people just cannot come to terms with the fact that mathematics is inherently difficult, and being good at it is not just a matter of memorization. For example, I often see people complain about mathematicians using arcane notation, as if it was the notation that was preventing them from understanding what’s going on, when in reality it is just lack of understanding of underlying concepts and experience with arguments being put forward.


The two are not independent - it is not a dichotomy.

When I did undergrad-level engineering math, I pretty much never had to memorize. Just solve a lot of problems and it's ingrained in your head. It helped that I used that material all over my engineering and physics courses. Even now, over a decade since solving those problems, I can still recall most of it and use it if needed.

Once I got to upper and grad level math, the approach of simply understanding and solving problems failed me, because you are usually not provided enough problems compared to earlier courses. It may have been good enough for the course to get an A, but not good enough to retain the material beyond the course. I've studied complex variables (Cauchy Goursat, residues, etc) at least 3 times. It's always a breeze, and it's always forgotten soon after. The same with statistics. With the latter, what did finally help me was spaced repetition. Even though it's been a few years since I last reviewed/relearned statistics, I can still read some material involving it and understand it. This is entirely because of those flashcards.

Understanding definitely has to precede memorizing. Insisting that memorizing is a sign of poor understanding, however, is parochial thinking. It's simply not true for the majority of folks.


Statistics and complex variables aren't taught in a way that gives you understanding; they are taught plug-and-chug. That is why it feels like a breeze and then everyone forgets everything. Basically nobody gets a good understanding of those subjects from taking courses, so you need memorization techniques for them.

Calculus and linear algebra are good examples of courses taught in a way that gives understanding, lots of people understand those after taking the courses, so there you shouldn't use memorization techniques.

So the reason you needed spaced repetition for statistics was that you never understood statistics, you just memorized it, like basically everyone else. It isn't wrong to do that btw, statistics is useful even when you just memorize it which is why it is taught that way, but don't trick yourself into thinking that you really understand the material the same way you did calculus or linear algebra.


Sorry, but no. Complex variables was taught just like calculus was, albeit with a slightly higher focus on proofs.

As was statistics. While this wasn't a measure theoretical approach, it was akin to how calculus was taught.


Much of what I know beyond the basic mathematics taught in elementary school I've learned on my own, and accepting this basic difficulty and learning to go slowly has been an absolute boon. I've simply accepted the fact that I might need to spend almost two weeks on a single proof or even definition, continually returning to it or ingesting it in different ways until I am ready to move on. I'm able to read a typical piece of somewhat difficult literature (fiction or non-fiction) in about two weeks, so it was certainly an adjustment, but as soon as I got used to this I found that my actual understanding of the concepts had increased tenfold.

I think part of the problem is that mathematics is so precise that subsequent understanding is tightly coupled to a comprehension of prerequisites. In so-called "softer" disciplines, the concepts are less precisely delineated, and their relationships are fuzzier. If you don't quite understand some prerequisite concept, you can sort of muddle your way through even the later notions that depend on it. Not so for mathematics. If you fail to get a complete and precise idea of the basics you'll eventually face complete doom or confusion when working through the subsequent portions of a theory.


> I've simply accepted the fact that I might need to spend almost two weeks on a single proof or even definition, continually returning to it or ingesting it in different ways until I am ready to move on.

This is exactly the correct attitude. There is no royal road to mathematics, it requires a lot of careful thinking to put everything together in your head. When I was studying mathematics seriously, getting through 4 pages of a textbook in an hour was a really good pace, but usually I only managed half or quarter that.

> I think part of the problem is that mathematics is so precise that subsequent understanding is tightly coupled to a comprehension of prerequisites.

100% agreed with this and the rest of your comment. I often see people pick up a research paper, try to read it with very little understanding of the prerequisite concepts it is built upon, try to look these up, which only uncovers prerequisites of the prerequisites they need to understand first, then despair, give up, and blame arcane notation, probably because the alternative is just too humbling to contemplate.


"One book opens another" - supposedly an adage from alchemical literature


The difficulty in understanding a proof, I think, is partly because whoever wrote the proof omitted many "trivial" steps in the proof. But they were trivial to the person who wrote it, not to whoever is reading it.

To understand the proof you have to accumulate enough mathematical knowledge that the trivial, omitted steps in it become trivial to you as well.

This also has something to do with the fact that to understand a proof, it cannot be longer than what fits into your memory. That is why proofs omit the trivial steps: that way mathematicians can understand the proof both in terms of remembering its outline and in terms of seeing how each step follows from the previous ones.


> whoever wrote the proof omitted many "trivial" steps in the proof

This is not an issue specific to mathematics. Listen to a couple of surgeons talking shop in a hospital cafeteria. They're going to be using all kinds of arcane anatomical and medical terms the average person would have no hope of understanding.

Same goes for a pair of programmers talking through a complex bug deep within a massive codebase.

Economy of language is critical for communication among experts in a field. If everyone was forced to use the vocabulary of a 6th grader there'd be a whole hell of a lot of repeated explanations of concepts the other person already knows and it would just be frustrating.


True. I wonder if there is a way to study the hierarchies of concepts needed to explain higher-level concepts in any given field of study. That is not the same as the "most often used" concepts, but really the "most often assumed to be understood" concepts.

Such an exposition might help students, if there were such an analysis of their field of study: what are the most useful concepts, including proofs and facts, to know and understand in order to learn more "higher level" concepts?


I don’t know about other fields of study, but such an exposition exists for math [1]. I heard about this book on the YouTube channel The Math Sorcerer [2].

[1] https://www.amazon.ca/All-Math-You-Missed-Graduate/dp/100900...

[2] https://youtu.be/ur0UGCL6RWc


I feel my comment was slightly misunderstood. I don't mean to say that memorization alone is the path to mathematical understanding.

Besides brushing up on my math, starting from simple arithmetic, in my spare time I also study Japanese. And one of the things that has helped my fluency and understanding the most has been the memorization of vocabulary and of grammar patterns and their usage.

Of course, I read materials at my own level, listen to material at my level and above, and practice writing. However, I noticed the biggest boosts in my comprehension after memorizing a large number of words or really internalizing grammar patterns. And I do this mainly through flashcards.

I have to spend much less mental energy to catch on to what is being expressed, which frees me up to potentially comprehend more.

And analogously to my language studies, I would like to approach math in a similar fashion.


What you are saying makes perfect sense in the context of language learning. I also found spaced repetition, Anki etc to be extremely effective for that purpose.

My point here is rather that this approach will not work with mathematics, simply because unlike language learning, which is mostly about acquiring and memorizing a large number of simple X-to-Y mappings, mathematics has much less to do with memorization and more to do with building a mental framework and placing new knowledge in the appropriate places within it.


Understanding is only step 1. Step 2, after understanding, is to practice until it becomes automatic and to see it from different viewpoints; only then do you truly know it. That is what the author of the original article is talking about too.


>Once you understand it, remembering is easy (because understanding is placing things within the whole context of your mind), but just being able to flawlessly recite the definition will not help you understand it at all.

I found this less and less true as the math I did got harder, hitting a breaking point when I took abstract algebra. The course just didn't make sense at all until I pulled a perfect score on my final by whipping out the Anki decks and drilling the theorem proofs and homework problems from class on a schedule. I would go right back to that method if I ever returned to mathematics for my master's degree.


If you just memorized the proofs, but cannot actually recreate them, it means that you did not actually learn it, and the perfect score on the exam doesn’t matter. The point of learning mathematics is to be able to transfer this skill into new domains, not to just regurgitate it.


Not necessarily. Remembering all of the multiple representations of the beta function, for example, is probably aided through the use of flash cards. You can still use such representations without necessarily having to go through and derive them from scratch, whilst still understanding what the beta function is. Ditto for the many trig identities.

Similarly, there are often underlying assumptions that can be tricky to remember in the moment, e.g. certain log laws only holding for the absolute value of the argument. There's a combination of both understanding a tool to begin with, and remembering various equivalences, representations, and underlying assumptions that makes math difficult.
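For concreteness, the sort of standard facts meant here: the beta function's integral and gamma-function representations, and the log rule that needs the absolute value,

    B(x, y) \;=\; \int_0^1 t^{x-1} (1-t)^{y-1} \, dt \;=\; \frac{\Gamma(x)\,\Gamma(y)}{\Gamma(x+y)} \qquad (\operatorname{Re} x, \operatorname{Re} y > 0)

    \int \frac{dx}{x} \;=\; \ln|x| + C \qquad (x \neq 0)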

Part of memorizing proofs is also just increasing your exposure to certain ideas, because maths is one of those subjects where there's no real substitute for time spent thinking about something (i.e. mathematical maturity).


>If you just memorized the proofs, but cannot actually recreate them, it means that you did not actually learn it, and the perfect score on the exam doesn’t matter.

I wasn't tested on the exact proofs and problems I memorized. I was tested on variations and novel combinations of them that I hadn't seen before. That sure sounds like learning to me.

Further, you can't just memorize a proof straight through; there isn't enough space in your brain for that. Rather, the act of mentally walking through the theorem over and over via an Anki flashcard prompt will eventually just... change your logic, invisibly, to be correct. Which, again, sounds a lot like learning to me.

>The point of learning mathematics is to be able to transfer this skill into new domains, not to just regurgitate it.

I am far more confident in both my intuition and conscious reasoning around e.g. Abelian groups or the enumerative combinatorics applications of group actions than whatever I learned in real analysis, where I studied in the "usual" way. Indeed going back to learn Haskell a few years after that AA course was much easier than earlier attempts because I had a considerably stronger background in what kinds of things to look for in that ___domain.

But more importantly homework problems are rigged [1] and transfer learning is close to non-existent in every ___domain we've seriously looked at [2], so this is awfully close to moving the goalposts on what "really learning" something is by setting an unreasonably high bar to start with. Math certainly can transfer to new domains, but I would never call that "the point" of math, and that's also a totally different endeavor to be performed in addition to learning the math itself.

[1] https://www.johndcook.com/blog/2023/10/12/homework-problems-...

[2] https://www.econlib.org/archives/2012/08/low_transfer_of.htm...


> But more importantly homework problems are rigged [1] and transfer learning is close to non-existent in every ___domain we've seriously looked at [2],

Indeed it is, but that's because most people just learn to pass exams by redoing the same exact problems with different inputs! Bringing up this fact does not support your argument in favor of memorization; rather, it is closer to my view, which is that most of schooling is just cargo-culting education: people memorize the exam problems, pass, move on, and forget. What's the point of the whole thing in the first place if you can't transfer?


What I'm actually getting at there is that your standards are inhumane. Transfer is pretty poor across the board in pedagogical studies and we don't know how to reliably get more of it. Indeed it's a tough thing to even rigorously define, since it's basically creativity finetuned on crystallized intelligence. You might get more of it out of people by massively upping the difficulty and number of homework problems. That's a huge cost to put people through, especially when a significant proportion of them really do just want to study the thing for its own sake, and couldn't care less about something as nebulous as "transfer". I don't need my knowledge of Kan extensions to inform how I play tennis.


Understanding is remembering, remembering is understanding.

I seriously don't get why people keep separating the two.


Because machines can remember without understanding and do just fine solving just about any problem you might find in undergrad math? A calculator can find the square root of 7 without knowing what a square root is, or what a square is, or what a number is.

People who have a knack for memorizing long lists of arbitrary if-then tables can excel in rote mathematics (up to say multivariable calculus) without needing a philosophically deep understanding of what's going on, for the same reasons.


What does it mean to "understand" the square root?


I can't speak for everyone because I don't know that there is a universal axiomatic understanding, but one way to "understand" finding the root of a given quantity would be that you are peeling off a dimension from a base unit (of, say, area) to arrive at a lower dimensional base unit in the same numbering system.

Another aspect of understanding is _why_ you are doing this, where does it fit into the programme of necessary compression of infinite information density (i.e. the number line is infinitely "dense") so it may be accessible despite the material confines of a human brain and its limited, discretizing capacity for dealing with multiple elements in a single operation. So, philosophically, different lower dimensional spaces integrate to form higher dimensional spaces, in order to facilitate the description of change from one thing into another thing. One linear dimension changing into another linear dimension requires a transition through a quadratic space, from which we get a curve.


You connect it to your intuition. The square root is the side of a square with that area. You can memorize that description, which doesn't help. Or you can connect it to your intuitive understanding so that your intuition grasps it; then we say you understand.

A good example is velocity. Many people who passed math classes can't answer "how long does it take to drive 80 miles if you are going 80 miles an hour?" Such people never understood velocities; they just memorized some rules. Memorizing rules won't help you solve that question well; you need to make your intuition understand the relationship between velocities and distances.

As you learn more you start to build a net of intuitive connections between all the things, that should be your goal when learning these things. That net will last you a lifetime. Word based or symbol based memorization is mostly worthless in comparison, doesn't help you apply it to other subjects and takes more effort to build.


> Such people never understood velocities, they just memorized some rules.

These are the ones who always express a special dislike for word problems.


You did not answer the question


Because they're different things? There's obviously interplay, but from my experience...

I have only vague recall of most CLI things, but I can get up to speed again very quickly (when a situation demands it) because I've outsourced remembering arcane command line options to "man" and just need a system with docs installed and remember(!) that there are commands called "cat", "ls", "awk", etc.

A similar thing applies with math... the analogy strains a bit, but "cat", "ls", "awk", etc. are the 'understanding' which underpins everything else. I did a recent thing with force calculations and symmetry which apparently impressed some of my engineer friends' colleagues... but the details are unimportant. Just knowing what a sin/cos curve looks like and what an integral fundamentally is got me there... but the fundamentals were enough to get there, is the point :)


Can you demonstrate how to understand and remember word genders in gendered languages, for instance? A table is masculine in one (German) and feminine in another (French).


Language learning is a perfect example indeed.

You memorize so much stuff that at a certain point the language "clicks" and you can infer meaning and rules even if you don't know them yet.

Remembering is understanding, understanding is remembering.


My mathematics anki cards are here: https://ankiweb.net/shared/info/1715995729?cb=1697579387129

They are organized by category with a focus on fundamentals.


Thank you. I use flashcards to study for Japanese and I'd like to use flashcards to study for math. Thanks again


Here's a different proof of the result, just thinking of M, M* as operators (not as matrices). The invariant way of saying 'diagonalizable by a unitary matrix' is that we have to find an orthonormal basis of eigenvectors. Over the complex numbers we can always find one eigenvalue λ. The strategy is to choose one eigenvector v and show that its orthogonal complement is also preserved by the action of M. This complement, being of lower dimension, already has an orthonormal basis of eigenvectors by induction, and this basis extended with v is the required basis. This strategy works in a straightforward way if we assume M is Hermitian, M* = M (a symmetric matrix over the reals): <v,w>=0 ⇒ <v,Mw>=<M*v,w>=<Mv,w>=0. So M preserves the complement and the matrix is diagonalizable. This is the spectral theorem.

For a normal matrix, we have to choose the vector v which is simultaneously an eigenvector for M and M*. Then the proof still works as <v,w>=0 implies <v, Mw> = <M*v,w> = λ'<v,w>=0 where λ' is the eigenvalue of v with respect to M*.

We can find such a shared eigenvector v because M and M* commute, which implies that M* preserves the eigenspaces of M.
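Not a proof, but a quick numerical sanity check of the statement, assuming numpy and scipy: for a normal M, the complex Schur form (which always exists with a unitary factor) comes out diagonal, i.e. the unitary factor is an orthonormal eigenbasis.

    import numpy as np
    from scipy.linalg import circulant, schur

    rng = np.random.default_rng(0)

    # Circulant matrices are a handy source of normal-but-not-Hermitian matrices.
    c = rng.standard_normal(5) + 1j * rng.standard_normal(5)
    M = circulant(c)

    assert np.allclose(M @ M.conj().T, M.conj().T @ M)    # M is normal
    assert not np.allclose(M, M.conj().T)                 # ...but not Hermitian

    # Schur: M = Z T Z* with Z unitary and T upper triangular -- always available.
    # For a normal M, T comes out (numerically) diagonal, so the columns of Z are
    # an orthonormal basis of eigenvectors: the content of the theorem.
    T, Z = schur(M, output='complex')
    print(np.max(np.abs(T - np.diag(np.diag(T)))))        # tiny, ~1e-15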


The chunking concept really resonates with me. I have been a speedcuber for years (solving Rubik's Cubes really fast) and I can remember Rubik's Cube solves without much effort. If you gave me the scramble of my personal best solve, I could reproduce the same solution.

In some sense it gives me hope that I can also reach the same level of proficiency in more productive areas.


Can you explain for those of us who aren't speedcubers what the scramble is? Is it what it sounds like - a number of pseudo-random twists to undo the solve?


To make an "official" scramble, they first create a random state, which basically means a random permutation and random orientation of the pieces. Then they find a sequence of twists that produces that state.

By contrast, applying random twists will not give you a truly random state: some states are more likely than others to be reached that way.
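A toy Python illustration of that bias (a tiny permutation puzzle, not the cube itself, so treat it only as an analogy): a fixed number of random moves does not visit every reachable state equally often.

    import random
    from collections import Counter

    def rot(p):    # cycle the three pieces
        return (p[1], p[2], p[0])

    def swap(p):   # swap the first two pieces
        return (p[1], p[0], p[2])

    def random_moves(k):
        state = (0, 1, 2)
        for _ in range(k):
            state = random.choice((rot, swap))(state)
        return state

    print(Counter(random_moves(3) for _ in range(60_000)))
    # The six possible states are not equally likely (roughly a 2:1 spread here),
    # which is why official scrambles sample a uniformly random state and then
    # compute a move sequence that reaches it.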


Have you tried using SRS for anything productive and failed or just haven't gotten into it yet?


A while back I used Anki to learn a bunch of French vocab. At the time it was great and I learned a lot. I eventually stopped and I've probably forgotten most of it.


Michael Nielsen and Andy Matuschak wrote Quantum Country (https://quantum.country/), an introduction to quantum mechanics, with integrated spaced repetition questions. Andy develops Orbit (https://withorbit.com/), allowing other authors to do the same.

I agree with Michael that a lot of the value comes from formulating your own questions based on the source material, but I'm also curious to experiment with author provided questions.


In my humble experience, you gain a LOT from writing the questions yourself, a point I'm not sure the author stresses enough.

There's also the problem of cards that are difficult to memorize, which is something you don't know in advance, and that's a good thing! It shows you where you should pay more attention, write cards in different forms, etc.


I highly recommend Andy Matuschak’s guide on how to create good prompts for anyone looking to get serious about spaced repetition: https://andymatuschak.org/prompts/

Andy and Michael Nielsen have collaborated in the past on things like Quantum Country, as others have noted.


I'm curious if anyone here has experimented with Anki in collaboration?

As the author of this post notes, and I concur, others might not come up with his questions. I read the discussion he presented and thought to myself, I doubt I would have created the set of cards he came up with because my matrix memory is weak.

I've had great experiences with pair programming amplifying my learning of software development. It feels like this could work with anything, Anki included.


Not really. There are options for sharing cards on Anki https://www.reddit.com/r/Anki/comments/14j2jfy/deck_sharing_... but their collaboration features are limited.

I myself am building an Anki clone https://github.com/AlexErrant/Pentive with collaboration built in as a first-class citizen, though it's far from primetime. Currently stewing on how to get the SR algorithm, FSRS, to compile to wasm.


In case you are looking for the essay on the = sign, this is what Google Bard said:

The reference to the essay by Andrey Kolmogorov in the paragraph you provided is:

Kolmogorov, A. N. (1930). О понятии числа [On the concept of number]. Matematicheskiy sbornik, 38(3), 3-19.

This essay is in Russian, but an English translation is available in the book "Selected works of A. N. Kolmogorov, Volume I: Mathematics and mechanics" (1991).


Do you think it's true? I got my hands on the PDF and see no trace of it in the table of contents or via text search.


Could it be the author isn't Kolmogorov, but Kronecker?


A good question. I've found Bard to be quite good at this type of question in the past but maybe it has hallucinated the answer!


  > Perhaps it is the article On Sense and Reference by Gottlob Frege?
  >
  > http://www.scu.edu.tw/philos/98class/Peng/05.pdf
from previous discussion https://news.ycombinator.com/item?id=18895613


ChatGPT thinks it might be Paul Lockhart's essay, titled "A Mathematician's Lament".


> Kolmogorov discussed this in loving detail, and made many beautiful points along the way, e.g., that the invention of the equals sign helped make possible notions such as equations (and algebraic manipulations of equations).

And then programmers came along and mangled the mathematical notion of the equals sign with the imperative notion of assignment. And here we are. Using = for assignment when we ought to have been using : instead, if we were to rely on people’s intuitions from reading and writing.

Which of course is difficult now that : has been co-opted by type systems.
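Python, incidentally, uses all of these roles at once, which gives a feel for how tangled the symbols have become (just an illustration of the point, not a recommendation):

    x: int = 3                 # ':' introduces a type annotation, '=' is assignment
    while (n := x - 1) > 0:    # ':=' (the "walrus") is assignment inside an expression
        x = n                  # plain assignment again
    print(x == 1)              # '==' is the equality test; prints True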


In Smalltalk '=' means equality, and ':=' means assignment.


They realized so many things!


Tangential, but I think people tend to underestimate just how important and deep mathematical fundamentals are in general. The relatively "simple" notions on which we build all other mathematical concepts, such as equality, relation, function, symmetry, associativity etc. actually have incredible depth to them. Too often, I think teachers and texts gloss over these notions instead of plumbing their philosophical depth and implications (statements like "the proof falls out automatically" or "the proof is obvious" or "the proof is a natural consequence" are a real disservice to learners). Both set theory and category theory are illustrations of just how far you can go with a few of these ideas (category theory, for instance, is basically entirely built up around the notion of a function and only two other ideas, identity and associativity—yet these three "simple" ideas alone are enough to construct a theory rich enough to model all the other branches of mathematics!)

Understanding mathematics is really understanding how to start approaching metamathematics. What does it mean to distinguish one class of things from another? What does it mean to say something has a property? That a property is reflected or preserved? These are the sorts of questions that are fruitful. It makes sense that the spaced repetition approach can help because it might necessarily force you to continue thinking about (and thus questioning) a concept you'd otherwise have taken for granted or not plumbed to sufficient depth. Too often, we fail to recognize how complicated "simple" things really are.


> Too often, I think teachers and texts gloss over these notions instead of plumbing their philosophical depth and implications

I have a few questions:

What level of teacher?

Are you a maths teacher/teaching academic?

My understanding of Maths education in the west is that they essentially spend the whole time attempting to get children to the point where they can do precalc and that any attempt at reform is eventually defeated by conservative engineers running a concerted political campaign to convince the voting public that set theory and category theory are a leftist conspiracy.

My experience of university mathematics was that set theory was assumed knowledge (fair, it has managed to stay in the curriculum), same with category theory (lol).

I would say that if it weren’t for the rise of the computer and the prevalence of relational databases in the late 20th C they would have managed to cut all non-precalc out entirely.


Many people lack basic "numerical literacy". They cannot understand different magnitudes, percentages, probability and most importantly LOGIC.

If you only understand 0, 1, 2 and MANY, you can be led astray by all kinds of politics and advertising.

Gerrymandering, say: I think we should teach people how that works, and leave Category Theory to those who are truly interested in math for math's sake.


For people who don't want to follow the author down the unnecessarily loquacious dive into matrices etc., but who, like me, are interested in techniques for self-learning, here is a wiki article on what spaced repetition is: https://en.wikipedia.org/wiki/Spaced_repetition

He said he uses this application to do spaced-repetition learning https://apps.ankiweb.net/




Search: