
A good example of this is the natural numbers. Algebraists usually consider zero to be a natural number because otherwise the naturals under addition aren't a monoid, and set theorists want zero because it's the size of the empty set. My number theory textbook defined natural numbers as positive integers, but I'm not entirely sure why.
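For concreteness, here is the monoid point spelled out (standard definitions, nothing beyond what the comment assumes):

    A monoid is a set with an associative operation and an identity element e satisfying e*x = x*e = x for all x.
    With zero:    ({0, 1, 2, ...}, +) is a monoid, since 0 + n = n + 0 = n.
    Without zero: ({1, 2, 3, ...}, +) has no identity; no positive e satisfies e + n = n, so it's only a semigroup.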



> My number theory textbook defined natural numbers as positive integers, but I'm not entirely sure why.

Since both the inclusion and exclusion of zero are accepted definitions depending on who's asking, books usually just pick one or define two sets (commonly denoted N_0 and N_1). Different topics benefit from one convention over the other, and some also have to worry about things like division by zero. Number theory tends to exclude zero.
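Spelled out, with the subscripts read the way this comment uses them (conventions vary, so this is just one common reading):

    N_0 = {0, 1, 2, 3, ...}   (naturals including zero)
    N_1 = {1, 2, 3, ...}      (naturals excluding zero, i.e. the positive integers)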


> commonly denoted as N_0 and N_1

Oh my, it had never occurred to me that one could disagree, not just about whether the natural numbers include 0 or don't, but also about how to denote "natural numbers with 0" and "natural numbers without." Personally, I'm a fan of Z_{\ge 0} and Z_{> 0}, which are a little ugly but which any mathematician, regardless of their preferred conventions, can read and understand without further explanation.
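For anyone unused to that notation, it just abbreviates the obvious set-builder definitions:

    Z_{\ge 0} = {n in Z : n >= 0} = {0, 1, 2, ...}
    Z_{> 0}   = {n in Z : n >  0} = {1, 2, 3, ...}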


Yep, lots of ways to denote these sets. It’s not a disagreement but rather a preference (although certainly some folks will gladly disagree).


Number theory includes zero as the identity element for addition, much as 1 is the identity element for multiplication.

I am totally assuming you knew this already.
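Spelled out, the two identity laws in question:

    0 + n = n + 0 = n   for every n   (additive identity)
    1 * n = n * 1 = n   for every n   (multiplicative identity)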


For the sake of making an easy transition to the monoid, yes. Personally a fan.


