This seems to only really work in languages that allow null variables/timestamps. I wouldn't really want to have to compare against the default value of a timestamp.
The author is talking about databases, not programming languages.
I do see a different issue, though: the article indeed seems to make no distinction between an absent value and a default timestamp of 0. That limits your database to more or less "now"; you cannot really store things about the past. Someone might take such a pattern and bake it into some kind of library. If someone else then tries to store data from 1970, things can get... interesting.
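For concreteness, a minimal sketch of that failure mode (MariaDB-style SQL; the table and column names are made up): once 0 is overloaded to mean "no value", a genuine event at the epoch becomes unrepresentable.

```sql
-- Hypothetical schema using epoch seconds, with 0 as the "never happened" sentinel.
CREATE TABLE documents (
    id          INT UNSIGNED PRIMARY KEY,
    archived_at BIGINT NOT NULL DEFAULT 0   -- unix seconds; 0 = "never archived"
);

-- Every query has to compare against the sentinel...
SELECT id FROM documents WHERE archived_at <> 0;

-- ...and a document genuinely archived at 1970-01-01T00:00:00Z
-- is indistinguishable from one that was never archived at all.
INSERT INTO documents (id, archived_at) VALUES (1, 0);
```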
That does assume your database only allows unix-style timestamps. MariaDB, for example, has "datetime", which supports dates between the years 1000 and 9999 and is distinct from null/zero.
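A rough sketch of what that looks like (again MariaDB-flavoured SQL, with made-up names): a nullable DATETIME keeps "absent" and "a real date, possibly before 1970" cleanly separate.

```sql
-- NULL means "no value"; any date in the DATETIME range is a real date.
CREATE TABLE events (
    id          INT UNSIGNED PRIMARY KEY,
    occurred_at DATETIME NULL DEFAULT NULL
);

INSERT INTO events (id, occurred_at) VALUES
    (1, NULL),                    -- genuinely absent
    (2, '1970-01-01 00:00:00'),   -- a real event exactly at the epoch
    (3, '1969-07-20 20:17:00');   -- pre-epoch dates work too

SELECT id FROM events WHERE occurred_at IS NULL;  -- returns only row 1
```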
Unfortunately, datetime takes 8 bytes vs the 4 for a timestamp.
Assuming the timestamp represents a change of state in a contemporary application, I would expect 1970-01-01 00:00:00Z (Unix epoch + 0 seconds) to be unambiguous enough. (But that's definitely an engineering constraint to stay aware of.)