The author is talking about databases, not programming languages.
I do see a different issue, though: the article indeed seems to make no distinction between an absent value and a default timestamp of 0. That effectively limits your database to "now" onwards; you cannot really store things about the past, because a genuine epoch value (1970-01-01) becomes indistinguishable from "missing". Someone might take such a pattern and bake it into some kind of library. If someone else then tries to store data from 1970, things can get ... interesting.
That does assume your database only offers Unix-style timestamps. MariaDB, for example, has DATETIME, which supports dates from the year 1000 through 9999 and keeps NULL (and the zero date) distinct from any real value.
Unfortunately, DATETIME takes 8 bytes vs the 4 for a TIMESTAMP.
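To make the NULL-vs-zero point concrete, here is a minimal sketch in MariaDB-style SQL (the events table and column names are made up for illustration): a nullable DATETIME keeps "no value" separate from a genuine 1970-01-01, while an epoch-style column defaulting to 0 conflates the two.

    -- Minimal sketch, MariaDB-style SQL; table and column names are hypothetical.
    CREATE TABLE events (
        id          INT UNSIGNED AUTO_INCREMENT PRIMARY KEY,
        epoch_style INT UNSIGNED NOT NULL DEFAULT 0,  -- 0: missing, or 1970-01-01?
        occurred_at DATETIME NULL DEFAULT NULL        -- NULL unambiguously means absent
    );

    -- A genuine event at the Unix epoch:
    INSERT INTO events (epoch_style, occurred_at)
    VALUES (0, '1970-01-01 00:00:00');

    -- The epoch-style column cannot tell this row apart from "no timestamp at all":
    SELECT id,
           epoch_style = 0     AS looks_missing,  -- 1 for both cases
           occurred_at IS NULL AS really_missing  -- 0 here, 1 only when truly absent
    FROM events;

The trade-off, as noted above, is storage: the DATETIME column costs more bytes per row than a 4-byte timestamp.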