For a long time now (since before Y2K) I've been using YYYYMMDD (zero-padded) as an internal text format for dates, in places where it's more important, for example, to relate to human-readable file formats than to be efficient with data storage.
It has the advantage that it can be text-sorted by many of the simpler tools I might have to use externally, and I can do things like put DD="00" for "start of month" and DD>"31" for the end, and similarly with months, where certain information is indefinite.
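A minimal sketch of that sorting property in Python (the sentinel values shown are just my illustration of the idea):

```python
# Zero-padded YYYYMMDD strings sort correctly as plain text,
# sentinels included.
dates = [
    "20240715",
    "20240132",  # DD>"31": "end of January"
    "19991231",
    "20240100",  # DD="00": "start of January"
    "20240101",
]
# Plain lexicographic sort; no date parsing needed, so external tools
# like sort(1) or a spreadsheet column sort behave the same way.
print(sorted(dates))
# ['19991231', '20240100', '20240101', '20240132', '20240715']
```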
Of course, I always need to convert to/from the current system/programming language's implementation of dates, and be aware of the era-start (e.g. 1900) and magnitude (is 1 unit a day, or a second?) of the system/language-supplied data, etc., but apart from being a bit blind to BC dates and having a Y10K problem, I think it works nicely. It can also be appended with "-hhmm[ss[decimals]]" for a full date-time of the appropriate resolution (still sortable).
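A sketch of that round-trip, assuming Python's datetime as the "system" representation (the helper names are mine, and this version only covers the whole-second resolution):

```python
from datetime import datetime

def to_text(dt: datetime, with_time: bool = False) -> str:
    """Render as YYYYMMDD, or YYYYMMDD-hhmmss when time is wanted."""
    return dt.strftime("%Y%m%d-%H%M%S" if with_time else "%Y%m%d")

def from_text(s: str) -> datetime:
    """Parse YYYYMMDD or YYYYMMDD-hhmmss back into a datetime."""
    return datetime.strptime(s, "%Y%m%d-%H%M%S" if "-" in s else "%Y%m%d")

stamp = datetime(2024, 7, 15, 9, 30, 0)
print(to_text(stamp))        # 20240715
print(to_text(stamp, True))  # 20240715-093000
assert from_text("20240715-093000") == stamp
```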
Again, not data-efficient, but it can be squashed into optimally utilised bits in a number of different ways, while still retaining the precision and range.
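One such squashing, sketched in Python; the field widths here are my own choice, not a fixed part of the format:

```python
# Pack the fields into a fixed-width integer, most significant first,
# so integer order still matches text order: 14 bits for the year
# (0-9999), 4 for the month and 6 for the day (room for the sentinel
# values above) = 24 bits, versus 64 for the eight ASCII characters.
def pack(yyyymmdd: str) -> int:
    y, m, d = int(yyyymmdd[:4]), int(yyyymmdd[4:6]), int(yyyymmdd[6:8])
    return (y << 10) | (m << 6) | d

def unpack(bits: int) -> str:
    return f"{bits >> 10:04d}{(bits >> 6) & 0xF:02d}{bits & 0x3F:02d}"

a, b = pack("20240100"), pack("20240715")
assert a < b                    # integer order matches the text order
assert unpack(a) == "20240100"  # and it round-trips exactly
```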
(If there's an extensive list, I tend to state a baseline value, in full precision, at the beginning of the list and then reduce the rest to offsets in a suitable variable bit-length form, biased towards efficiency for small differences.)
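A rough sketch of that baseline-plus-offsets idea, using zigzag-encoded, LEB128-style varints for the deltas (the specific encoding is my choice; any variable bit-length form biased towards small values would do):

```python
def zigzag(n: int) -> int:
    # Interleave signed values: 0, -1, 1, -2, 2 -> 0, 1, 2, 3, 4
    return n * 2 if n >= 0 else -n * 2 - 1

def unzigzag(u: int) -> int:
    return u // 2 if u % 2 == 0 else -(u + 1) // 2

def encode_list(values: list[int]) -> bytes:
    out = bytearray(values[0].to_bytes(4, "big"))  # baseline, full precision
    for prev, cur in zip(values, values[1:]):
        u = zigzag(cur - prev)
        while True:  # 7 payload bits per byte, high bit = "more follows"
            out.append((u & 0x7F) | (0x80 if u > 0x7F else 0))
            u >>= 7
            if not u:
                break
    return bytes(out)

def decode_list(data: bytes) -> list[int]:
    values = [int.from_bytes(data[:4], "big")]
    i, shift, u = 4, 0, 0
    while i < len(data):
        byte = data[i]; i += 1
        u |= (byte & 0x7F) << shift
        if byte & 0x80:
            shift += 7
        else:
            values.append(values[-1] + unzigzag(u))
            shift, u = 0, 0
    return values

# e.g. dates already reduced to day numbers since some era-start
days = [738521, 738522, 738525, 738540]
blob = encode_list(days)
assert decode_list(blob) == days
print(len(blob))  # 7: a 4-byte baseline plus one byte per small delta
```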
Not that any of this helps if I don't document the format for future users (or myself), of course. That was the problem with most of the Y2K coding dilemmas: the people trying to check/update the code virtually had to rewrite it all because it was too obscure to certify or amend. Well, from my perspective, it was.