Of Millennium Bugs and Millennium Myths

“We have uniformly rejected all letters and declined all discussion upon the question of when the present century ends, as it is one of the most absurd that can engage the public attention, and we are astonished to find it has been the subject of so much dispute, since it appears plain. The present century will not terminate till January 1, 1801, unless it can be made out that 99 are 100… It is a silly, childish discussion, and only exposes the want of brains of those who maintain a contrary opinion to that we have stated”

The Times (London, Morning Edition, December 26, 1799)

It has been just so in all my inventions. The first step is an intuition – and comes with a burst, then difficulties arise. This thing gives out and then that – “bugs” – as such little faults and difficulties are called – show themselves and months of anxious watching, study and labor are requisite before commercial success – or failure – is certainly reached.

Thomas Alva Edison (Letter to Theodore Puskas, November 18, 1878)

Introduction

As everybody probably knows, the “Y2K” problem is what happens in poorly designed (or very, very old) systems that use only the last two digits of the year in their date logic, and which therefore fail in certain calculations involving years after 1999. I don’t have a problem with “The Year 2000 Problem”, and I am not going to add my voice to the countless speculations about how much it will cost to fix, the never-ending lists of things that contain one or more silicon chips (even when they use no date functions at all), or the frightening predictions of the catastrophes that will result from our dependence upon computers. Being a bug fixer by profession, I would however like to uncover some of the myths which increasingly surround this issue. Some of them are so common and so misleading that they could proudly top the list of this millennium’s urban legends.
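
To make the mechanics concrete, here is a deliberately simplified C sketch of my own (hypothetical, not taken from any real system) of the kind of two-digit-year arithmetic that breaks: an age obtained by subtracting two-digit years is fine right up until the year rolls over from 99 to 00.

    #include <stdio.h>

    /* A deliberately simplified, hypothetical example (not taken from any
     * real system): ages are computed by subtracting two-digit years,
     * which works until the year rolls over from 99 to 00. */
    int age_in_years(int birth_yy, int current_yy)
    {
        return current_yy - birth_yy;   /* no century information at all */
    }

    int main(void)
    {
        /* Someone born in 1960, checked in 1999 and again in 2000
         * (the years are stored as 60, 99 and 00 respectively). */
        printf("Age in 1999: %d\n", age_in_years(60, 99)); /*  39: correct */
        printf("Age in 2000: %d\n", age_in_years(60, 0));  /* -60: the bug */
        return 0;
    }

Real systems fail in less obvious ways, of course, but the missing century is always the root cause.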

Not a Millennium Bug

Since this issue is mostly related to the use of two digits to indicate the year, it occurs every 100 (not 1000) years. This means that the “millennium bug” is actually a… century bug.

Good Excuse for a Party

There is no doubt that the year 2000 is related to this issue, and that there will be some big parties at midnight on December 31, 1999. But that date marks only the first day of the Y2K problem, not the beginning of a new millennium.

Just to clarify: we are talking about the system used to number years as per the “Common Era”. I prefer not to relate this to any particular religion, and those interested in these matters probably know that, based on the literature available today, Dionysius Exiguus was off by several years when he made his calculations, so there appears to be no particularly well-known religious or historical event that occurred seven days before the date 0001-01-01 as expressed in the ISO 8601 standard. Which brings us to an aspect that appears to confuse many: the first year was year 1, not year 0. And the year before that was year “-1” (or year 1 Before the Common Era). All history books that use Common Era dates report events in this way, and introducing a year 0 would require changing all dates before 0001-01-01 with respect to how they are documented now. This is unlike how we refer to age: when a baby is three months old, that’s zero (not one) years and three months. Months, days, centuries and millennia also start counting from 1, and we consider that normal, don’t we?

It should therefore not be difficult to accept that the first millennium of this calendar system began on January 1st of the year 1, and ended on December 31 of the year 1000. There was no “year 0”, just as there was no “century 0”, which is why we refer to the years 1901-2000 as part of the 20th century, and not the 19th century; a convention that has never been the object of any doubt or discussion. Accordingly, the third millennium (not the second) will start on January 1st, 2001 (not 2000), which is also the first day of the twenty-first century. By this definition, the millennium bug is neither a millennium bug nor a century bug, as it is not related to the beginning of any century or millennium, unless we want to consider the end of any arbitrary range of 100 or 1000 years a good excuse to party.
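
Put as arithmetic, with a tiny illustrative C helper of my own (not code anybody’s system depends on): since there is no year 0, century N covers the years 100×(N-1)+1 through 100×N, so the century a year belongs to is simply the year divided by 100, rounded up.

    #include <stdio.h>

    /* Illustrative helper only: with no year 0, century N covers the
     * years 100*(N-1)+1 through 100*N, so the century of a (positive)
     * year is the year divided by 100, rounded up. */
    int century_of(int year)
    {
        return (year + 99) / 100;
    }

    int main(void)
    {
        printf("1900 is in century %d\n", century_of(1900)); /* 19 */
        printf("1901 is in century %d\n", century_of(1901)); /* 20 */
        printf("2000 is in century %d\n", century_of(2000)); /* 20 */
        printf("2001 is in century %d\n", century_of(2001)); /* 21 */
        return 0;
    }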

Actually, I am told that 100 years ago they had two big parties: one at the end of 1899, and one at the end of 1900! In both cases people celebrated the beginning of the… 20th century.

Bug of the Millenium

“Millenium” written with one “n” is probably one of the spelling bugs of the millennium. It is neither English nor Latin. With an accent, it could be French. Yet if you do a site search for “millenium” on sites like microsoft.com, the hundreds of pages your favorite search engine will find are in… English.

Nothing New

Just as the use of the term “bug” for a technical problem did not originate with the moth that was indeed found inside the Mark II computer in 1947 (both “bug” and “debug” were in use long before that, as several dictionaries and writings confirm), the “millennium bug” has little to do with computers and programmers.

If I look around me, I can see dozens of non-computing examples of this issue, but I cannot remember ever having read about how much it costs, each decade, century or millennium, to fix them. As I write this, I have on my desk a booklet of blank forms, first copy white, second copy pink, from my grandfather’s company, in which the date field was introduced by a pre-printed “195_”. In 1960 the forms had to be converted to notepads, and they are still in use almost 50 years later. A few days ago I read in a local newspaper about a funeral parlor that still has hundreds of gravestones with “19__” pre-sculpted. My grandfather converted his forms, and IBM did the same with its inventory of punched cards after magnetic media became the preferred medium, but I wonder what will happen next year to those stones… Will they put them in their office, next to the telephone, to quickly engrave some notes?

Looking back at old writings, even from several centuries ago, one will notice that years were often written with just two digits. Using two digits instead of three, four, five, or however many are necessary to fully express a date not only saves bytes of computer storage (by the way: using two digits instead of four saves one byte, not two, since a byte can store two decimal digits) or two precious columns on a punched card, but is also simply more convenient to write. The use of two digits instead of four has always been common even in printed material, where the advantages are considerably more limited, measured in very small amounts of ink and paper, since the work is done by a machine. This would seem to be a strong confirmation that the issue is much more one of general culture and habits than a specifically computing choice.
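
The one-byte remark refers to packed (binary-coded) decimal storage, where each byte holds two decimal digits, one per half-byte. Here is a small C sketch of my own, purely illustrative and not taken from any actual mainframe code, showing why dropping the century saves exactly one byte.

    #include <stdio.h>

    /* Packed decimal, as used on many mainframes: two decimal digits per
     * byte, one in each half (nibble). A two-digit year such as 99 fits
     * in one byte; the full 1999 needs two. Dropping the century
     * therefore saves one byte, not two. (Illustration only.) */
    unsigned char pack_two_digits(int value)   /* value in 0..99 */
    {
        return (unsigned char)(((value / 10) << 4) | (value % 10));
    }

    int main(void)
    {
        unsigned char yy      = pack_two_digits(99);        /* 1 byte:  0x99 */
        unsigned char yyyy[2] = { pack_two_digits(19),      /* 2 bytes: 0x19 */
                                  pack_two_digits(99) };    /*          0x99 */

        printf("two-digit year 99:    1 byte,  0x%02X\n", yy);
        printf("four-digit year 1999: 2 bytes, 0x%02X 0x%02X\n", yyyy[0], yyyy[1]);
        return 0;
    }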

And if writing were not enough, even our oral culture reflects this tradition: we say “sixties” instead of “nineteen-sixties”, and in many languages only two-digit years are used when recollecting events and personal experiences (in some languages the “nineteen…” or other century prefix is important to establish a date context). Two-digit numbers are also used when talking about car models and software products, at least up to the year 1999. Wherever people talk, whether in the streets or on television, this is the language that is most frequently used.

Conclusion

It is my opinion that, in an ideal world in which everybody consistently used four digits to write the year, programmers would never have been a significant exception (apart from a few extreme and now extinct cases, such as those obsolete punched cards mentioned before, which could store only 80 characters each), and we would not now be talking about a “millennium bug”. Even in our imperfect world, good teachers told their students to watch out for this problem several decades ago, and good engineers and programmers planned ahead and avoided it altogether without anybody having to tell them so.

In theory, if we learned from this experience, date-dependent code would from now on have to be written so as to allow for the representation of infinite time. Well-written specifications should treat four digits as the minimum to be used for years, not as a constant. After all, we have already used up 20% of the range of years that can be expressed with four digits. In any case, I am sure that within the next few decades computers will become smart enough to work around these human limitations, and fix our bugs.
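
As an aside of mine, and only as an illustration: even C’s printf already treats field width this way; a format such as “%04d” specifies a minimum of four digits, so a five-digit year is still printed in full rather than truncated.

    #include <stdio.h>

    /* Illustration only: in C's printf, "%04d" specifies a *minimum*
     * field width of four digits (zero-padded), not a fixed size, so a
     * five-digit year is still printed in full rather than truncated. */
    int main(void)
    {
        int years[] = { 999, 1999, 2001, 10000 };
        for (int i = 0; i < 4; i++)
            printf("%04d\n", years[i]);  /* prints 0999, 1999, 2001, 10000 */
        return 0;
    }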

To be honest, there are a few places where I wouldn’t want to be if a problem really emerged at midnight on December 31, 1999: a plane and an intensive care unit come to mind. But hearing of people stockpiling food, gasoline and money makes me wonder more about the insecurities of our Y2K society than about possible software problems. As for the “bug” itself, I remain an optimist, and I am sure that, when confronted with the actual problem, this issue will be solved surprisingly quickly. In January 2000, for all the organizations involved, the “millennium bug” will ultimately become a “fix or die” issue.

As they say, time solves everything…


P.S.: Speaking of calendars, wouldn’t you love a year consisting of 13 perfect 4-week months (13×28=364, plus one extra day each year and the usual leap day every four years)? The idea is not new, but it somehow got lost a few millennia ago.


© 1998, 1999 Mike Battilana