THE MILLENNIUM BUG












The year is 99.
At least, it is if you're a computer...



















In actuality, the year is 1999. That might be confusing to a human, but to a computer, 99 and 1999 are the same number, at least when it comes to the date.

When computers were first invented, storage space was limited and expensive. Every bit of information counted, so corners were cut wherever possible to save space. Since the first two digits of the year were always the same, programmers chose to omit them when storing dates: 1999 became simply 99.

You might notice the issue here. What about years that don't start with 19? What about the year 2000 and beyond? Would computers interpret the new year as 1900 instead? What happens to computer systems that regulate things based on date and time when they're suddenly thrust a century into the past?
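
To make that concrete, here's a minimal sketch in C. It isn't code from any real system (the names "record" and "full_year" are made up for illustration), but it shows the shape of the problem: a field that holds only the last two digits of the year, and a routine that reconstructs the full year by blindly prepending 19.

    #include <stdio.h>

    /* Hypothetical record that saves space by keeping a two-digit year. */
    struct record {
        int yy; /* 99 means 1999... but what does 00 mean? */
    };

    /* Reconstruct the full year the way much old software did:
       by assuming the century is always 19xx. */
    int full_year(struct record r) {
        return 1900 + r.yy;
    }

    int main(void) {
        struct record opened = { 99 }; /* an account opened in 1999 */
        struct record today  = { 0 };  /* "now" is 2000, stored as 00 */

        /* 1900 - 1999 = -99: the account appears to be minus 99 years old. */
        printf("account age: %d years\n",
               full_year(today) - full_year(opened));
        return 0;
    }

Once the "current" date falls in 2000, the subtraction comes out to -99 years. Any system doing that kind of arithmetic, whether to charge interest, schedule maintenance, or expire a license, could suddenly be off by a full century.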

The potential problems caused by this issue became known as the 'millennium bug' or, more commonly, the 'y2k' bug.

'y2k' doesn't just refer to the bug itself, though. It can also refer to the fear and worry surrounding that time. There was a lot of sensationalism, which is understandable, considering it was a worldwide issue in a relatively new field that the general public knew very, very little about.

Either way, the y2k phenomenon is a very interesting little blip in both world history and computer science history.



