When was bug last programmed in?


I gather that the Y2K bug was first introduced into software in the early 60s -- correct me if I'm wrong -- but my question is, when did the two-digit year field stop being the industry standard? In other words, when did people wake up to the problem and stop introducing it into their programming? If anyone has a definitive answer to the question, I'd like to know about it, with, perhaps, some sourcing. Thanks!

-- Erik (floete@aol.com), October 21, 1998

Answers

As far as I know there is still no industry standard.

It's quite possible to continue using 2-digit dates, if your data does not have anything approaching a century-wide span of possible values. This is called windowing. For example, with a window of +/- 50 years relative to the current date, 98 means 1998 and 01 means 2001 today. No problem until 2049, say, when any 98s still left switch to meaning 2098. Your design must make sure that any such then-historical date data is eradicated by then.
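
As a rough illustration only (a Python sketch; the function name and the exact +/- 50-year pivot are mine, not any particular system's), a sliding window like that might look like this:

    # Sliding +/- 50-year window: expand a 2-digit year relative to "today".
    from datetime import date

    def expand_two_digit_year(yy, today=None):
        """Map a 2-digit year into the century window centred on the current year."""
        today = today or date.today()
        candidate = (today.year - today.year % 100) + yy   # same century as today
        if candidate < today.year - 50:
            candidate += 100        # too far in the past: push forward a century
        elif candidate > today.year + 50:
            candidate -= 100        # too far in the future: pull back a century
        return candidate

    # In 1998: 98 -> 1998, 01 -> 2001.  Run the same code in 2049 and 98 -> 2098.
    print(expand_two_digit_year(98, date(1998, 10, 21)))   # 1998
    print(expand_two_digit_year(1, date(1998, 10, 21)))    # 2001
    print(expand_two_digit_year(98, date(2049, 1, 1)))     # 2098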

This may not be a problem for (say) retailers: orders are rarely placed more than a year ahead, and historical data is not needed more than seven years back, whether for legal reasons or for business planning. The Y2K bug in this context is a programmer getting the windowing decision wrong, and the error won't show until the clock wraps or the first dates in 2000 get entered.

That aside, when is 03071999? In the USA it's probably March 7th. Here in the UK, it may well be 3rd July.
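
To make the ambiguity concrete (a Python sketch, purely illustrative), the same eight digits parse to two different dates depending on whose convention you assume:

    # One string, two readings: US month-first versus UK day-first.
    from datetime import datetime

    raw = "03071999"
    us = datetime.strptime(raw, "%m%d%Y").date()   # 1999-03-07, i.e. March 7th
    uk = datetime.strptime(raw, "%d%m%Y").date()   # 1999-07-03, i.e. 3rd July
    print(us, uk)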

And then there's the "Unix Millennium"; UNIX and UNIX-derived codes often store dates as a count of seconds since 1970. Around 2038, this 32-bit count wraps around ... but by then I think procrastination will be out of fashion!
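
For the curious, here is a quick way (a Python sketch, assuming the usual signed 32-bit count) to see exactly where that counter runs out:

    # A signed 32-bit seconds-since-1970 counter tops out at 2**31 - 1 seconds.
    from datetime import datetime, timedelta, timezone

    epoch = datetime(1970, 1, 1, tzinfo=timezone.utc)
    print(epoch + timedelta(seconds=2**31 - 1))   # 2038-01-19 03:14:07+00:00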

-- Nigel Arnot (nra@maxwell.ph.kcl.ac.uk), October 21, 1998.


Nigel is right. It is still legitimate to use 2 digit years if you window the ranges. There is no standard. At my company, all new applications store dates as 4 digit years (or a logical equivalent), but may use 2 digits on reports and user interfaces.

At a previous company, I coded 2 digit years as late as 96. The whole application had 2 digit years, so what did it matter to add 1% more? Also, remember that it was never a matter of "waking up" - the problem started as a conscious decision by early developers to save disk space; later it was just laziness, but everyone has always known it was a problem. I asked my employer at that time (an HMO in Texas) what he was going to do about Y2K. He asked what I thought it would take. I told him that if I started that instant, I might be able to fix the system by 2000. I was 1 of 2 programmers. I heard he has 8 now.

At another company, I coded 2 digit years as late as 94 - again, half the app depended on it. I could have expanded the dates on my half and prepared, but I didn't.

Some Microsoft products STILL have "minor issues" - so they have been introducing it (or at least not finding it) as late as this year, 1998 (just check their website).

There are no standards in this industry like in medicine or engineering - there are probably people that are introducing Y2K problems right now.

I distinctly remember a conversation with my sister in law about Y2K in 1995. She said "I'm sure they've thought of that!" I said, "I'm a programmer, and I'm telling you that they certainly haven't." At least not enough to correct it.

-- Ray Givler (Dont@mail.com), October 21, 1998.


I left my last permanent job in 1997; I remember the latest Y2K manager constantly complaining that other teams were still introducing non-compliant code at that time. All new code should have been compliant from 1994 or so.

-- Richard Dale (rdale@figroup.co.uk), October 21, 1998.

While the original goal of saving memory was a worthy one given the infancy and cost of the technology, and while some 'laziness' and 'lack of foresight' have no doubt played a part since that time, these were not the only contributing factors.

One very significant reason why two-digit dates didn't change is that large, complex software systems are frequently 'evolved', with new 'additions' being 'tacked on' to existing functionality.

For most programmers hired to 'build a new addition', significantly modifying the existing 'structure' was simply not an option. The costs of doing so, as we are plainly seeing now, would have been prohibitive. In large projects, programmers (or groups of programmers) are (or should be) given very explicit written directions (a requirements specification) that state precisely how a given 'chunk of code' (module) that they are tasked with writing must 'speak to' (interface with) the rest of the existing system. Such specifications frequently contain two-digit dates.
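
As an illustration only (the record layout and field positions below are hypothetical, not taken from any real specification), such a spec might pin down a fixed-width record like this, leaving the new module no room to widen the year:

    # Hypothetical fixed-width interface record: the spec fixes the columns,
    # so the year is two digits whether the new module likes it or not.
    RECORD = "1043 0345 981021"      # account, amount, date as YYMMDD

    account = RECORD[0:4]
    amount  = RECORD[5:9]
    yy = int(RECORD[10:12])          # 98 -- the century simply isn't carried
    mm = int(RECORD[12:14])
    dd = int(RECORD[14:16])
    print(account, amount, yy, mm, dd)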

A good analogy here would be if you worked for a railroad construction contractor and had been hired to add a small 10-mile section of track to an already existing 2,000-mile network of tracks. Your section of track must necessarily adapt to the existing network. The scope of the project and the money budgeted to it would not have allowed you to incorporate any features into your 10-mile section which required massive work to be done on the existing 2,000 miles of track. Any engineer (or subcontracting firm) who insisted such work was necessary would have quickly been replaced by an engineer (or subcontracting firm) who was willing to build to the existing specifications.

Another related factor in this situation was the motivation to maintain something called 'backward compatibility' -- that is, the ability of the 'new system' to continue working with all the 'old systems'.

Your PC is a good example of this. What if I told you I had written a brand new operating system called "Doors 98" and that it was not only totally Y2K compliant but had been independently verified to be nearly 100 times faster than Windows 98? In addition, I'm selling it for $15 a copy. Sound pretty good? What if I then told you that while all this was true, you would need to replace ALL your existing software (word processor, spreadsheet, browser, database software, etc, etc.)?

Backward compatibility is a major force that drives the markets, and any company which does not understand that won't last until the end of the week. Thus, even if your totally new 'system' uses four-digit dates internally, it had better be able to work in an environment of older systems that communicate with each other using two-digit dates.
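
To make that concrete (a hypothetical Python sketch, not anyone's actual interface): even a system that stores full four-digit years internally may still have to squeeze them back down to two digits on the wire to keep feeding its older neighbours.

    # Internally the new system keeps a full date; the legacy interface it
    # must keep talking to only accepts YYMMDD.
    from datetime import date

    def to_legacy_yymmdd(d):
        return d.strftime("%y%m%d")

    print(to_legacy_yymmdd(date(1998, 10, 21)))   # "981021"
    print(to_legacy_yymmdd(date(2001, 3, 7)))     # "010307" -- the old system must window this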

The driving factors for the things I've mentioned here were largely economic in nature rather than technical. These things, and more, contributed to the situation we now face.

Arnie

-- Arnie Rimmer (arnie_rimmer@usa.net), October 21, 1998.


The company I mentioned was doing mostly Y2K windowing, not 4 digit years; it would have been easy to do this from the start. Where long-term dates (e.g. 30-year life policies) were required, 4 digit years were designed in. I've heard that some other old insurance systems used a format like CYYDDD or "days since 1900" to hold the dates on file - 6 digits.
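
For what it's worth, here is my reading of those two 6-digit formats as a Python sketch (real systems may differ in details such as the century flag values or the base date):

    # CYYDDD: century flag (0 = 19xx, 1 = 20xx), 2-digit year, day of year.
    from datetime import date

    def to_cyyddd(d):
        c = (d.year - 1900) // 100
        return "%d%02d%03d" % (c, d.year % 100, d.timetuple().tm_yday)

    def days_since_1900(d):
        return (d - date(1900, 1, 1)).days

    print(to_cyyddd(date(1998, 10, 21)))        # "098294"
    print(to_cyyddd(date(2028, 3, 1)))          # "128061"
    print(days_since_1900(date(1998, 10, 21)))  # 36087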

-- Richard Dale (rdale@figroup.co.uk), October 21, 1998.


Think about your question - but to quibble slightly,

Nobody ever programs "in" a bug; the d**m things happen by oversight, improper or incomplete testing, or by simple goofs.

But also, it is a bug only if it causes the original program (or, if exchanging data, the exported data) to fail or give unexpected/unacceptable results.

There is no definitive answer, and it is absolutely incorrect to say "it's fine if this was written after Jan 1994, or May 1996," or any other time. Every program is unique, written by unique individuals. To come close to assuring performance, you can only test it. Then the only thing you know is that the program works under the test conditions.

For anything past that, we (as users) must rely on your judgement and expertise as a programmer. Nobody else can find it, fix it, and "re-fix" it. Keep in mind, "new" Y2K bugs are getting introduced now, due to inadvertent errors and unknown interfaces in the updated programs.

So I would say the last Y2K bug hasn't been written yet, and I would treat skeptically and test any program written after about June 1862.

-- Robert A. Cook, P.E. (cook.r@csaatl.com), October 21, 1998.


Two digits IS STILL THE STANDARD. I'm working on a net, programming in MicroSucks Access (97). The default for short and medium dates is 2 digits. Only if you go to long dates "Thursday, October 22, 1998" do you get 4 digits. So, I set the Windows and Access defaults so I would get 4 digits in the short and medium dates. Worked great. But the next day I booted up, and the network (revenge of the mainframers) pukes had wiped out my settings, reverting to 2 digits. (And then I had to go back and modify what I did the previous day to work with 2 digits.)

By the way, when are people -- worldwide -- going to start using the ANSI/MIL std date form YYYY.MM.DD or YYYY-MM-DD instead of the various formats such as m/d/yy d/m/yy etc.?

-- pc (pcguru@wearedoomed.com), October 21, 1998.


Heck, the PC that you might buy today would probably have a non-Y2K compliant real-time clock (RTC)! This Y2K "flaw" is astoundingly pervasive, and is what makes it so difficult to grasp. (Generally, with PCs, as long as the BIOS is Y2K compliant, the RTC is a non-issue. But, there are certainly exceptions.)

-- Jack (jsprat@eld.net), October 22, 1998.

<< Two digits IS STILL THE STANDARD. I'm working on a net, programming in MicroSucks Access (97). The default for short and medium dates is 2 digits. Only if you go to long dates "Thursday, October 22, 1998" do you get 4 digits. >>

Don't confuse the visual representation of data with its internal storage. It is only the internal storage version that counts. "10/22/98" and "October 22, 1998" are different visual representations of a single data item stored in the database. I don't know the exact way that Access stores dates, but it probably matches neither of these two visual representations. If Access stored the date as "19981022," it could still display the date in either of the two formats described above.
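
A small illustration of that point (Python rather than Access, and no claim about how Access actually stores dates internally): one stored value, several ways to display it.

    # One internal value, many visual representations -- only the storage counts.
    from datetime import date

    stored = date(1998, 10, 22)               # the single internal value

    print(stored.strftime("%m/%d/%y"))        # 10/22/98          (short display)
    print(stored.strftime("%B %d, %Y"))       # October 22, 1998  (long display)
    print(stored.isoformat())                 # 1998-10-22        (ISO 8601 style)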

-- Paul Neuhardt (neuhardt@ultranet.com), October 22, 1998.


To PC Guru -

You asked "When will the world go to the ANSI/MIL-STD standard of ...?"

It'd be nice if "they" would, but why (other than convenience) should they? Those two standards have "US" written all through them, even in their abbreviations (American National Standards ...) and their national sponsors/maintenance groups. They aren't maintained or even recognized/mandated through ISO, or there would be some incentive (like ISO 9000) to get a semblance of order out there.

So there is no technical reason (or legal reason) for the world to use that convention. Unfortunately.

Each company probably will just keep on "making do", individually converting (rewriting) each database as it comes in to the "local" format.

This whole Y2K trouble will probably lead to some "painful" software reorganization and regulation - probably on the order of the boiler explosions/insurance/civil engineering fiascos of the 1800s that led to the requirement for buildings to be inspected and the drawings certified before construction.

-- Robert A. Cook, P.E. (cook.r@csaatl.com), October 22, 1998.



For PC and Bob Cook: I understand that the yyyy-mm-dd format is recommended in ISO 8601. There's an interesting article about this in QST magazine, in their digital section, several months ago.

-- Jocelyne Slough (jonslough@tln.net), October 26, 1998.

Thanks, I'll go through some back issues to find it. Surprising, isn't it, that the ISO/QSO and the "national" quality consultants and gurus aren't talking about the subject very much.

-- Robert A. Cook, P.E. (cook.r@csaatl.com), October 26, 1998.
