Y2k - The fault of designers?
greenspun.com : LUSENET : TimeBomb 2000 (Y2000) : One Thread
Surely it would have been possible for programmer/designers to have used a compressed date format for storing dates which would have satisfied both the demand for limiting memory usage and the need to store date values for indefinite use. Some systems did use such a format, especially where it was known from the outset that dates would project well into the future, for instance long-term life insurance policies. An example of this is storing the date as "the number of days since 1900". Common routines could be used to process and translate the dates to readable format for display/printing. A date span of over 27,000 years could have been stored in 4 chars of packed decimal, instead of a 6-digit MMDDYY. It was the fault of bad design; it just required a slightly more intellectual approach. I have seen many date routines, and it was something programmers never really got to grips with. Best Regards Richard
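As a sketch of the scheme Richard describes - dates stored as a day count from 1900, with common routines to translate to and from readable form. The function names and pivot date here are illustrative, not taken from any system in the thread:

```python
from datetime import date, timedelta

EPOCH = date(1900, 1, 1)  # day 0 in this scheme

def to_day_number(d: date) -> int:
    """Store a date as the number of days since 1900-01-01."""
    return (d - EPOCH).days

def from_day_number(n: int) -> date:
    """Translate a stored day number back to a readable date."""
    return EPOCH + timedelta(days=n)

# 4 chars of packed decimal hold 7 digits, i.e. day numbers up to
# 9,999,999 - a span of well over 27,000 years, as claimed above.
print(9_999_999 // 365)                   # rough span in years
print(to_day_number(date(2000, 1, 1)))    # 36524: no ambiguity at the rollover
```

Because the stored value is a plain day count, the century rollover is just another increment; the ambiguity only arises when two digits of the year are thrown away at storage time.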
-- Richard Dale (email@example.com), July 07, 1998
Here we go again. The answer is of course yes: 6 bytes for a date is 48 bits, and 2**48 seconds is a time interval of geological proportions, let alone 2**48 days.
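A quick sanity check on that arithmetic (purely illustrative):

```python
SECONDS_PER_YEAR = 365 * 24 * 60 * 60

# 6 bytes = 48 bits; even counting individual seconds rather than days,
# the representable range is on the order of millions of years.
years = 2**48 // SECONDS_PER_YEAR
print(years)   # roughly 8.9 million years of seconds in 48 bits
```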
However, if you go back to the original computers, you'll see that computational speed was very much an issue. Any extra instructions required every time a date was processed or (especially) output would have been a very significant cost. The fact that YYMMDD (or permutations thereof) could simply be copied to the printer with no prior manipulation was a significant advantage. This is assuming that it was discussed at all: quite probably experience of written dates made the choice completely "obvious".
As has been much noted, in those days no-one believed that the computers or the programs would be in use for more than a decade, so the year 2000 did not feature in many considerations. If it did, it was perfectly reasonable to assume that it would be fixed in good time as part of normal software maintenance, probably in the 1980s, if the program was still around.
That last bit was the big failure, and it was a management failure not a programmer one. Instead of doing the maintenance, the managers in the 1980s were busy "downsizing", stripping their companies of the accumulated knowledge of the folks on whose work the business depended. The mainframes were left to run on auto-pilot. The programmers (pilots) were jettisoned as dead weight. The managers assumed if they ever needed pilots again they'd hire them...
Which is where we are now.
However, don't be TOO hard on the managers. Maybe they deserve to lose their massive salaries and their executive perks, but as I've posted in another thread here, they had other sensible concerns. Had they budgeted for Y2K remediation in 1980 or 1990, the costs would quite probably have guaranteed that their organisation went out of business well before 2000, with the more "efficient" company that did not remediate out-competing them.
It's a *systematic* problem and nobody (or everybody) is to blame. Our society is intrinsically incapable of taking a long-term view of things, and now we face the reckoning for that fact.
And please: next time you see someone whipping up hatred, remember your history. You are looking at a wannabe-tyrant. If you feel an overwhelming desire to shoot someone, make it him, not his scapegoats.
-- Nigel Arnot (firstname.lastname@example.org), July 07, 1998.
This is in reply to Nigel Arnot. "Our society is intrinsically incapable of taking a long-term view" - society consists of people, and the people in this case were the designers. I/O overhead considerations have not been a problem since the mid-seventies; the designers were incapable of taking a long-term view. Systems written in the 1990s are non-compliant. "Mainframes left on auto-pilot" - the belief seems to be that systems are created to perpetuate the employment of their creators. So many systems were so bad that they required years of maintenance to put them right; the only people who benefited were the mediocre progs who created them. "Budgeting for Y2k in 1980/1990" - no-one was aware of the problem; you would have been laughed out of court, as I was even in 1992/3 when I did the initial y2k study for an insurance co. I was laughed at by both managers and progs. I am not sympathetic towards IT management, the main problem being the complete lack of qualifications/professionalism in the IT business; this I feel is still true of most IT staff of all types. Y2k is a typical example of a major IT cock-up, as is the failure of many major development projects in the public and private sectors. The IT industry has cost society dearly.
-- Richard Dale (email@example.com), July 07, 1998.
I think one of the problems in the software industry is that there is no watchdog. For example, a drug can't be prescribed in the US until the FDA has approved it. OSHA has to determine that factories are safe to work in. Quality inspections are done on every car that leaves the factory. Who's inspecting the software? Most of the time, the same people who wrote it. And did you ever look at the warranty you get with software? "This software is not licensed for fitness or merchantability for any particular purpose. The manufacturer will not be held responsible for any damages caused by the use of this software." Who else can get away with a warranty like that? That's a far cry from "warranted for five years".
-- Amy Leone (firstname.lastname@example.org), July 07, 1998.
I worked at IBM on programming the first popular (selling in the thousands) mainframe computer, the IBM1401 (announced in 1959, delivered up through 1965 or so), which was a transistorized system with "core" memory - that meant that every bit of internal memory was created by a tiny magnetic core ring with three wires through it. Typically these computers had a total of 4K of internal memory - and that was 4K of 6 bit characters. Packed decimal (requiring 8-bit bytes) did not appear until the IBM 360 in 1965. As I recall, to add another 4K to the 1401 cost around $100,000, and was like adding another several cubic feet to hold the core memory and associated wiring. The average instruction length on the 1401 was about 6 characters long (ranged from 1 to 8 characters). So there was room for maybe only about 600 instructions, unless program overlays were used, which were very difficult to implement since the early 1401s had no disk memory.
Much of the programming of 1401s was to replace older punch-card accounting machines, and was very oriented to the 80-column format of punch cards. In fact, it had 80 specific memory positions in it for reading in 80 column punch cards and another 80 positions for punching out cards. The machine did not have the capabilities available to the programmer for doing binary arithmetic - it was inherently a decimal arithmetic machine. So any kind of date arithmetic of the type proposed would have been either impossible or extremely complex and impossibly slow. And the costs of doing so would probably have been on the order of the costs now being spent to rectify the problem - sufficiently high to have made computing uneconomic.
Admittedly, at some point, programmers should have changed their ways - but virtually every new system tends to use data from an old system, or have some mechanism to allow translation or emulation of older programs. So programmers tended to continue doing it in the old way so all the pieces fit together. And often the programmer had little choice. I programmed in at least one language in which the date format was fixed by the language as xx/xx/xx style.
In any case, hindsight is always 20/20. By the way, Richard, in the programs you are now writing, are you allowing 4 positions for the telephone area code, or 10 positions for Social Security numbers, both of which will probably be needed in a shorter time frame than the span we faced in 1961 between then and the millennium? Are you insisting that your company buy only PBX switches that will work with 4-digit telephone area codes?
-- Dan Hunt (email@example.com), July 07, 1998.
Sorry about the references to 4-digit area codes and 10-digit Social Security numbers. These will be needed in the U.S. I failed to notice that you are apparently posting from the UK, where such future problems may not be arising in the next 25-50 years.
-- Dan Hunt (firstname.lastname@example.org), July 07, 1998.
In response to Dan Hunt. Dan, you make some very good points; back in the days of the 1401 I would agree every opportunity to save storage/CPU time had to be taken. However, I first came on the scene with the 370 series, by which time there was no real excuse not to use perpetual date routines - as you say, 370s have been around for 33 years. Many systems have been written and rewritten since then; the fact that old 6-char dates were perpetuated in the rewrites or new systems is really unjustifiable. It is easy to say that hindsight is a marvellous thing, but I remember trying to convince my colleagues to use century in the dates; they only did so if it was known at the time that future dates in the 21C would be generated within a short period. I am now assessing some recent client/server so-called state-of-the-art packages; two of them are very non-compliant indeed, and they could only have been written in the last 5 years or so. In fact 25% of ALL of the PC/client-server packages/applications we use are non-compliant (due to the use of 6-digit dates). I agree there is always a problem in allowing space in fields for future expansion - how many digits do you allow for financial values (for future inflation or increase)? I have come across, as I expect we all have, examples of financial values being truncated, e.g. nearly losing £1m in the process. However, nowadays, in view of the amount of extraneous information designed into computer files, it's best to err on the side of caution when assessing field sizes. Most DBMS also now have null suppression, i.e. unused chars are not stored (e.g. ADABAS). If your computer language only allows xx/xx/xx then you are limited to windowing (where possible). Thank you for your well-considered response. Regards Richard
-- Richard Dale (email@example.com), July 08, 1998.
I think it was reasonable to use 6-digit dates when the systems were originally designed, and even to perpetuate them through the 80s. Most companies had grand plans for replacing their old systems during the 80s, and didn't really understand just how hard that was going to be. I suspect in the late 80s a lot of IT managers were busy defending themselves for having blown huge amounts of money on development projects that either ended up on the scrap heap or were years behind schedule. And much to everyone's surprise, the old systems were still there - with a huge backlog of maintenance that was on hold because everyone was waiting for the new system.
The people I would hold responsible are the managers who didn't get moving on this in 1990-1992, when it would still have been possible for most organizations to get it done fairly painlessly. Why didn't they? I think mostly because nobody thought they would be around when it happened. I remember saying 10 years ago that in 1999 there would be a massive game of musical chairs, with everyone in the industry changing jobs to avoid responsibility. I still expect this to happen.
-- Deborah Barr (firstname.lastname@example.org), July 08, 1998.
I've said it elsewhere, and I'll say it again. As a programmer in the 60s and 70s I worked for MIS, which was accountable to the Comptroller at Canadian Pacific -- the head bean counter. Do you think HE had time for frills like extra memory to carry the century in an expensive computer? Would he have listened to ME if I'd suggested it?
And now, 30 years later, what happens? In a report in the business section of the Globe and Mail (Toronto) 2 weeks ago, the writer refers to how IT managers have just about thrown up their hands trying to convince corporate management that yes, right now, in the middle of 1998, there is a REAL problem with y2K.
Don't be so hard on IT. There is a delightful human tendency to procrastinate, and we ourselves have all brought this problem on. All of us.
-- Steve Francis (email@example.com), July 08, 1998.
Just to expand on a point: I'm now looking at systems developed in the last 10 years for a client/server environment, and many of these are completely non-compliant. Two large packages, for example - Talisman (a system for paying contractors and invoicing clients) and ResSearch (a semi-intelligent! system for matching recruitment requirements vs resumes) - simply will not work in 2000 or before. These are not old mainframe systems designed with CPU cycle, main memory or disk space constraints in mind. They require massive changes to make them compliant. 25%+ (about 60) of the PC or client/server applications/packages FI Group use are non-compliant. Whose fault is that!? Best Regards Richard
-- Richard Dale (firstname.lastname@example.org), July 09, 1998.
There were many factors leading to the Y2K suite of problems. Habits, the inherent differences between human and machine "thinking", conscious efficiency goals, difficulties in changing formats that had been widely implemented, management inertia, shortsightedness ...
But consider _this_ one: It's been over 98 years since the previous century rollover. How many people who were adults conscious of, and responsible for, adapting to the change in century number in 1899-1900 society had any significant impact on the design or development of electronic computers?
None, or very close to none.
The people who designed and developed electronic computers _had never lived through a century change_ !!
The majority of people who lived through the 1899-1900 century change at an adult age were dead by the time electronic computer development got going in the 1950's. Of those still alive, how many can one reasonably expect to have given informed counsel to computer designers on the century change?!? How many designers of electronic computers receiving such counsel, if any, could reasonably have been expected to extrapolate the impact of the 1899-1900 change to a computerized society 40 years in the future?!?
-- Richard B. Woods (email@example.com), July 10, 1998.
Richard, really it is just a simple date routine, not a philosophical matter. It's just as easy to code up a compliant routine as a non-compliant one, even using windowing and 6-digit dates. Still, you make some interesting points. Best Regards Richard
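For what it's worth, the century-windowing technique mentioned here and earlier in the thread can be sketched like this. The pivot year of 50 is an illustrative choice, not taken from any system discussed:

```python
PIVOT = 50  # two-digit years below this are taken as 20xx, others as 19xx

def expand_year(yy: int, pivot: int = PIVOT) -> int:
    """Map the two-digit year of a 6-digit date to a full four-digit year."""
    return 2000 + yy if yy < pivot else 1900 + yy

print(expand_year(99))  # 1999
print(expand_year(3))   # 2003
```

The catch, of course, is that windowing only works while all the dates in the system fall inside one 100-year window around the pivot - which is why it suits transaction dates better than, say, birth dates or long-term policy maturities.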
-- Richard Dale (firstname.lastname@example.org), July 13, 1998.