Evidence of potential damage

greenspun.com : LUSENET : TimeBomb 2000 (Y2000) : One Thread

I found out about the Y2K threat last week and have been obsessing about it. I'm not a techie or anything (actually I'm a philosophy grad student; of course there will be a great demand for philosophers after TEOTWAWKI). In part of my efforts to understand the likelihood of various scenarios (from inconvenience to absolute disaster) I have been putting together a brief of sorts, collecting different pointed bits of evidence and putting them under the relevant headings.

One heading that is virtually devoid of any evidence is: "Evidence regarding the likelihood of different kinds of damage." In other words: I am looking for *hard evidence* that systems with unremediated Y2K bugs, or with other sorts of problems associated with Y2K resolution efforts, will have various kinds of (bad, good, otherwise) consequences.

It appears that the alarmists think (without evidence? I can't tell) that chances are *excellent* that unremediated/poorly remediated/untested/too-quickly-installed systems will simply crash or be utterly useless. This is an easy way to think about the problem but it is far too simplistic. As the naysayers often point out, the whole thing may turn out to be a dud, because poor Y2K remediation efforts *might not* cause most systems to crash or to be useless. And so of course an absolutely *crucial* question to ask, if you really do want to get a well-informed view on the whole problem, is: What are the immediate effects of the bugs likely to be?

It won't do simply to say, "They could be disastrous, so prepare for the worst." I'm sure that's not bad advice. But look, I want to be as well-informed about all this as possible!

It also won't do to say, "That's not what's important: it's the interconnected consequences of even relatively minor bugs that is the problem." That might be the case. But it is a separate consideration and I want evidence on *this* point.

It also won't do to say, "We just won't know until the day arrives." Again, I'm sure that that's true for some systems, and that there are going to be ugly ramifications that virtually no one can predict. But on the other hand, if we can find out enough about the failure rates of systems in varying stages of remediation and testing, then we *can* guess at a *minimum* failure rate. We can at least say: "It's probably going to be worse than *this*" (given that X% of systems are unremediated, X% are untested, etc., etc., you supply the relevant categories in different industries).
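The minimum-failure estimate sketched above amounts to a weighted sum over remediation categories. As a purely illustrative sketch (every category name, share, and rate below is a made-up placeholder, not data from any study):

```python
# Hypothetical back-of-envelope: a lower bound on the overall failure
# rate, given the share of systems in each remediation category and a
# guessed per-category failure rate. All numbers are placeholders.

categories = {
    # name: (share of systems, assumed failure rate)
    "unremediated":          (0.30, 0.50),
    "remediated, untested":  (0.40, 0.10),
    "remediated and tested": (0.30, 0.02),
}

minimum_failure_rate = sum(share * rate for share, rate in categories.values())
print(f"{minimum_failure_rate:.0%}")  # 20% under these placeholder numbers
```

The point is only that once real per-category figures exist, the "it's probably going to be worse than *this*" bound falls out of simple arithmetic.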

In other words (forgive and correct me if this assumes anything wrong), I figure that perhaps some agency might have done some studies about the failure rates (for *various kinds* of failure) of an average-sized mainframe or program, at varying levels of remediation and testing. E.g., perhaps there is some way that we can discover (or this has already been thoroughly studied) that the average aging mainframe of type X, which has been remediated but not tested, simply fails Y% of the time. Or *some* such general claim. That's the sort of thing I'm looking for -- together with detailed lists of the sorts of failures.

(I am asking this for some programmer friends of mine.)

So here's what I'm looking for:

* Lists of actual Y2K-related failures (hopefully with explanations for the failures); anecdotal evidence is only so useful, but it can be helpful.
* Any sort of statistical evidence regarding Y2K rates of failure *of varying kinds* (see above), in individual industries or among businesses as a whole.
* Impressions and blind guesses about this issue from *real* experts.

Any help, *especially* any *links* to hard evidence, greatly appreciated. Mere speculation from nonexperts will not be helpful (however interesting). Thank you very much --

Larry Sanger Sanger.3@osu.edu

-- Larry Sanger (Sanger.3@osu.edu), May 12, 1998

Answers

FEDERAL COMPUTER WEEK

Year 2000 problems sink Coast Guard systems

BY BOB BREWIN (antenna@fcw.com)

NORFOLK, Va. -- The U.S. Coast Guard already has experienced computer systems failures as a result of the Year 2000 bug and in at least one instance has not yet fixed the problem.

Commander James Decker, strategic information technology planner at Coast Guard headquarters in Washington, D.C., said, "The Coast Guard has already experienced a number of failures" in its computer systems due to Year 2000 problems.

Decker, speaking at the Navy Connecting Technology Spring '98 conference here, said systems affected by millennium bug problems include the service's pay system, the Marine Safety Inspection system and systems at the Coast Guard's internal institute, which offers correspondence courses to enlisted men.

Decker said the systems failed because they manipulate data that contain dates occurring after Dec. 31, 1999. For example, the pay system handles allotments for mortgages that run 20 or 30 years. "That [problem] has not been fixed yet," Decker said.

The Marine Safety Inspection system, which contains the databases of inspections the service conducts on cruise ships, failed because it also manipulates data that contain next-century dates.
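The failure mode Decker describes comes down to two-digit-year arithmetic overflowing the century. A minimal sketch, assuming the legacy two-digit-year convention (the function name and figures below are illustrative, not the Coast Guard's actual code):

```python
# A system that stores years as two digits miscomputes any term that
# crosses 2000: the century silently truncates.

def payoff_year_two_digit(start_yy: int, term_years: int) -> int:
    """Two-digit-year arithmetic, as in many legacy pay systems."""
    return (start_yy + term_years) % 100  # truncates the century

start = 98   # 1998
term = 30    # a 30-year mortgage allotment
payoff = payoff_year_two_digit(start, term)
print(payoff)          # 28 -- read back as 1928, not 2028
print(payoff < start)  # True: the payoff date sorts *before* the start
```

Any comparison or sort on such a field then treats the allotment as long expired.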

-- Nabi Davidson (nabi7@yahoo.com), May 13, 1998.


see the latest from the Gartner Group: http://www.house.gov/ways_means/oversite/testmony/5-7-98/5-7bace.htm

-- David Binder (dbinder@sympatico.ca), May 13, 1998.

You might go to www.dejanews.com and do a power search for "power grid." You'll get a lot of chaff, but there is some grain in there too, and it may lead you to see why, as of 19980513, I'm so pessimistic about the availability of electricity after 19991231.

-- George Valentine (GeorgeValentine@usa.net), May 13, 1998.

Welcome to the Y2K ranks. It will be very interesting to have you share your research with us, as I've wondered about those same things. Still, don't wait until ALL the "facts" are in to plan for your well-being. We have been working on this for almost a year and find that even with the early start, we won't be able to do everything we'd like to prepare.

-- Cindy (cindy@bigfoot.com), May 13, 1998.

Larry,

Check out the article at the address below:

http://www.techweb.com/wire/story/y2k/TWB19980416S0005

-- Nabi Davidson (nabi7@yahoo.com), May 13, 1998.



Thanks for all your replies so far! Please keep them coming, especially any links you have to original studies (like that Cap Gemini study -- well, I'm going to go look for it right now myself) and surveys, etc. The "harder" the data the better of course.

-- Larry Sanger (Sanger.3@osu.edu), May 13, 1998.

http://cgi.pathfinder.com/netly/afternoon/0%2c1012%2c1990%2c00.html

Check this out. Not one Y2K utilities test has been successful. Bye Bye power. This is really the pits! :(

-- Eddie (eddiegungus@mailexcite.com), May 13, 1998.


Here is a listing of actual Y2K problems from the Cassandra Project. It hasn't been updated in over a month, but it's still interesting reading.

http://millennia-bcs.com/examples.htm

-- Nabi Davidson (nabi7@yahoo.com), May 13, 1998.


Nabi's link to the list of Cassandra Project examples is good. At the bottom of that page is a link to a page of examples of utility Y2K problems at Rick Cowles' site: http://www.euy2k.com/reallife.htm Both lists are very interesting, and I'm wondering if anyone knows of any other pages or articles like them. Also, even better, some studies and statistics (as opposed to just lists of individual incidents) enumerating types of problems encountered/encounterable. (There is one such list at www.mitre.org, I think.)

-- Larry Sanger (sanger.3@osu.edu), May 14, 1998.

http://www.garynorth.com/y2k/detail_.cfm/1163

-- Chuck Boyce (sg95m476@post.drexel.edu), May 14, 1998.


Here is another page of actual Y2K incidents:

http://www.granite.ab.ca/year2000/incidents.htm

-- Larry Sanger (Sanger.3@osu.edu), May 20, 1998.


See the Year 2000 Journal, Jan/Feb 1998 issue. It outlines a test at a coal-fired plant's "hot spares". Back issues can be found online at http://www.y2kjournal.com/ -- see "The Year 2000 Embedded Systems Threat..." by Roleigh Martin.

-- john hebert (jhebert@co.waukesha.wi.us), May 29, 1998.


Larry, all I can say is that, in my own experience working for a large UK insurance company (I hear "yawn, not again!" from everyone), if they had not done Y2K remediation they would simply have gone out of business. There were too many problems to fix on failure. If companies have old systems and do nothing, the likelihood is they will go out of business. A whole host of programs would have crashed when first run after 01/01/2000, simply because the data they were looking for had expired on 31/12/1999. That one problem alone would have taken weeks to fix. Many hundreds of programs from many systems accessed data from this single database by using from/to dates. If computer systems fail for any length of time, it becomes more and more difficult to recover and provide business continuity. Of course it's catch-22: you don't know what the problems are in detail until you search and find them, and by that time you're doing remediation anyway.

-- Richard Dale (rdale@figroup.co.uk), October 05, 1998.
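The from/to expiry problem Richard describes is easy to reproduce in miniature. A minimal sketch, assuming dates stored as two-digit-year YYMMDD integers (a common legacy convention; the names and values are illustrative, not his firm's actual code):

```python
# A record's validity window checked with two-digit-year YYMMDD ints.
# Any "to" date in 2000 encodes as a tiny number, so the record looks
# expired long before its real end date.

def record_active(from_yymmdd: int, to_yymmdd: int, today_yymmdd: int) -> bool:
    """Naive range check on YYMMDD-encoded dates."""
    return from_yymmdd <= today_yymmdd <= to_yymmdd

# A policy valid 1 Jan 1998 through 1 Jan 2000, checked on 15 Dec 1999.
# The "to" date 00-01-01 encodes as the integer 101:
print(record_active(980101, 101, 991215))     # False -- looks expired
print(record_active(980101, 991231, 991215))  # True  -- pre-2000 works fine
```

Multiply that by hundreds of programs reading the same database and the "too many problems to fix on failure" point is clear.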
