Concerns with probability, uncertainty and stakes : LUSENET : TimeBomb 2000 (Y2000) : One Thread

I became aware of y2k about 18 months ago and have been following it fairly closely since that time. This forum has been an especially useful resource, both in the content of its posters and in the links that I find here. It is like having an army of scouts reporting back on what they find. I have just read two threads that have moved me to respond. The first was by Luann in response to 'you know who' and the second by Andy Ray setting forth his logic of proving a hypothesis wrong. It seems to me that A. Ray has misunderstood the nature of the issue and that Luann has acted with good judgement and wisdom.

A. Ray first. A number of writers have tried to establish that there is a significant non-zero probability that problems will occur on or about 01/01/00. They have set forth their premises and made their reasoning clear. They have made specific predictions about what they think will happen. These are empirical predictions. In 3 months they will either be confirmed, and the logic behind them validated, or they will not be confirmed. Between now and then specific events can alter our sense of the probability for problems up or down, but only the absence of problems on 01/01/00 can refute the case made. A. Ray seems to think this is like a political debate over the merits of something like abortion, where the issue is resolved by convincing enough people to agree with you. Convincing people one way or another has no bearing on an empirical hypothesis. The predicted event is the only relevant evidence that can be had. He can make a case for an alternative hypothesis and predict an alternative set of events if he likes. Some, such as Flint, have already done this by arguing from a different set of premises. They make the case that the probability of serious problems is relatively low. The event of 01/01/00 will tell us who is right.

In the meantime, since y2k is a historically unique event, debates such as have occurred on this forum can be the only means of establishing a probability. We, each of us, can listen to the case made and reach our own conclusions, but we are all pretty much condemned by logic to varying degrees of uncertainty. Those who don't abide by logic can, of course, achieve certainty by force of will alone. 'You know who' seems to be of this realm.

How one acts in the face of uncertainty requires a distinction between probability and stakes. (This is not original with me; it belongs to a poster here some months back. I don't recall who.) When stakes are high, like the lives of our children and families, one acts on lower probabilities than when the stakes are minimal, like a bumped knee. I have become convinced that there is a moderate probability that serious problems will occur. I therefore feel that I have a moral responsibility to act, since the stakes are so high. Imagine a disease that afflicts and kills only one in 1000 children, but there is a vaccine for $100 that would prevent the disease. The odds are low but the stakes are high. Would you pay for the vaccine? I would. If the stakes were a 2 day fever of 99 degrees, I might choose otherwise.

I personally find it incomprehensible that a person, aware of the cases made, could remain confident that the probability of serious problems is at or near zero; low enough, at least, that no action is merited regardless of the stakes. The absence of reported Jo Anne effects has given me a small measure of hope that maybe it won't be so bad. The computer at my college, an HP3000, crashed last week. The problem of Gertrude, reported on G. North's site, gives me pause for thought. I hope with everything in me that it comes to nothing, but I have to act on my best judgement. My son and I are preparing for 6 difficult months. I figure that skills from my youth in a logging wilderness will be called upon if it lasts longer than that.
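The probability-versus-stakes reasoning above can be sketched as a simple expected-value comparison. This is only a toy illustration: the 1-in-1000 odds and $100 vaccine price come from the post, but the break-even framing and the dollar figures for "stakes" are assumptions added for the sketch.

```python
# Toy sketch of the probability-vs-stakes argument.
# Assumption: acting is justified when the expected loss avoided
# exceeds the cost of acting.

def worth_acting(probability, stakes, cost_of_action):
    """Act when expected loss (probability * stakes) exceeds the cost of acting."""
    return probability * stakes > cost_of_action

p = 1 / 1000          # odds the disease strikes (from the post)
vaccine_cost = 100.0  # dollars (from the post)

# Break-even: the size of loss above which the $100 vaccine pays for itself.
break_even = vaccine_cost / p
print(round(break_even))  # 100000 -- any loss valued above $100,000 justifies acting

# High stakes, low probability: act.
print(worth_acting(p, stakes=1_000_000, cost_of_action=vaccine_cost))  # True
# Low stakes (a two-day mild fever): maybe not.
print(worth_acting(p, stakes=50, cost_of_action=vaccine_cost))  # False
```

The point of the sketch is that when the stakes term is large enough, even a small probability drives the decision.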

-- Jerry Pudelko (, October 10, 1999


Jerry P,

Interesting discussion of risks vs. stakes. It has indeed come up several times on this forum; I also used it (to no avail) in my testimony before the Senate back in May.

I would take issue with you on one small point, though: the question of whether Y2K is indeed completely unique. Let's start by addressing your response to another poster's question about wearing seat belts in your car. You responded by saying that there is a lot of historical precedent with which you can compute probabilities more accurately. But that's only valid if you assume that YOUR driving and YOUR car and YOUR driving "situation" (i.e., the other cars you'll encounter, the weather, the condition of the roads, etc.) are statistically "similar" to the historical data upon which accident statistics are based. It's theoretically conceivable that the Sunday drive you'll be taking this morning is unlike any drive that anyone else has ever made ... but most of us base our strategies on the assumption that our car is similar to other cars, our roads are similar to other roads, etc. In other words, this morning's driving experience is "deja vu all over again" -- and that means we can take advantage of the statistical data we have from millions of other drivers.

I argue that the same is true, to some extent, with Y2K. Since I wrote this up in an essay that's available on my website ( articles/y2kdejavu.html), I won't repeat the whole argument here -- other than to say that Y2K is, in the aggregate sense, a large, complex computer/software project, and thus is likely to have the same characteristics as other large, complex software projects. We've got historical data about large, complex software projects, and while it may not predict the outcome of Y2K projects with absolute precision, it's likely to give us some useful guidance.

The pollys tend to dismiss this argument as a "tired" argument (whatever that means), but I believe that the emerging data from real Y2K projects tends to support my argument. Example: a significant percentage (but less than 100%) of large, complex software projects are way over budget. In the case of Y2K, we observe that the U.S. federal government originally estimated that it would cost $2.3 billion to repair 9,000 mission-critical systems. The most recent (Sep 30) estimate from OMB is that it cost approx $8 billion to repair 6,000 mission-critical systems.
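The per-system arithmetic implied by those figures can be worked out directly. This is simple division on the numbers quoted above; the dollar amounts and system counts are those stated in the post, nothing more.

```python
# Per-system cost implied by the federal Y2K figures quoted above.
original_budget = 2.3e9   # dollars, original estimate for 9,000 mission-critical systems
original_systems = 9_000
latest_cost = 8e9         # dollars, Sep 30 OMB estimate for 6,000 mission-critical systems
latest_systems = 6_000

cost_per_system_then = original_budget / original_systems
cost_per_system_now = latest_cost / latest_systems

print(round(cost_per_system_then))  # 255556 -- roughly $256K per system, as estimated
print(round(cost_per_system_now))   # 1333333 -- roughly $1.33M per system, as reported
print(round(cost_per_system_now / cost_per_system_then, 1))  # 5.2 -- per-system cost grew ~5x
```

Note that because the system count shrank while the budget grew, the per-system overrun is even larger than the headline budget growth suggests.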


-- Ed Yourdon (, October 10, 1999.

Jerry, do you wear a safety belt when you drive?

-- space (, October 10, 1999.

Yes. The probability is moderate but the stakes are high. In this case the probability is established by an extensive historical record. Many accidents have already happened, and the relative seat-belt versus no-seat-belt risks are reasonably well established. There is no need to establish the probability by a process of debate. There is no historical record for y2k problems, so uncertainty is magnified.

-- Jerry Pudelko (, October 10, 1999.

Jerry, you're right. Got to prepare. However, some people are not capable of action that will cause pain, even when inaction will cause greater pain. Odd, but apparently true.

-- Mara Wayne (, October 10, 1999.

I'm not going to go searching the archived threads, but I know certainly that there is more than one discussing Y2K ...odds or stakes...

Diane? Chuck? You folks are coated with the dust of the archive stacks...can you assist here?

-- Donna (, October 10, 1999.

Setting the serious aspects of probability and uncertainty aside for just a second, let me mention:

Heisenberg may have slept here.

OK, back to the serious stuff.

Jerry B (no relation to Jerry P)

-- Jerry B (, October 10, 1999.

As far as waiting until 1-1 to see, I believe that the wait will be longer. I believe there will be significant visible disruptions on or about 1-1, but then we will be in the waiting game again to see what will really happen. There may be a 'bang', but the problems that must be dealt with over the longer term will be slower to show up. This will result in the DGIs claiming a quick victory, but it will only be the beginning. The second thoughts many are having now will probably not be resolved around 1-1, but will likely continue at least into the spring. Practice patience and try to go a day at a time; it may be the only way to cope......

-- BH (, October 10, 1999.


Your disease metaphor is good. Of course that vaccine is wise -- you'd be a fool not to get it. But the advisability of being vaccinated doesn't change the odds (1 in 1000) of death if you don't. Those odds remain low.

Many (most?) of the pessimists here have been arguing that death without the vaccination is 'inevitable', and that those who make a good faith attempt to determine the actual odds are (1) claiming there is no such disease in the first place; and (2) trying to talk people out of getting vaccinated. Neither assertion is supported by anything more than mindless repetition, and both are wrong.

So by all means get the vaccination. But don't claim that everyone who doesn't will die.

-- Flint (, October 10, 1999.

Sigh. There are indeed similarities between y2k and other large, complex software projects. There are also differences. If Ed Yourdon has read Hoffmeister's many posts about the difference between repairing date-mishandling bugs and doing major upgrades or new implementations, he has chosen not to notice this or take it into consideration.

Similarly, Ed chooses the federal government as his sole example of an increasing budget. Yet it's been pointed out that the government budgeting process is NOT like a private sector company. In government, you ask for as much as you think you can get, each time you can. It's an incremental process in which the first estimate has little to do with the size of the task and everything to do with the budget allocation procedure itself. In the private sector, much more concern has been expressed here with the number of companies *underspending* their y2k budgets, rather than increasing them.

Finally, Ed sidesteps a key difference between remediation and new development -- that remediated systems are being steadily returned to production, and it's not happening all at once. In the real world, redeployment of repaired systems IS incremental, and by many indications largely complete. Ask any freelance remediator.

-- Flint (, October 10, 1999.

Of course, Flint, you could address the items above to Ed directly, rather than making passive-aggressive use of the third person. Just a thought.

-- Donna (, October 10, 1999.

To Ed Yourdon: I have read most everything you have written on this subject, and the cases you and Steve Heller have made were, more than any others, the ones that led me to take this seriously. For a while the embedded systems issue had me in a real stir, but several discussions and other research quieted my fears. They deal in intervals, for the most part, and the century unit, being a constant, is simply factored out. There is neither any need nor any sense in binding them to external clock/calendar time. The problem that you have defined and set forth still remains a serious concern of mine, since all too many functioning programs deal with real clock/calendar time and perform calculations using time-based variables. I have this image of frozen CPUs coping with a value that doesn't fit the parameters of the variable, unable even to tell me that this 'does not compute'.
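The failure mode described here -- programs doing arithmetic on two-digit years -- can be sketched in a few lines. This is a hypothetical illustration of the general bug, not any specific system's code.

```python
# Sketch of the classic two-digit-year bug: a hypothetical interval
# calculation on years stored as two digits, as many pre-2000 systems did.

def years_elapsed_2digit(start_yy, current_yy):
    """Naive interval calculation on two-digit years (hypothetical example)."""
    return current_yy - start_yy

# Works fine within the century:
print(years_elapsed_2digit(65, 99))  # 34 -- started in '65, now '99

# Breaks at the rollover: year 2000 arrives as '00', i.e. 0.
print(years_elapsed_2digit(65, 0))   # -65 -- a negative interval that
                                     # downstream code may not handle
```

The negative result is exactly the kind of out-of-range value Jerry imagines a program choking on: nothing crashes at the subtraction itself, but any code that assumes the interval is positive can fail later in ways that are hard to trace.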

As to the historical vs unique nature of the event, I think the distinction still stands. In the matter of car accidents and the types of injuries that one can incur with or without seatbelts, the historical record deals with real, actual events occurring in real, actual situations. The probabilities can be calculated for all manner of subsets of this. Probabilities at 50mph vs 25mph, in ice or snow vs sunshine, are all based on real accidents at 50mph, at 25mph, in snow, etc. So it is, too, with real programming tasks, large and small, well funded, poorly funded, and so forth. Based on this historical record one would clearly have reason to predict delays and the like.

However, when you revert to an analogy you add a new factor. Now I have two considerations. One is the historical record, and the second is the validity of the analogy itself. Is this massive, systemic thing called y2k really analogous to a smaller but complex task? And does what is true of the smaller, historically known task remain true even when switched to a massive scale? Both you and Hoffmeister use this analogy to opposite conclusions. You focus on the failures, he on the successes, but the same analogy is at work. The question that I would raise is related to the validity of the analogy itself. When I use the analogy I need to integrate the complexity of scale. Complexity is not a linear function of scale. It is, if anything, an exponential function. Someplace in the magnification of scale I lose it, but my intuitive sense is that it is more likely to come out rather like you suggest and not like the case made by Hoffmeister, who seems to see it as a linear function. As I increment the scale, things like social panic, programmer panic, and systemic breakdowns begin to emerge. I don't see any of this in the historical record of smaller but complex programming tasks.
To revert to the driving case, I am now trying to calculate the probability of accidents while driving in a panicked crowd trying to escape a volcanic eruption of Mt. Rainier. So far as I know there is no historical record for this kind of driving. Are seatbelts helpful or hurtful when hot liquid lava is factored into the equation?

To Flint.

I agree the probability is not affected by any action that I take or fail to take. It is whatever it is. Nor is a probability of less than 1.00 a certainty. I know this is redundant, but it seems to be a redundancy that some fail to comprehend. No matter how certain I might feel in the matter, I could very well be quite wrong. It is true that 999 children will not die even if no one buys the vaccine. But one child will die; we just don't know whose it will be. Maybe yours, maybe mine, maybe your neighbor's, but one will die. So it is with y2k. Maybe it won't happen. Maybe it will be a disaster for a few but not for most. We still don't know which few it might be.

-- Jerry Pudelko (, October 10, 1999.

The vaccine analogy is excellent. I'd like to add one more constraint, however: if more than a few percent of people decided to vaccinate their children, there would not be enough vaccine. This makes the need to prepare now all the more obvious, because honestly, Y2K impact awareness could suddenly mushroom when we had otherwise given up on it ever happening en masse.

82 days.


-- Jack (jsprat@eld.~net), October 10, 1999.

Jerry, if you're referring to "successes" as systems implemented on time, without errors, then you miss my point entirely.

But, if you are referring to "successfully" dealing with high rates of errors currently and in the past, then you are correct.

My optimism is not based on a naive confidence in IT ability to implement error-free systems. It is based on the demonstrated ability to deal with errors and failures, with little or no long-term impact, and certainly no "cross-cascading, systemic" failures.

-- Hoffmeister (, October 10, 1999.

I normally don't intrude on these stupid arguments. (I figure it's a waste of bandwidth. After all, you guys only have to wait a few more months to find out who's got the biggest coc--er... best argument.) But I have to point out that the vaccination metaphor is not quite accurate. A better metaphor would be this: there is a 1 in 1000 chance of influenza mutating into a lethal and hitherto-unknown type that will kill a good percentage of those who come in contact with it. You can avoid this flu by getting a flu shot. Anyone who doesn't get the flu shot most likely will die. However, there is still only a low probability that the mutation will happen. (Getting higher each year, the more scientists mess around with pig and human DNA as well as cross-species transplants. And if this metaphor were true to life, the flu shots would be useless, but I digress.) Is it still worth inoculating yourself against it?

-- Typhon Blue (, October 10, 1999.
