Does the bell toll for this Bell curve? : LUSENET : TimeBomb 2000 (Y2000) : One Thread

Even those of us who have tried diligently to understand the technical ins and outs of Y2K probably also have a "feeling" about things - and not just about this issue.

I have been at odds with one of the most profound experiences I've ever had, which occurred last December '98. In short, I had a very vivid experience - that somehow the tide had turned, and that Mankind and the planet would be OK. This wasn't just in regard to Y2K. I continued my prep plans, which have been quite extensive, because the other part of me - the part that can think - could clearly see the threat that loomed. Which brings us to our current situation.

I propose this assessment. Y2K remediation is like any other activity - therefore, it follows the same rules and laws as other activities. In anything, there is a Bell-shaped curve representing the resultant output of that activity.

It seems to me that millions of systems have been correctly fixed, with only about 50 or fewer reported instances of failure. Placing those numbers on a Bell curve of "Successfully Remediated Systems" would seem to put the result at the far extreme of the positive side of the curve. I would guess at the "only happens 1 out of a million times" part of the curve.

This suggests that since successful Y2K remediation follows the same laws of the Universe as every other endeavor, the error rate we have is too low, and we will start seeing a more "probable" error rate surface in the near future.
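The arithmetic behind this argument can be sketched out. The system count and the per-system failure rates below are my own illustrative assumptions (the thread only says "millions of systems" and "about 50" failures); the point is just to show what failure counts different residual-error rates would predict, and what rate the observed count would imply.

```python
# Sketch: expected reported failures among "millions of systems" under
# various assumed per-system residual-failure rates. All numbers here
# are illustrative assumptions, not figures from the thread.
n_systems = 5_000_000          # assumed count of remediated systems
observed_failures = 50         # roughly what the thread cites

for per_system_rate in (1e-3, 1e-4, 1e-5, 1e-6):
    expected = n_systems * per_system_rate
    print(f"failure rate {per_system_rate:.0e}: expect ~{expected:,.0f} reported failures")

# The per-system rate that would be consistent with only 50 failures:
implied = observed_failures / n_systems
print(f"rate implied by only {observed_failures} failures: {implied:.0e}")
```

Under these assumed numbers, only 50 failures would mean roughly one residual failure per hundred thousand systems - which is the crux of the "that seems too good" intuition.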

The other explanation is that, yes, this rollover was an example of how a one-in-a-million (or billion) shot at something can actually happen. To me, that's on the order of an intervention by "God" - yet I'm not saying it isn't possible.

I have talked to some polly friends who seem to think, "it's because we don't put our energy into negative things (like seriously considering the ramifications of no infrastructure) that Y2K was no problem." I don't believe that as a population, we have made any "shift" of consciousness or awareness. I don't believe thinking "positive thoughts" fixes or fixed bad code.

So, like everyone else, I watch and wait, confronted by my own "positive" experience of Dec '98, and the empirical thoughts of living in the real world.

-- Gregg (, January 03, 2000.




With all due respect, the results of all activities do not fit on a bell curve. I'd guess most people successfully drove to work today, and I'll wager most of them will drive safely home. The aggregate outcome of any activity depends on multiple factors.

Software remediation is a complex human activity... but it is also generally done by professionals who have specific experience and training. I imagine my drive home would be much easier if everyone on the road were a trained professional.

Most drivers make routine mistakes... but they are rarely fatal. Software also has a tolerance level. Microsoft has conclusively proven the world can run on "buggy" software. Y2K remediation was never a question of perfection, but just of getting it "good enough."

Thus far, we seem to have gotten it "just good enough," "laws" of the universe notwithstanding.

-- Ken Decker (, January 03, 2000.

Absolutely right, Ken.

In fact, most activities are not on bell curves, but rather on curves that have no tail on one side and a long tail on the other.

The software remediation curve will be very similar, with most software having no problems or only small ones. The real question will be the area under the long part of the tail (the severe-consequence, or failure, part). If the area under that part of the curve is much more than 10% (that's a guess; it could take more or less), it could be catastrophic for our economy.

Nonparametric statistics is what we should be using to work through these problems.
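The no-tail-on-one-side, long-tail-on-the-other shape can be sketched with a simulation. The choice of a lognormal "problem severity" distribution, its parameters, and the severity cutoff are all my own illustrative assumptions, not anything specified in the thread; the sketch just shows how one would estimate the fraction of area sitting out in the long tail.

```python
# Sketch: a right-skewed distribution (no tail below zero, long tail
# above) as a stand-in for per-system problem severity, and the
# fraction of its area beyond an assumed "severe failure" cutoff.
import random

random.seed(0)
N = 100_000
# Lognormal draws: strictly positive, heavily right-skewed.
severities = [random.lognormvariate(0.0, 1.0) for _ in range(N)]

threshold = 5.0  # assumed cutoff for a "severe" problem
severe_fraction = sum(s > threshold for s in severities) / N
print(f"fraction of area in the long tail beyond {threshold}: {severe_fraction:.3%}")
```

With these particular parameters the long tail holds only a few percent of the total area; the thread's question is whether the real-world analogue of that fraction is small or dangerously large.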

-- LM (, January 03, 2000.

Well, I'm not familiar with nonparametric statistics. It just seems to me that the chance of everyone fixing everything that needed to be fixed, correctly enough to work, is very small.

What seems more likely is that most people did fix enough, but many (and now we are into HOW many, and the impact of their not fixing it, etc.) did not.

I wish I could graph this here - it would be much easier to see - but the point is: what are the real odds of everyone fixing things?

To use your driving example, Ken: aren't there 50,000 deaths per year from autos? If you divide that by the number of automobile "trips", you get a "norm", or range, of deaths per trip caused by autos. Numbers higher than the range would have a smaller chance of occurring, just as lower numbers would.

Taking it further, we have "new drivers" trying to drive on the road, and we don't have past numbers to go from (as far as remediation goes), so the "range" of deaths is unknown; but the fact that only 50 problems have been discovered just seems to put that at the upper part of the curve, which has a very low probability of happening.
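The deaths-per-trip division above is simple to carry out. The 50,000 figure is from the post; the trip count is my own rough stand-in (the post leaves it unspecified), so the resulting rate is purely illustrative.

```python
# Sketch of the deaths-per-trip "norm" from the driving example.
# deaths_per_year comes from the post; trips_per_year is an assumed
# stand-in, since the post doesn't supply one.
deaths_per_year = 50_000
trips_per_year = 200 * 10**9   # assumed: ~200 billion auto trips/year

rate = deaths_per_year / trips_per_year
print(f"roughly {rate:.1e} deaths per trip, "
      f"or 1 in {trips_per_year // deaths_per_year:,} trips")
# → roughly 2.5e-07 deaths per trip, or 1 in 4,000,000 trips
```

Whatever trip count one assumes, the per-trip rate is what gives the "norm" against which unusually high or low yearly totals could be judged - which is the analogy being drawn to Y2K failure counts.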

I don't want to argue about whether bell curves fit activities or not; it depends on what you're using as parameters and what probabilities you're looking for. Suffice to say, I'll keep my fingers crossed, hope the financial system holds together, and start trading Bonds again.

-- Gregg (, January 03, 2000.


There is a low probability "everything" was "fixed." There is also a low probability "nothing" was fixed.

The bell curve is used to describe a "normal" distribution. I highly doubt Y2K remediation activity is normally distributed. As correctly noted, it is probably heavily skewed to the positive. In short, almost everything was fixed.

The interesting question... what was not fixed and will it have any impact? This depends on what was not fixed. Obviously, "everything" to do with core infrastructure is functioning. This obviates most of the "end of the world" arguments. Now, we are left to see what chronic problems occur and assess the impacts.

Returning to your automotive example: in recent years the number of accidents has increased, but the number of fatalities (per mile traveled) has fallen. This may be due to a number of reasons, including improved safety technology.

You are correct in observing that this number does not change radically from year to year... but it is also not a "one time" event like Y2K remediation. If we had to do Y2K remediation every year, chances are you'd eventually see elements of central tendency, though likely with the positively skewed curve I described earlier. Since Y2K remediation is essentially a "one time" event, we have no ability to determine whether the 1999 success is typical or atypical.

Statistical analysis has its own rules. What you are describing, Gregg, is your intuitive sense that we might see more problems. Your intuition may or may not be correct, but it is dangerous to use the language of statistical analysis without the data or tools.

-- Ken Decker (, January 03, 2000.
