The exchange of ideas from my earlier post on this subject has helped me go a step further in using the "Probability of Success" approach, together with the systemic nature of y2k, to develop a model that demonstrates the huge threat just ahead of us. First, allow me to recap the earlier post:

There have been a number of posts that analyze the probability of systemic failure by concentrating on a few key industries and postulating that if any one of them were to suffer a critical failure, economic paralysis would result. While this thesis may indeed be true, the debate gets diverted to the actual probabilities of success one ascribes to each of these key industries. The net result is that the debate quickly bogs down into a futile number-crunching exercise, because these probabilities are essentially unknown. I think the probability-of-systemic-failure calculation may be of more practical value if we instead search for those independent economic (and even social) activities that form essential "spokes" keeping the economic wheel balanced and spinning without a wobble. Just as a wheel can keep turning after losing a few spokes, it would nevertheless become unbalanced and begin to wobble, which would quickly lead to catastrophic physical (and, for this example, economic) failure.

The benefit of approaching the calculation this way is that we put the greatest emphasis on the SYSTEMIC nature of the problem which, in fact, goes to the heart of y2k, and the least emphasis on the probabilistic aspects which, I think many would agree, are not well understood. Then, by assigning very optimistic success probabilities to each of these "spokes," it can be shown that we are facing economic deterioration and eventual collapse, because the risk of having at least one breakdown in a key economic (or social) sector would still remain high. There are only two selection criteria for picking an economic (or social) activity to become a "spoke": it should be independent of all the others, and it should be considered a vital economic activity. Given below is my guess as to which activities would qualify, together with my optimistic "Success Probabilities." Note that the chance of at least one critical failure is 62%. If I assign my "best guess" probabilities instead, that number rises to over 99%.

1) Financial Sector (overall 92.17%) derived from banking 98%, Fed wire transfer 99%, tax proceeds (IRS/Customs) 95%

2) Power Sector (overall 96.05%) derived from generation 98%, transmission 99%, distribution 99%

3) Telecom (overall 98.5%)

4) Food Production (overall 98%)

5) Water Delivery (overall 98%)

6) Waste Disposal (overall 97.02%) derived from sewage 98%, solid waste 99%

7) Transportation (overall 88.52%) derived from rail 95%, truck 98%, surface docking facilities 98%, air 98%, pipelines 99%

8) Government Oversight (overall 91.23%) derived from military/national guard /police/fire protection 99%, FAA 97%, all other support such as SS, Medicare, Medicaid etc. 95%

9) Extraction and Mining (overall 96.06%) derived from coal 99%, gas 99%, oil (domestic) 99%, metals 99%

10) External Factors (overall 66.34%) derived from oil importation 95%, foreign banking 95%, foreign manufacturing 95%, foreign air traffic control 95%, foreign docking facilities 95%, foreign surface shipping 95%, foreign ground shipping 95%, other foreign critical raw material imports 95%

11) Public Support of Infrastructure (overall 91.75%) derived from urban rioting 97%, bank runs 97%, job lockout (from bankruptcies or fear of leaving home) 98%, war/terrorism/series of natural disasters 99.5%

OVERALL PROBABILITY OF AT LEAST ONE CRITICAL FAILURE: 61.6%

Now let's remember that this model represents a "snapshot" of the probabilities caused by the SYSTEMIC nature of y2k. This result does not rule out the possibility that some repairs will be made. After all, with every passing day the world is incrementally closer to fixing all y2k problems. But the key question is:
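For readers who want to check the arithmetic, the model above is simply one minus the product of the sector "Success Probabilities," each of which is itself the product of its sub-activity values. A minimal sketch in Python (the numbers are the optimistic values from the table; the grouping mirrors the "derived from" notes):

```python
# Sketch of the "spoke" model: each sector's overall Probability-of-Success
# is the product of its sub-activity probabilities; the chance of at least
# one critical failure is 1 minus the product of all sector probabilities.
from math import prod

sectors = {
    "Financial":       [0.98, 0.99, 0.95],              # banking, Fed wire, tax proceeds
    "Power":           [0.98, 0.99, 0.99],              # generation, transmission, distribution
    "Telecom":         [0.985],
    "Food Production": [0.98],
    "Water Delivery":  [0.98],
    "Waste Disposal":  [0.98, 0.99],                    # sewage, solid waste
    "Transportation":  [0.95, 0.98, 0.98, 0.98, 0.99],  # rail, truck, docks, air, pipelines
    "Government":      [0.99, 0.97, 0.95],              # military/police/fire, FAA, SS/Medicare etc.
    "Extraction":      [0.99] * 4,                      # coal, gas, oil, metals
    "External":        [0.95] * 8,                      # the eight foreign dependencies
    "Public Support":  [0.97, 0.97, 0.98, 0.995],       # riots, bank runs, job lockout, war/disaster
}

overall = {name: prod(p) for name, p in sectors.items()}
p_all_succeed = prod(overall.values())
p_at_least_one_failure = 1 - p_all_succeed

print(f"P(at least one critical failure) = {p_at_least_one_failure:.1%}")  # ~61.6%
```

Note that multiplying the sector values this way assumes the "spokes" fail independently, which is exactly the simplification the model makes; dependencies among sectors would change the number.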

Given the overwhelming probability that there will be critical economic failures as we move into the next millennium, will the TREND of the "Probability-of-Success" FOR THE ECONOMY AS A WHOLE worsen or improve AFTER the new millennium begins?

I submit to you that if I were to prepare a new probability data table in January 2000 similar to the type I have shown above, the average of the "Probability-of-Success" values in that table would be LOWER not higher for the following reasons:

1) Nonproductive segments of our economy (meaning military, police, and firefighting, which are expense items) would be mobilized at a much greater rate than normal for crowd control and security, creating a much higher than average economic drain.

2) An initial economic dislocation of the magnitude I foresee would have a significant impact on bankruptcies, worker productivity, and job absenteeism, all of which lower output.

3) To the extent that y2k hits computers directly, productivity will be reduced substantially ("just-in-time-inventory-control" is but one example).

4) Trade and commerce will slow down drastically, particularly those elements dependent upon overseas goods (which affects just about everybody these days). Thus, vast sectors of our economy will be extremely vulnerable to failure.

5) Consumer confidence is likely to take a drastic hit (stock market plunge) which will restrict economic activity, raising unemployment and reducing corporate earnings.

6) Even a FEW key corporate/government agency failures are likely to have a domino effect on the rest of the economy ("division-of-labor" argument).

7) The ability to continue fixing y2k problems will be substantially more difficult because programmers will be forced to work under "fix-on-failure" conditions -- a repair mode with HUGE disadvantages.

8) Even successfully operating contingency plans will significantly reduce corporate/government agency productivity (by definition, at least) which will further hamper economic activity.

There are many more reasons I could list, but please don't miss the point here. While no one can predict the EXTENT to which the items I've enumerated will reduce these "Success Probabilities", they WILL BE REDUCED and to argue otherwise puts one in the wishful thinking camp.

There's another way of looking at the entire y2k enigma that should give us further insight into the probable trend of the "Success Probability" for our economy as a whole. If the Social Security Administration needed 11 years to get their INTERNAL systems y2k compliant and tested (according to them, anyway), and Kemper Insurance, which has been using a 4-FIELD DATE since 1987 and has been working on remediating their 30 million lines of code since the spring of 1996, has yet to announce that they are compliant, what are the chances that the rest of the governmental agencies and other corporations will be compliant in January 2000, or even January 2001 for that matter?

Considering that I haven't even mentioned the embedded chip problem, or the tens of millions of data-exchange bridging and filter routines that will have to be designed, tested, and implemented AFTER EACH ENTITY HAS THOROUGHLY TESTED AND SIGNED OFF ON ITS OWN INTERNAL SYSTEMS, I think it's clear that our economy could come to a screeching halt with nothing left to say but TEOTWAWKI HAS ARRIVED.

-- Dr. Roger Altman (, January 22, 1999.


(Thinking aloud, "Roger, what am I gonna do with you?")


A lot of common sense suggests resilient communities.

(Did I say, "Resilient Communities?")

Oops, I gotta go....


-- Critt Jarvis (Wilmington, NC) (, January 22, 1999.

The probability of at least one failure is 100% (much more than your number). We've already experienced a sector failure in the insurance industry. They failed in 1990. Did it bring TEOTWAWKI? No. They fixed their problem. Your extrapolation does not represent common sense.

Troll Maria

-- Maria (, January 22, 1999.


okay, but what about the 99 percent of the world that will not be part of a resilient community?


-- Arlin H. Adams (, January 22, 1999.


Whatever happened in 1990 was NOTHING like the confluence of economic forces that will be brought to bear simultaneously ON ALL SECTORS OF THE WORLD ECONOMY as well as the social reaction to them (ALL of which have a habit of feeding on themselves) at the turn of the millennium.

-- Dr. Roger Altman (, January 22, 1999.

Do you know that through math you can "prove" that we do not exist? Let's say N = the total number of intelligent beings in the universe. The universe is infinitely large, therefore there are an infinite number of planets. To figure out the average population of each planet, divide N by infinity, which is effectively zero; therefore you and I do not exist.

-- MAP (, January 22, 1999.

oh my

-- ivisable (, January 22, 1999.

Dear Roger:

Thanks for your excellent analysis. To add to it, I would like folks to consider the embedded chip problem. Last September Mr. David C. Hall, an embedded chip expert, testified before Congress. Mr. Hall said that to ensure Y2K compliance, every critical embedded chip in the world must be tested. These chips cannot be "type tested," as there were no standards used to program them. There are billions of embedded chips that must be tested. He testified that anywhere from 1% to 15% exhibit some type of Y2K impact, ranging from minor to catastrophic. As of September of last year, fewer than 10% of enterprises in the world had begun any serious testing. His greatest concern was that there was not going to be enough time left for testing.

Now let's consider the consequences of not performing complete worldwide business testing and remediation. Let's say enterprises now have no choice but to fix on failure. The CIA is concerned that Saudi Arabia and other Mid-East countries won't be Y2K compliant. As well they should be. Just for fun, let's say a major oil company has an embedded chip in a wellhead to measure oil flow. Let's say it is not Y2K compliant and has a date function (one the engineers don't know about, since it is a "one size fits all uses" chip of the kind many manufacturers sold over the last thirty years). At Y2K rollover the chip fails and stops oil flow at the wellhead. Engineers investigate and fix the problem. Oil flows again; then an embedded chip fails at one of the separation units downstream of the wellhead. Engineers investigate and fix. Then a problem occurs in the oil pipeline: an embedded chip used to measure flow fails and shuts down the flow of oil. Investigate and fix. Failure at the tank farms. Investigate and fix. Failure at the shipping terminal at the port. Investigate and fix. Failure on the tanker. Investigate and fix. Oh, but you say, surely they won't plan to do it this way. Maybe they will and maybe they won't. They really don't have many good choices. If they placed all of their systems in an outage mode and tested all chips, they would lose years of revenue. If they fix on failure, there is a chance that maybe, just maybe, there won't be that many failures.


-- Bill Watt (, January 22, 1999.

If I had to do it over again, I would have ended my analysis differently. So patch the following to what I said above.

These results (the probability of TEOTWAWKI) are so one-sided that they call into question the rationale for continuing y2k remediation efforts at all. It seems to me that the huge amount of money and time currently devoted to y2k remediation could be much better spent by redirecting our efforts to disengaging computer systems from the 'network' and re-instructing them to work independently of one another. This approach would offer some degree of insulation from total economic collapse AND provide the building blocks for re-connection down the road, when economic stability is re-established.

I know this idea is infinitely easier to express than to execute, but at least we would be working TOWARDS a functional system in the future. The way things are moving now we may be wasting invaluable resources that could mean the difference between ultimate economic and social survival and total collapse.

-- Dr. Roger Altman (, January 22, 1999.


Interesting position. It does, however, miss the fact that our failures NOW are separate, distinct, not cascading against others, and spread along a nice wide time line. We have time to pull people to an area, fix the problem, and go to another area. Compress the time line and cascade the failures and, while I don't like the model above, I suspect you might come to a similar conclusion. (I have my own problems with any analysis which includes "a little common sense," as this is the place the flaws enter into the analysis.)


-- Chuck, night driver (, January 22, 1999.

geez...with Dr. Altman and Mr. Milne floating around, who needs Jekyll & Hyde...

-- a (a@a.a), January 22, 1999.

Roger: What is your educational and professional background?

-- a (a@a.a), January 22, 1999.

Dr --- are you familiar with Harlan Smith's "austere infrastructure" paper? Your patched conclusion sounds like it is along a similar line. But, as you probably agree, this will be a post-Y2K effort and, probably, the way all future systems are designed.

Re your post-Y2k analysis with which I thoroughly agree. Carmichael (I think it's him) has postulated a post-Y2K featuring 'x' degree "viscosity" (things still happen but very s ... l ... o ... w ... l ... y). Very apt metaphor, almost at the reverse edge of Y2K going exponential but same point: if viscosity reaches a certain point, the entire system locks up and seizes.

Question: I missed your first thread, sorry. Is the probability of critical failure meant to refer to failure of a spoke, or of the whole wheel?

Thanks for putting so much work into this and ignore cavillers. Of course, the math is speculative! But it's no more speculative than the meaningless compliance percentages that everyone loves to fight over. Less so, probably.

-- BigDog (, January 22, 1999.

Roger - I could pick at your math all day and get nowhere. But your analysis does assume there exists only ONE water system, ONE electric system etc. This gives an incorrect result.

In other words, if say 5 municipal water systems in the US have critical failures - this does not mean no one has water in the US. If 19 electric generation units fail - this does not mean no one has electricity. Your analysis fails to convince because of this hidden assumption.

-- Paul Davis (, January 22, 1999.

Big Dog and Paul Davis:

Thanks for the commentary and criticism. Without either one I wouldn't bother to post at all. I think Paul's criticism is excellent, but allow me to give you my 'take' for what it's worth.

Let's take the "sewage system" as one of the many possible "spokes". No doubt, we have many, many independent sewage systems. So would a 95% (to use a number) "Probability-of-Success" have any meaning?

I believe that everyone would agree that they would be much more comfortable having thousands of INDEPENDENT sewage companies, each with a 95% probability of functioning at the turn of the millennium, than having a SINGLE sewage company with 95% operational reliability. After all, in the latter case there's a 5% chance that the whole country could get backed up, so to speak. If the whole country were without sewage disposal for even a short time (and the model assumes a CONTINUOUS 95% reliability), then disease would spread very quickly throughout ALL urban and suburban areas. In this case, the threat of economic collapse would be the least of our worries, since our very lives would be threatened from the start.

So while it is true that the 95% "Success Probability" really refers to the thousands of INDEPENDENT sewage plants throughout the country, the loss of 5% of them would be more than enough to create an imbalance in the (economic) wheel, causing repercussions far outside the affected areas (not only due to spreading disease, but also due to our computer-based communication, supply chain, and other interdependencies). Couple this impact with a kaleidoscope of simultaneous problems (which is what the model is focused on -- a "snapshot" of "Success Probabilities" at the turn of the millennium), and I think you might conclude that, although far from perfect, the model does give us a modicum of insight into life a scant 11 months from now.
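The distinction being argued here can be made concrete with a little probability code. This is a hedged sketch only: the 95% figure comes from the discussion above, but the plant count of 2,000 is purely illustrative. With many independent plants, a simultaneous nationwide outage is essentially impossible, yet a roughly 5% loss of plants is all but guaranteed:

```python
# Contrast: one national sewage system at 95% reliability versus many
# independent plants, each at 95%. (The plant count is illustrative.)
import math

n_plants = 2000
p_success = 0.95

# Single national system: a 5% chance that EVERYONE loses service at once.
p_total_outage_single = 1 - p_success

# Many independent plants: a total outage needs every plant to fail.
# 0.05**2000 underflows an ordinary float, so work in log10 space.
log10_p_total_outage = n_plants * math.log10(1 - p_success)  # about -2602

# The expected number of failed plants is still 5% of the total,
# so scattered local failures are all but certain.
expected_failures = n_plants * (1 - p_success)

print(f"P(total outage, single system)      = {p_total_outage_single:.2f}")
print(f"P(total outage, independent plants) = 10^{log10_p_total_outage:.0f}")
print(f"Expected failed plants out of {n_plants}: {expected_failures:.0f}")
```

The point of the exchange is that both failure modes matter: independence removes the total-blackout risk but makes the scattered 5% loss, and its knock-on effects, a near certainty.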

-- Dr. Roger Altman (, January 22, 1999.

There's another way to look at the point you brought up. The "Overall Success Probability" value corresponding to a specific economic sector (that I gave in my initial post) is actually the product of individual operations throughout the country and their dependencies. Each operation (depending on how well it and its support network performed their individual y2k remediation programs) would then have its own "Success Probability" value. Even though it would be impossible to know this number for each and every operation, it could be estimated from a properly designed sampling program. I know this probably will never be done in the short time remaining, but at least the premise of the model is not at fault. If that part is true, then the instructional value of the model far outweighs our lack of specific "Success Probability" values, because the model demonstrates the systemic nature of y2k and the crippling effects it will have on the economy.

-- Dr. Roger Altman (, January 22, 1999.

I couldn't find your other post but do you at some point explain this mathematical model? If you assume dependencies among these systems, does your model account for these dependencies? Or are they stochastically independent? How do you find at least one failure? Are you using the binomial?

I've gone through many iterations of these probability calculations, down to the point of analyzing bank failure based on the number of banks converted and a certain probability of failure for those converted and those not converted. I used the normal distribution (which can be used for large n, in this case 8000 banks). I also went as far as computing the probability of exactly one failure, exactly two failures, and so on. The obvious answer: of course there will be at least one failure. The main question is, at which point does the infrastructure break down? Yes, they are all interconnected, but how much does it take to bring down the system of systems? I have come up with all the numbers, but unless you can answer this basic question, the numbers are meaningless.
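Maria's bank calculation can be sketched in a few lines. This is a hedged illustration only: the per-bank failure probability of 1% is a made-up number, and the normal curve is the standard large-n approximation to the binomial she mentions:

```python
# Binomial model of bank failures: n banks, each failing independently
# with probability p. (p = 0.01 is purely illustrative.)
import math

n = 8000
p = 0.01

# P(at least one failure) = 1 - P(zero failures); for these numbers it is
# indistinguishable from 1.0, which is Maria's "obvious answer".
p_at_least_one = 1 - (1 - p) ** n

# Normal approximation to the number of failures (valid for large n):
mean = n * p                        # 80 expected failures
sigma = math.sqrt(n * p * (1 - p))  # ~8.9

print(f"P(at least one bank failure) = {p_at_least_one:.6f}")
print(f"Expected failures: {mean:.0f} +/- {sigma:.1f}")
```

As she says, the arithmetic is the easy part; the model says nothing about how many failures the system of systems can absorb before it breaks down.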

I have another problem with the definition of failure. Hence, my comment on insurance companies. Failure in my mind is uncovering a problem which can be minor or critical; it doesn't mean the business or its service goes away. Americans are inventive and can come up with workarounds.

Troll Maria

-- Maria (, January 22, 1999.

(2nd request) Roger: What is your educational and professional background? uh, if you don't mind?

-- a (a@a.a), January 22, 1999.

Dr. Altman - I have read your posts; I go mind-numb, numbers are not my forte, sorry :-), but I get the message loud and clear. As others have pointed out, you are using another vector to arrive at about the same place.

I positioned myself as an InfoMagic 10 minus a nuke war on Yourdon a couple of weeks ago in the Short/Long Preparation thread. But since then I've reassessed. The following is a short excerpt from some private email:


Assumptions, ie. High probability items.

The following are for just this country but hold for all the national and private players possessing these materials and systems:

I'm assuming at least one chemical and one bio warfare snafu which will be deadly. I'm assuming at least one nuke final processing plant will have a deadly accident, and at least one long-term storage facility will experience a lack of maintenance resulting in a waste spill of magnitude. I'm assuming that at least one nuke plant will go deadly. I'm assuming that there will be more than one accidental launch of nuke war material. I'm assuming there will be at least one major deadly accident involving nuke sub reactors (lower probability domestic, higher foreign). I'm assuming that there will be more than one riverine system, and associated ocean and current, which will be deadly contaminated by nuke accidents.


I assign these as assumable probability because of the complexity of the systems, the absolute numbers of material involved, and that "things happen" even in the best of times. These just happen to focus upon war machinery.

Certainly, within any life support systems web, the interconnection density enhances the chances of system failures cascading into a spreading failure mode. The numbers required for this mode are not particularly material, being overshadowed by the density factor.

I've been asking this question for over a year now, and asked it earlier today on a thread on Yourdon. One assumes that cascading failure modes of the life support systems will ultimately, on their own, generate civil disorder. Let's assume the population remains calm until then, for the sake of clarity in this question. People write about y2k recovery using timeframes ranging from a few weeks to generations. These recovery timeframes have hidden assumptions. The ones normally talked about, such as electricity, telephone, transportation, and commerce, need no elaboration. The more hidden assumptions involve conservation of the knowledge and skills base upon which recovery is dependent.

If the recovery timeframe is a few weeks, or a few months, it would be fair to assume that knowledge and skills would be conserved, in the main. Civil disorder contained with no large scale burning of cities, no large scale death due to disease or violence. Maybe a factory, library or random University sacked, or a few key personnel gone, but overall - recoverable.

Yet, if one projects out past the fall harvest time 2000 and recovery isn't underway then, problems arise. Chances are, in that scenario, that civil disorder was more intense, chances are that the usual Iron Triangle group were not able to get back up sufficiently. Chances are that the harvest will have problems, either at the field, or ultimately, distribution to the populations. Under those circumstances, conservation of the knowledge bases and skill bases will begin to erode. More factories, libraries, industrial centers, and skilled labor will be trashed or dispersed.

Following this logic train out to several years of disorder, one could conclude that erosion of too many factories and too much of the knowledge and skills base had proceeded, preventing recovery. Even if one assumes under five years of low- to mid-level disorder, there would surely be an erosion of knowledge and skills, and there would surely be many closed, inoperable facilities.

Harlan Smith advocates the Austere Infrastructure. What are the Mission Critical systems absolutely necessary for recovery in one, two, five, ten, or a generation of years? The longer the recovery, the less likely knowledge and skills will remain clumped and usable, or even in existence.

It would seem that named Mission Critical systems are completely dependent upon subjective estimates of Recovery Time.

MC systems chosen for a one-year expected recovery will very likely not be the MC systems chosen for a 5-year expected recovery, and definitely will not be the MC systems chosen for longer expected recovery timeframes.

The question is: past what point does infrastructure loss and dispersal of the knowledge and skills base negate possible recovery into a Life Support system resembling the one in which we are now embedded?

Is it safe to Contingency Plan assuming two or under two year recovery timeframes? If one assumes longer timeframes, what would you suppose alternate courses of present action be, not only in Contingency Planning, but in Long Term Planning for a post y2k world?

Thank you.

-- Mitchell Barnes (, January 22, 1999.

I see lots of questions. Lots of questions. That's good. OK, as for my background, I have a doctoral degree in engineering science and have been programming off and on for about 30 years.

The model calculates the probability of success of ALL mutually exclusive economic sectors and social events (without doubt there ARE heavy dependencies among ALL economic sectors, but, at the same time, there are INDEPENDENT triggers that can cause each sector to fail on its own). So 1 minus this probability equals the probability of AT LEAST one failure. The model assumes no dependencies, and I do not use a binomial. It is assumed that even one critical failure will EVENTUALLY cause the infrastructure to break down.

Yes, we Americans are hard working and creative. Believe me when I tell you we will need all of our skills and talents to survive this debacle I see coming. As far as contingency plans are concerned, I've already described the best course of action IMHO. And certainly, through statistical sampling and timely re-evaluation, the model can help us monitor our present economic state as well as its trend. Eventually, there WILL BE LIGHT AT THE END OF THE TUNNEL, and, who knows, the model may be a handy tool to spot it as early as possible.

-- Dr. Roger Altman (, January 22, 1999.

Roger, excellent analysis - check out the following thread, which offers a mathematical proof of the Infomagic scenario. What do you think???

mathematical proof of infomagic here

-- Andy (, January 23, 1999.

For Troll Maria - one insurance company? We are talking about systemic failures Maria - get a grip...

"There was a popular game this past decade called Jenga. Fifty-four wooden blocks, approximately ½" X 3/4" X 4", are arranged three side-by-side, with three more side-by-side on top of the previous three, but perpendicular to the layer below. The third row is perpendicular to the second row, and this pattern continues all the way to the top, eighteen stories. Once the tower is assembled it's about 14" high. The object is for players to take turns removing one block at a time from any place in the tower and placing the removed block back on top of the tower without toppling it in the process. The last person to remove a block without knocking the thing over wins. According to the makers of the game, expert players can transform the 18-layer tower into 36 layers (obviously there are lots of holes). From my Jenga experience, usually about 20 to 30% of the blocks are removed and replaced before the tower crashes onto the living room table.

Jenga is a good analogy to how Y2K relates to the division of labor. Let each block represent 2% of the work force, or economic players if you will. If one block is successfully removed and placed on top of the tower without toppling it, assume, for the sake of argument, that represents 2% of the economy undergoing computer problems. As the tower generally stands strong in Jenga when only one block has been removed and replaced, so does the economy stay strong when only 2% of the participants are awash in computer downtime. Two blocks removed? Strong tower, strong economy. Only 4% of businesses struggling with computer problems. And so it goes. But eventually the tower crumbles. Just prior to the crumbling, the tower appeared, for all practical purposes, as strong as the Rock of Gibraltar. And such is the concern with Y2K. We see a tower standing but don't care to consider the significance of the removed and replaced wooden blocks. In Jenga, if 51% of the wooden blocks are removed and replaced, 95+% of the tower falls, not just the 51% of wooden blocks removed and replaced.

And again, in Jenga, if 30% of the blocks are repositioned, that usually seals the fate of the innocent 70% of untouched blocks (i.e. businesses, services, economic players). What doesn't happen in Jenga, and won't happen with Y2K, is untouched blocks, or businesses with functional computers, going unscathed when the minority of moved blocks reach the crumbling point, or, in the case of businesses, when noncompliant computers begin to crash. That's the impact of the division of labor. You could say the exposed surface area of a Jenga block is representative of the vulnerability or decline in a business. The greater the block's exposure, the greater the damage to your business. You may be standing, but you're weakened. If 25% of businesses have computer problems come 1-1-00 and you're not one of the 25%, you can rest assured your exposure and vulnerability parallels that of a Jenga tower that's up to twenty-seven layers or so.

Remember Deep Blue? The computer IBM designed to take on Garry Kasparov, the world chess champion, at his own game? If my memory is correct, Deep Blue won the match. How would Deep Blue have done if its programs were flawed? It wouldn't move, or if it did, it would move unwisely and lose. Running smoothly, Deep Blue, programmed by men with much less acumen in chess than Kasparov, was able to beat the champion at his own game. Let's pretend Deep Blue has crashed and has to be run manually. The IBM men and women in white coats come running out from behind the curtain; they pool their knowledge and pore over hard copies of the programs they designed for Deep Blue. How efficient would this be? How long would it take the IBM eggheads to move their pawn if their level of competition remained equal to that of a fully operable Deep Blue? You might say this game would last for years.

An optimally functioning economy is dependent on Deep Blue, or the division of labor, working at normal operating speed. What evidence is there that a work force with 15% of its players simultaneously hampered with computer problems can produce as well as Deep Blue operating without a hitch? There isn't any. What evidence is there that Deep Blue could beat Kasparov if 15% of its programs were removed or shut down? Kasparov would smear Deep Blue, meaning the 85% of good programs still functioning in ol' Blue would be sent to computer hell with their tails between their programmable legs. Similarly, in our hypothetical example, the 85% of businesses not hampered by computer problems will take some severe shots, directly and indirectly, from the faulty members of the economic division of labor.

Understanding the division of labor is key to interpreting Y2K news, most of which is vague, bad and/or useless to begin with. Remember Jenga. Remember Deep Blue."


Two digits. One mechanism. The smallest mistake.

"The conveniences and comforts of humanity in general will be linked up by one mechanism, which will produce comforts and conveniences beyond human imagination. But the smallest mistake will bring the whole mechanism to a certain collapse. In this way the end of the world will be brought about."

Pir-o-Murshid Inayat Khan, 1922 (Sufi Prophet)

"We're doomed I tell ye, doomed!"

Private Frazer, Dad's Army, Walmington-On-Sea Home Guard, 1939 (Undertaker)

-- Andy (, January 23, 1999.

Damn! I'm spending my whole day on this forum, and I've got work to do outside! Mental interplay is addictively stimulating, and much more interesting.

The picture I get in mind with Roger's math and Infomagic's scenarios is a curve of recovery, upward or downward, something like a body under bacterial or viral assault. The question we are asking ourselves is: How will the societal immune system stand on January 3? (Remember -- there are no external "medicines" to be applied to the disease, no resources outside our own threatened "civilization".)

Systems might also be evaluated as to their contribution to the recovery mechanism, vs. "merely" keeping individual human bodies alive and fed, much as we'd all like to be the beneficiaries of such a prioritization. (Extreme examples: Sewage systems break down, but FEMA keeps FOF programmers alive and coding in sanitary workcamps. -- **Oh please, oh please don't blame poor l'il me if such folly comes to pass!!!** -- Or: Wired consumer telecom breaks down, but GOVT finds compliant wireless systems to take over to help nuke plants stay up. etc.?) Gov't emergency planners right now would not be doing their official jobs if they are not at least pencilling in such societal triage plans.

Thoughts based on Roger's math: If the math showed each system at 95% functioning -- even with the probabilistic multiplication going below 50% -- I think the bodily "immune system" (including public psychology in a positive feedback loop) would likely curve upward toward recovery. The math so far just doesn't give us the true dynamic picture. Things feed into each other in a less-deterministic manner than a single formula would give you. But it IS a starting point.
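To make the compounding concrete, a few lines of arithmetic show how fast independent 95% figures multiply down. (The flat 0.95 and the sector counts here are illustrative assumptions of my own, not anyone's actual estimates.)

```python
# Illustrative only: if n independent sectors each succeed with
# probability 0.95, the chance that ALL succeed is 0.95**n, and it
# slips below even odds (50%) once n reaches 14.
for n in (5, 10, 13, 14, 20):
    p_all = 0.95 ** n
    print(f"{n:2d} sectors: P(all succeed) = {p_all:.3f}, "
          f"P(at least one fails) = {1 - p_all:.3f}")
```

So even granting every sector a 95% chance of success, roughly fourteen independent "spokes" already give worse-than-even odds that all of them hold -- which is exactly why the multiplication alone looks so grim, and why the dynamic response to a failure matters as much as the raw product.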

On the other hand, one major item -- like the electric grid or agriculture/distribution going below 50% and possibly feeding back on itself (immune systems within the larger immune system) to lower levels -- could disproportionately negate the others staying at 95%, and then the multiplication would be more accurate. But one of Roger's multiplication results at 35% might produce recovery, another at 40%, failure. It depends, at least superficially, on the weightings given each sector, and on subparts within each reflecting their contribution to that sector's score. (Example: the electric grid could be regionalized, while banking and telecom have more nationwide effects.) DYNAMIC SYSTEM MODELING -- anyone familiar? I've heard that weather-forecasting systems challenge the computing capabilities of our current technology -- something like that is being attempted here. And economists have labored long and hard to program computers to predict the economy (and stock market). HA!
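The feedback idea can be caricatured in a few lines. This is a toy model with made-up numbers -- the 0.5 tipping point and 0.3 coupling are arbitrary assumptions, not measurements -- meant only to show how a sector's "health" can spiral up or down depending on which side of a threshold it starts on:

```python
def simulate(h0, coupling=0.3, steps=40):
    """Toy feedback loop: health above the 0.5 tipping point drifts
    toward recovery (1.0), below it drifts toward collapse (0.0).
    All parameters are arbitrary illustrations."""
    h = h0
    for _ in range(steps):
        h += coupling * (h - 0.5) * h * (1 - h)  # self-reinforcing term
    return h

print(simulate(0.6))  # starts above the tipping point -> climbs toward 1
print(simulate(0.4))  # starts below it -> sinks toward 0
```

Two starting points only 0.2 apart end up on opposite trajectories -- the nonlinearity a single multiplication formula can't capture.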

(Lest it be thought I'm suggesting the use of computers to model the computer-driven breakdown of computer-dependent society, I'm more suggesting we use the computer between our shoulders and the wise intuitions that pass through it daily. Also, the likelihood of our "deliberations" here or anywhere getting put into use is pretty negligible.)

One discussion design suggestion: sectors whose breakdown effects are transmitted PRIMARILY through economic means, i.e., the production of a Depression / deflation / unemployment etc., should be viewed differently (as "Secondary" effects?) than ones that have IMMEDIATE systemic effects through physical cutoff ("Primary"?) of vital utilities and life-support functions. A lot of Doombrooder argument flies out to grab these economic effects when it is unable to pin down likely effects in the latter, "Primary" grouping.

Infomagic's scenario has the overall interaction of societal functions going low enough to damage the immune recovery system (though I question whether the loss of population alone does this). The curve goes negative. We thus "go Infomagic". The study of the many critical pathways to Infomagic breakdowns will occupy us with fascinating regularity here over the months ahead. Some will be more valid than others, but our "facts" to plug in will probably come in a maddening jumble at year-end. So enjoy this "leisure" now while you can. (Perhaps someone could create a publicly-readable, post-by-invitation-and-filtered-posts-only INFOMAGIC FORUM website?)

Figuring out the various failure scenarios is a crisis-engendered crash course in studying the interplay of societal systems we have daily taken for granted. THAT is what is most fascinating about y2k for me, and probably for many of you. We're hoping for an A on the final exam, but we'll settle for a P on a pass/fail.

The study presupposes an attempt by each of us to decide HOW MUCH to personally prepare, and for what level to prepare. But most of us will in the end just have to guess the level, and, as in buying homeowner's insurance, you don't have to actually BELIEVE your house will ever burn down to go ahead and get it insured. You just have to know that houses do burn down, and it is POSSIBLE for it to happen to yours.

The _usefulness_ of our study, in the long run, will be in helping direct recovery efforts in the most beneficial directions if y2k 7-10 happens, and in helping society decide if it wants to continue living under such fragility if y2k is essentially a bust.


-- (, January 23, 1999.

And I vowed myself I wouldn't start ranting into long posts until I had preparation all done! See what a little lurking can lead to? It's all about the search for good companions, which I think I've found here. It was a lonely-enough world BEFORE y2k issues came along!

There... I feel better already. Ready to read that generator book.

(Maybe we'll have to thread our y2k forums two different directions -- one for education/information sharing, the other for therapy. Need 'em both.)

I gots me a headache now.

-- (, January 23, 1999.

Jor-el, please stay addicted ;-) you've got me thinking about some new, valuable scenarios. Keep posting! Thanks :)


-- Leska (, January 24, 1999.

Jor-el ---- superb post, especially re the need to make our "infomagic" models more sensitive to the dynamic and varying impacts of how different sectors behave relative to the whole.

Hmmm ... I have capability to set up a filtered, "serious", by-invite-only forum ASAP. To me, this modeling isn't just wool-gathering: it relates to preparation this year, response next year and, if we're lucky, helping lead a post Y2K recovery that is intelligent. Let me think about this ..... if there are any interested parties, email me so we can discuss off-line.

-- BigDog (, January 24, 1999.


I checked out the Infomagic Y2k mathematical model as you asked. As a matter of fact, I had read one of these articles about a month ago which formed the basis for MY take using THE SAME underlying mathematical approach. All I have done is try to be as practical as possible by STRUCTURING this model in such a way as to minimize the impact of WAG (wild-ass guessed) "Probability-of-Success" values while preserving the SYSTEMIC nature of y2k by hand-picking industries and social events that, even individually, will (eventually) have a catastrophic economic impact if any one were to fail.
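The spoke structure reduces to one line of arithmetic: the chance of at least one critical failure is one minus the product of the per-spoke success probabilities. The spoke names and values below are placeholders for illustration only, not the actual table from the earlier post:

```python
from math import prod

# Hypothetical "spokes" with optimistic success probabilities --
# placeholder values, NOT the actual figures from the model.
spokes = {
    "electric power":     0.97,
    "telecommunications": 0.96,
    "banking":            0.95,
    "fuel supply":        0.95,
    "food distribution":  0.96,
}

p_all_hold = prod(spokes.values())   # every spoke survives
p_any_fail = 1 - p_all_hold          # at least one critical failure

print(f"P(at least one spoke fails) = {p_any_fail:.1%}")
```

Even with these rosy placeholder values, the compound risk of at least one failure is already near 20%; add more spokes, or lower any single probability, and it climbs quickly -- which is the whole point of structuring the model around independent, individually vital activities.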

-- Dr. Roger Altman (, January 25, 1999.
