READ ME 2! Infomagic Predicts Crisis, Part 1 of 3

greenspun.com : LUSENET : Sonoma County : One Thread

A few weeks ago I sent out READ ME 1 - Oil Crises, about a possible severe decline in oil supplies in the year 2000. I apologize for not sending out Part 2; our family moved in December and our lives are only just getting back to normal. But Part 1 had most of the good stuff, and if you read it you got the point.

Now we have READ ME 2! INFOMAGIC. Well over a year ago, someone began posting astute articles on the Y2K problem under the name of Infomagic. He was a computer systems analyst as well as an economic/social/geo-political thinker, and his analyses were penetrating. He predicted that the Y2K problem was so huge, and would take so long to fix, that the work could not come close to completion in time. The result would be an economic collapse. His writing became widely known in Y2K circles, to the extent that when people talked of a widespread breakdown, they would refer to "the world going Infomagic."

For most of the past year Infomagic has not been heard from. Then, during the last week of December, he posted an analysis of where we are now, just before the countdown. I consider it probably the most penetrating analysis of Y2K I have seen. Unfortunately, his conclusions have not changed much. He still believes that, as he predicted, the problem is far from solved, and widespread difficulties will result. Like DD Reed in the Oil Crisis article, he also predicts a severe oil shortage. His writing is extremely incisive, and, however radical his predictions, he makes an excellent case for them. Even though we are on the eve of the Millennium, the realization of the true nature of the situation has not hit most people, so there may still be a very few weeks to make some last-minute preparations. Once shortages begin to be apparent, it will be much harder. For those of you who have an idea of the severity of the situation, these thoughts can be a spur to the last few things to do, or an extension of what you have already done. For those who are not yet convinced of the danger, if Infomagic's words don't convince you, nothing will.

The post is so long that I had to split it into three parts for e-mail purposes. You can access the original article at:

http://x26.deja.com/threadmsg_if.xp?AN=565458822&CONTEXT=946339034.194229045

Alan

+++++++++++++++++++++++++++++++++++++++++++++++++

Subject: I was wrong . . .
Date: 12/27/1999
Author/Mail: Y2000 infomagic com

I was wrong, it WILL be worse.

A couple of years ago, I put forward the idea that the combined effects of Y2K and the collapse of the stock market bubble would lead to a "devolutionary spiral", somewhat analogous to the fall of the Roman Empire and the beginning of the dark ages. At the end of one of these articles I concluded, "Of course, I could be wrong. It could be worse." In another article I showed, mathematically, that the Y2K problem could never be completely solved and that the residual failures resulting from the fixes themselves would be enough to cause significant economic impacts, even without failures from all of the unfixed systems. I also predicted, perhaps overoptimistically, that the stock market bubble would burst in the first quarter of 1999.

I hereby admit I was wrong. It will be worse. I was wrong about the timing of that collapse, but not about the state of Y2K or about the eventual crash of bubble.com. I am so very, very sorry it didn't collapse when I predicted. Not because I want it to happen, but because the delay means that the final crash will indeed be much, much worse than it needed to be. If the market collapse had occurred at the end of 1998 or the beginning of 1999, as it logically should have, there would have been a full year for everyone to revise their expectations and to make major economic, budgetary and business planning changes before Y2K itself slams us in the first quarter of 2000. A collapse back then could still have been partially controlled: the market did not have so far to fall, and the first business failures and unemployment situations could have been handled with a mostly functional infrastructure and support systems. In short, we would have been forced (and able) to make realistic, useful contingency plans instead of relying on the nonsensical, irrational and mathematically impossible concept that Y2K has actually been fixed.
It is now completely inevitable that Y2K and the bursting of bubble.com will occur at one and the same time, leading rapidly to a new Great Depression far worse, far deeper and far longer lasting than that of the 1930's. Because of the confluence of these and other events, I am now also convinced that we will see a complete breakdown of the global monetary system, a series of vicious dictatorships and, within ten years, World War Three. Instead of 66% over fifty years, I now expect more than 90% of the world population will die within the next ten years. But, again, it could be worse. If a major nuclear exchange cannot be avoided in World War Three, the human race will quite simply cease to exist. Under present global political conditions, I consider this to be a very real possibility, if not yet a probability. Either way, as a species, we have lost our one chance to get off this planet and to survive, long term, as a civilized race among the stars. (I am referring here to my own Devolution theory, to the Olduvai theory, to astronomer Hoyle's similar theory, and to other scientifically based approaches which predict that we have only one shot at developing a successful technology-based civilization.)

I know this will likely get the pollys jumping up and down on their perches but, frankly my dear, I don't give a damn. The time for debate ended about a year ago and there is no longer any useful purpose to be served in countering the polly arguments -- their damage has already been done, the lives of innocents have already been recklessly endangered, and previously possible recovery efforts have already been sabotaged by the shortage of bits in their processors. We'll let them squawk later, when we clean out their cage in the bubblegate trials.
Until then remember that, although able to repeat complex sentences over and over again when carefully taught, a polly's comprehension of the related subject matter is limited by an intellectual capacity roughly equivalent to that of a three year old child. To regular readers, who know that I rarely insult my opponents, I apologize for the above. I also apologize to my own parrot for dragging him down to this level. But, after all the vicious attacks so many of the pollys have made on me over the last few years, just for once, I couldn't resist the temptation. It's about as close as I normally get to losing my temper, so please forgive me, because this article is not aimed at them but at those of you who are prepared for Y2K -- in response to the many e-mail requests I have received to "come out of retirement" and present an update of my views. I am happy to do that, as a one-time event, just seven days before the rollover.

But first, I'd like to tell you why I dropped out and where I've been. In the fall of 1998 I decided not to take any more Y2K contracts because I had already reached the conclusion that at that point remediation was a lost cause, and the only viable (partial) solution to Y2K was a massive contingency effort. It would have been dishonest for me not only to charge for work I professionally considered to be useless but also to allow a client to continue in the false belief that remediation could actually solve the problem. I have had talks with a few clients and some bodysnatchers about Y2K audit and follow-up work, in the hope that I could do something useful to alleviate the awesome consequences of Y2K. Nothing serious came of this, however, in large part because so few organizations have done enough remediation to be worth auditing and validating.
And of those that have, so many are still laboring under the delusion that fixing individual software problems is a meaningful act, while continuing to ignore the real problem of the dependencies and the inevitable collapse of the system of systems itself. At roughly the same time, I also renewed contact with "NymphoMagic", a lady friend from my misspent youth. And so, after a few hunting trips and a well-earned winter vacation, with my own Y2K preparations largely in place, I decided to take a sabbatical and leave the digital dastards to their own devices. I spent most of this year in Northern California, renewing old friendships; sitting under a shady tree with a glass of fine wine; cooking my own special version of Beef Wellington over a charcoal fire; enjoying the last days of my faithful but ancient Golden Retriever; working on a Java-based Bible study program; and making my peace with God before facing the fall of Babylon and the great tribulation. In short, I had a life and enjoyed it. I even found time to occasionally follow the flame wars of CSY2K. But I must be honest and admit that I didn't find much worth responding to.

As I said before, the useful debate was over about a year ago. We don't know for an absolute fact who won, and we won't for another month or so, perhaps as much as three months or even a year. But neither side will concede anything more than they already have. Our case is based on facts, logic, mathematics and prior historical experience, which the pollys simply reject out of hand. The polly case is based on nothing but emotion, corporate and government lies, greed, good "feelings" and an irrational belief that nothing could possibly go that badly wrong because the inimitable "they", the "authorities", will always fix it. It's the same, age-old conflict between good and evil, right and wrong, right and left, nature and nurture, reason and emotion, freedom and serfdom, Christ and antichrist.
It won't be settled until one side or the other starts dying in greater numbers. That won't be long now, but until then why waste any more energy arguing? For those, then, who are calmly preparing for the gathering storm here is my current weather forecast (not to be confused with Cory Hamasaki's Washington Weather report). Let's begin with the state of Y2K itself. In my expert professional opinion, the global prognosis is worse today than it was a year ago. Progress on remediation has been slower than even I expected and in many areas we have clearly regressed. In particular, the residual failure rate in "remediated" systems appears to be higher than I originally predicted. IV & V results show that the majority of these ostensibly repaired systems, almost all in fact, still contain potentially disastrous date errors, capable of crashing the system. In large part, this is due to the fixed deadline, the inevitable delays in the code change projects and the consequent lack of time for adequate testing. Even those best prepared, those who have been testing for two years or more, are still finding errors in their systems. These best of the best organizations tend to lower the overall error rate per million LOC but, obviously, this means that the residual error rate is much higher than average for those less well prepared than the best. Sadly, this applies to the great majority of all organizations and most of their so-called compliant systems will enter the new year still containing disastrous date errors. Closely akin to the residual errors, but far more insidious, are the errors remaining from inadequate and incomplete assessment and remediation, particularly when these tasks were performed by personnel not properly qualified to perform them. I have a good friend, an army Lieutenant, who is responsible for Y2K in the administrative section he commands at a local base. 
Discussing the issue with him last year, I found that he had checked the real-time clocks on all of his PCs and reported himself compliant, honestly thinking that was all he had to do! Even after I pointed out that his real problems were things like software, spreadsheet formulas, database programs, macros and historical data, and that all of these would have to be tested, he still had no intention of doing so. As far as I know he still hasn't properly tested his systems and he's still reporting himself compliant. In my experience, this situation is the rule rather than the exception in small organizations without direct, professional IT support. It makes a mockery of the claims of compliance by many government departments, by small businesses and, to a somewhat lesser extent, even by large, technically proficient corporations. If even we professionals admit we can't fix the Y2K problem, amateurs don't have the slightest glimmer of a chance.

To make matters worse, many of the system replacement projects are also failing. We have seen new system failures at brand-name companies like Hershey's, Royal Doulton and W.L. Gore. Businesses have been losing millions in lost revenues, even before the rollover, because these new systems (which may or may not be compliant) are simply not capable of replacing the functionality and capacity of the old systems they are supposed to be replacing. At least, not without considerable further development, refinement and testing, for which there is no more time. Similarly, many of the new federal, state and local government systems are failing the acid test of real-world deployment. As we approach year's end, the reports of these failures are increasing, just as I predicted. But I strongly suspect that the number and scope of these failures is really much larger than their "owners" would have us believe. In retrospect, it has typically taken months for news of the failures to become public.
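[Forwarder's note: for readers who have never seen the underlying defect, the date errors discussed here mostly come down to arithmetic on two-digit years. The sketch below is a purely hypothetical illustration, not code from any system mentioned in this article.]

```python
# Classic Y2K defect: years stored as two digits ("99" for 1999).
def years_elapsed(start_yy, end_yy):
    # Buggy arithmetic: silently assumes both years fall in the 1900s.
    return end_yy - start_yy

# A record created in 1999 and aged at the rollover in 2000:
print(years_elapsed(99, 0))   # prints -99, not the correct 1
```

Most remediation amounted to expanding such fields to four digits or "windowing" them (treating, say, 00-29 as 2000-2029), and it is precisely such hurried patches that the residual-error argument above concerns.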
Logically, the most recent failures, which are indeed occurring in greater numbers, have yet to be reported and probably won't be until next year. Ask yourself, how many of these failures have been openly reported (not even one) and how many have been covered up and lied about until the very last moment? How many replacement system project managers are just now reporting the failure of their project to meet the deadline? How many desperate, doomed attempts are currently underway to patch up the old systems they were supposed to replace? Finally, there are the systems that nobody has even tried to fix. The ones their owners are supposedly going to "fix on failure" -- the most ridiculous, brain dead solution of all. This category includes the vast majority of systems worldwide, probably more than 90% by number and certainly more than 50% by economic activity. Small businesses alone account for half of all our economic activity and yet almost none of them have done any meaningful testing or remediation. Most have done absolutely nothing, some waiting to see what actually fails and others still in total denial that there even is a Y2K problem. Of those that have addressed the problem at all, most have not made use of professional help and are probably in the inadequate or improper remediation group (as are most local governments). Very, very few small businesses are likely to have really fixed their problems in advance. In fact, except for my own, I don't know of a single local business I could honestly pass as ready for Y2K. And sometimes I'm not too sure about my own! Big business and large government agencies are no better off. Some really have fixed almost all of their problems, because they started early, planned well, financed and then staffed their projects adequately and allowed years for thorough testing in a complete, expensive, real world test environment. 
All they have left are a few residual errors as described above (although even these could still prove disastrous). However, such organizations are very few and far between -- probably less than 5%. So few that I cannot think of a single organization which I personally know or accept on their own assurance to be more than 70% ready, not even the clients for whom I have done Y2K work. The claims of compliance by the great majority of large corporations, and all government agencies (without a single exception), are nothing more than willful, flat out, bald faced lies based on the myth of "mission critical" systems and intentional or culpable misrepresentation of the actual remediation data. Let me explain this with a little history, for those who are not familiar with large-scale systems development. Except for Antisocial Insecurity, which got started early, the US Grabit and most of the Fortune 500 didn't wake up to the reality of Y2K until the very beginning of 1997. (Smaller companies, most States and foreign governments and almost all local agencies were even further behind.) At that time I was working for a large beverage company and I know for a fact that their top IT managers, with whom I worked, were largely unaware of the problem and its consequences for the company. At this point, with only three years to go, it was already too late for most large organizations to fix all their systems in time to meet the only truly immovable deadline most computer professionals have ever had to face. Through the grapevine, those of us who actually understood the true scope of the problem (mostly system software and long-range planning specialists) already realized the task was larger than the available time and resources.
Regardless of our individual clients and employers, and working back from 2000, we all knew we needed at least a year for integration testing, a minimum of a year or two for all the application software changes and basic testing, another year to set up and test the extra development and test facilities, a year or two to bring the system software up to date, another year or two for system inventory, evaluation and planning. Let alone the minor tasks like finding and training the extra staff or contacting and verifying the compliance of customers and vendors. Even with parallel tasking, the remaining critical path simply could not be completed in time. For the system replacement approach the situation was even worse since, from initial start to finish, implementing a package like SAP is a minimum five-year process for a large organization. In addition, before we could even start to look at the details of the problem itself, we had to convince non-IT management that the problem was real and extremely critical to business continuity (in Y2K terminology, this phase is known as "awareness"). For many, this phase alone took six months to a year and most large organizations, including the major bank for which I later worked on Y2K, didn't really get started with remediation until well into the spring of 1998. In short, for most large businesses and government agencies, totally fixing the Y2K problem never was a realistic possibility and the vast majority of compliance claims are and always have been a lie. I am certain that, from the very beginning, almost all of the real experts were completely honest about this and recommended to their management that the Y2K effort should be focused on the most critical and profitable business functions and the rest should be dealt with by contingency plans -- including the orderly shutdown of areas which could not reasonably be fixed in time.
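[Forwarder's note: the critical-path claim above is easy to check on the back of an envelope. The midpoint durations and the single overlap assumed below are my own simplification of the phases he lists, not his figures.]

```python
# Back-of-envelope critical path in years, using midpoints of the
# ranges given in the text ("a year or two" -> 1.5, etc.).
phases = {
    "inventory, evaluation and planning": 1.5,
    "system software updates":            1.5,
    "dev/test facility setup":            1.0,
    "application changes and testing":    1.5,
    "integration testing":                1.0,
}

# Assume facility setup runs in parallel with the system software work:
critical_path = sum(phases.values()) - phases["dev/test facility setup"]
print(critical_path)   # 5.5 years, against the ~3 left after early 1997
```

Even with that generous parallelism, the remaining serial work exceeds the three years available, which is the point being made.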
Those of us with a little more insight even recognized that, spread over the economy as a whole, the controlled and uncontrolled shutdown of unfixed and unfixable business functions would inevitably lead to a minimum of a severe recession and advised an adjustment of the overall business plan to take this into account. I know I did, even though this led to my leaving more than one Y2K project. No, the decision to lie about Y2K was not ours. It was a political decision made at the highest levels of business management and government. The first lie they told was to themselves -- that the problem could still be fixed, if only enough pressure could be brought to bear on the stupid peons who actually write the software. Quite simply, in arrogance and ignorance, management didn't believe what they were being told by their own experts and they simply dictated an unreasonable schedule to fix everything anyway. In so doing they drastically reduced the effectiveness of Y2K remediation (by spreading time and resources too thinly) and virtually guaranteed there would be no accurate reporting of real project progress, no feedback on what was really happening. On more than one occasion I was personally asked (actually ordered) to falsify my time records to show that I had been working on a later phase of the project when in fact I had still been working on an earlier phase whose completion deadline had already passed. Certain levels of management above me (other than my own immediate manager) were simply too scared (or too ignorant) to report the truth and, as a result, the client's top management never did find out just how far behind their projects already were, even in the beginning. Anyway, after a while, reality did begin to raise its eyebrows, although it never exactly lifted its ugly head.
Most large organizations, including my client and, more especially, the Federal Grabit, began to realize that all of their systems would not, in fact, be ready for the rollover rectifinium. In this respect the Grabit was far worse off than, for example, the major banks. They had literally billions of lines of old, poorly documented code running on vast numbers of obsolete and non-compliant computers. In addition, they had extremely limited in-house expertise and a pathetic track record of 100% failure to meet deadlines and costs in previous software projects. I seriously doubt that it ever was possible to fix more than 10% of the Grabit's code, a figure which is very roughly borne out by the number of systems they claim to have fixed. (The Fortune 500 were much better off, with perhaps an average 60% fixable, but still nowhere near all of it.) Obviously this could never be admitted either by the Grabit (scared of the sheeple) or by big business (scared of the shareholders). Ever the leader in lying, it was the Grabit which came up with the first deception -- changing the size of the problem from the experts' meaningful billions of lines of code into the seemingly smaller and utterly meaningless measure of thousands of "systems". Bureaucratically brilliant, because this vague term can mean a single piece of hardware, a collection of hardware, a single software program, a group of programs on a single piece of hardware or even a group of programs running on many pieces of hardware connected across multiple different networks. It can even be used to describe a manual operation in which computers are not used at all. Obviously, such a loose term could easily be adapted to whichever lie of the day they wished to issue. For example, at one point, the US Navy claimed to have made compliant a "system" which consisted of a few guys sitting around drinking coffee in an office, but no computer at all.
The real beauty of this "system" measure, however, is its utter meaninglessness when it comes to estimating the amount of work to be done in remediation. The total lines of code (LOC) is a meaningful measure because it directly correlates to the amount of work which needs to be done in inspection, modification and testing of the code. But the number of LOC varies enormously from one system to another. I have worked on systems with as few as 2,000 lines of code and others with several million. In addition, to be really useful, a valid measure would have to include weighting factors for things like language, complexity, age, documentation quality, etc. By using the "system" as a measuring unit, the Grabit willfully and intentionally made it impossible to accurately measure their progress or, more honestly, their lack of progress in Y2K remediation. It was then easy for an individual agency, such as the FAA, to say that 90% of their systems were fixed with only 10% to go. When Jane Garvey told that particular lie, it is far more likely that they had only fixed their smallest systems, with perhaps 25% of the total code lines. The remaining 75% of code lines hadn't been fixed, and probably never will be.

Disgustingly, it didn't stop there. The next great deception was the invention of the mythical "mission critical" system. This was necessary because, given the vast extent of the Grabit's problem, even the use of a dishonest measure could not by itself disguise or cover the magnitude of the impending failure and the monumental incompetence of those who had allowed it to happen. Taking the sound advice of the real experts to concentrate on the most critical functions, the venal vassals of verbal prevarication rapidly spun and twisted this essential, emergency approach into an absurd and totally dishonest concept in which only a few "mission critical" systems needed to be repaired in order for the entire Grabit to continue on its merry way of monstrous maladministration.
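[Forwarder's note: the arithmetic behind this objection is easy to sketch. The portfolio below is invented for illustration -- ten "systems", nine tiny and one huge -- and the numbers are mine, not the FAA's.]

```python
# Hypothetical portfolio: nine small systems and one large one.
loc_per_system = [2_000] * 9 + [3_000_000]   # lines of code each

remediated = loc_per_system[:9]              # only the nine small ones fixed

systems_fixed = len(remediated) / len(loc_per_system)
code_fixed = sum(remediated) / sum(loc_per_system)

print(f"systems fixed: {systems_fixed:.0%}")   # 90%
print(f"code fixed:    {code_fixed:.1%}")      # 0.6%
```

Counting "systems" makes the job look 90% done while the work actually remaining, measured in lines of code, is almost untouched -- which is exactly the deception being described.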
Needless to say, most of the Fortune 500, desperate not to lose the confidence of their shareholders and customers even before Y2K, were only too happy to follow the Grabit's dishonest lead. This is why most companies continue to hand out meaningless but glowing reports of progress on their mission critical systems, while remaining largely silent about the remaining, unprepared systems. In fact, on the global scale of Y2K, there is no distinction in failure effect between mission critical and non-mission critical systems. All systems, without exception, perform work that is considered useful by their owners. If that were not the case, none of these "non-critical" systems would ever have been created. In my more than thirty years' experience I have not seen a single business or government system project which was not justified in advance by some kind of cost-benefit analysis. Lose even one of those systems and the global economy as a whole loses the benefits provided by that system. In most cases, the "benefit" we lose is the productivity gain achieved by implementing that system. Lose 50% of all the systems and we lose 50% of the benefits and increased productivity provided by all of the systems we have produced in the last thirty years. Effectively, we lose 50% of that part of the economic growth of the entire planet which is attributable to the computer revolution, beginning around 1970. Calling the other 50% "mission critical" does not change this mathematical fact, even if they continue to work. In reality, the situation is much worse. In most cases, we no longer have the capacity to replace the failing systems with manual operations. This means that when a system fails we lose not only its productivity gains but also the entire underlying economic activity itself.
In addition, because of the interconnected nature of all economic activities, only a fairly small percentage of failures is needed to cause a near total failure of all global economic activity (as described in Charlotte's Web). Exactly what the critical failure rate is nobody knows, including myself. But I am certain it is far lower than the 65% global system failure rate I am expecting (averaged from a 90% failure rate for governments, 30% for big business and 70% for small business). I suspect the critical rate is about 15%, but that's probably academic now, since such a low rate is no longer attainable. That's the good news. The bad news is it can't be fixed, not even in thirty years. We have taken one giant step forward and now we are about to take two equally giant steps back where we came from. In 1970 we started out with a fully functional, even robust economy and it still took thirty years to get where we are now. After the collapse of Charlotte's Web there won't be enough of an economy left to pay for the repairs and replacement systems we need for a full recovery. Without either computer or manual labor to perform their functions in the short term, large numbers of businesses and even entire economic segments will cease to exist (particularly in the service sector). This is a simple fact and emotional appeals to human ingenuity and necessity can never change it. Creating or fixing computer software takes lots of time and lots of money and we won't have much of either. Remember, historically, after major information systems disasters, 50% of medium and large businesses declare bankruptcy within a month and 90% within a year. And that is under ideal economic conditions, with widely available capital and manpower for recovery. Since so much will be lost, the global economy will no longer be able to support the current population of six billion souls. Millions are going to die because of lack of jobs, money, food, water, sewage, medical treatment, etc.
Let alone by violent acts and wars arising from the devastating economic conditions. This again is a simple fact and emotional appeals are not going to change it. As they die, there will be fewer people to sustain even what's left of the global economy and it will shrink still further, leading to more deaths, and so on. This is the beginning of the devolutionary spiral I have long been predicting. We're not just going back to a 1930's depression, we're going back to the Middle Ages, if not worse.
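[Forwarder's note: the 65% figure quoted above is consistent with a weighted average of his three sector rates. He does not state his weights; the ones below are my own assumption, chosen so that small business carries half of economic activity, as he claims earlier in the article.]

```python
# Sector failure rates from the text, with assumed economic weights.
rates   = {"government": 0.90, "big business": 0.30, "small business": 0.70}
weights = {"government": 0.25, "big business": 0.25, "small business": 0.50}

overall = sum(rates[s] * weights[s] for s in rates)
print(f"overall failure rate: {overall:.0%}")   # 65%
```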

End of Part 1

-- Alan (alandonnaj@aol.com), December 31, 1999

