Quotably Quoted #59 - Part 1

greenspun.com : LUSENET : TB2K spinoff uncensored : One Thread

I was wrong, it WILL be worse

A couple of years ago, I put forward the idea that the combined effects of Y2K and the collapse of the stock market bubble would lead to a "devolutionary spiral", somewhat analogous to the fall of the Roman Empire and the beginning of the Dark Ages. At the end of one of these articles I concluded, "Of course, I could be wrong. It could be worse." In another article I showed, mathematically, that the Y2K problem could never be completely solved and that the residual failures resulting from the fixes themselves would be enough to cause significant economic impacts, even without failures from all of the unfixed systems. I also predicted, perhaps overoptimistically, that the stock market bubble would burst in the first quarter of 1999. I hereby admit I was wrong. It will be worse.

I was wrong about the timing of that collapse, but not about the state of Y2K or about the eventual crash of bubble.com. I am so very, very sorry it didn't collapse when I predicted. Not because I want it to happen, but because the delay means that the final crash will indeed be much, much worse than it needed to be. If the market collapse had occurred at the end of 1998 or beginning of 1999, as it logically should have, there would have been a full year for everyone to revise their expectations and to make major economic, budgetary and business planning changes before Y2K itself slams us in the first quarter of 2000. A collapse back then could still have been partially controlled; the market did not have so far to fall; and the first business failures and unemployment situations could have been handled with a mostly functional infrastructure and support systems. In short, we would have been forced (and able) to make realistic, useful contingency plans instead of relying on the nonsensical, irrational and mathematically impossible concept that Y2K has actually been fixed.

It is now completely inevitable that Y2K and the bursting of bubble.com will occur at one and the same time, leading rapidly to a new Great Depression far worse, far deeper and far longer lasting than that of the 1930's. Because of the confluence of these and other events, I am now convinced that we will also see a complete breakdown of the global monetary system, a series of vicious dictatorships and, within ten years, World War Three. Instead of 66% over fifty years, I now expect more than 90% of the world population will die within the next ten years. But, again, it could be worse. If a major nuclear exchange cannot be avoided in World War Three, the human race will quite simply cease to exist. Under present global political conditions, I consider this to be a very real possibility, if not yet a probability. Either way, as a species, we have lost our one chance to get off this planet and to survive, long term, as a civilized race among the stars. (I am referring here to my own Devolution theory, to the Olduvai theory, to astronomer Hoyle's similar theory, and to other scientifically based approaches which predict that we have only one shot at developing a successful technology based civilization.)

I know this will likely get the pollys jumping up and down on their perches but, frankly my dear, I don't give a damn. The time for debate ended about a year ago and there is no longer any useful purpose to be served in countering the polly arguments -- their damage has already been done, the lives of innocents have already been recklessly endangered, and previously possible recovery efforts have already been sabotaged by the shortage of bits in their processors. We'll let them squawk later, when we clean out their cage in the bubblegate trials. Until then remember that, although a polly can repeat complex sentences over and over again when carefully taught, its comprehension of the related subject matter is limited by an intellectual capacity roughly equivalent to that of a three year old child.

To regular readers, who know that I rarely insult my opponents, I apologize for the above. I also apologize to my own parrot for dragging him down to this level. But, after all the vicious attacks so many of the pollys have made on me over the last few years, just for once, I couldn't resist the temptation. It's about as close as I normally get to losing my temper, so please forgive me, because this article is not aimed at them but at those of you who are prepared for Y2K -- in response to the many eMail requests I have received to "come out of retirement" and present an update of my views. I am happy to do that, as a one-time event, just seven days before the rollover. But first, I'd like to tell you why I dropped out and where I've been.

In the fall of 1998 I decided not to take any more Y2K contracts because I had already reached the conclusion that at that point remediation was a lost cause, and the only viable (partial) solution to Y2K was a massive contingency effort. It would have been dishonest for me not only to charge for work I professionally considered to be useless but also to allow a client to continue in the false belief that remediation could actually solve the problem. I have had talks with a few clients and some bodysnatchers about Y2K audit and follow up work, in the hope that I could do something useful to alleviate the awesome consequences of Y2K. Nothing serious came of this, however, in large part because so few organizations have done enough remediation to be worth auditing and validating. And of those that have, so many are still laboring under the delusion that fixing individual software problems is a meaningful act, while continuing to ignore the real problem of the dependencies and the inevitable collapse of the system of systems itself.

At roughly the same time, I also renewed contact with "NymphoMagic", a lady friend from my misspent youth. And so, after a few hunting trips and a well-earned winter vacation, with my own Y2K preparations largely in place, I decided to take a sabbatical and leave the digital dastards to their own devices. I spent most of this year in Northern California, renewing old friendships; sitting under a shady tree with a glass of fine wine; cooking my own special version of Beef Wellington over a charcoal fire; enjoying the last days of my faithful but ancient Golden Retriever; working on a Java-based Bible study program; and making my peace with God before facing the fall of Babylon and the great tribulation. In short, I had a life and enjoyed it.

I even found time to occasionally follow the flame wars of CSY2K. But I must be honest and admit that I didn't find much worth responding to. As I said before, the useful debate was over about a year ago. We don't know for an absolute fact who won, and we won't for another month or so, perhaps as much as three months or even a year. But neither side will concede anything more than they already have. Our case is based on facts, logic, mathematics and prior historical experience, which the pollys simply reject out of hand. The polly case is based on nothing but emotion, corporate and government lies, greed, good "feelings" and an irrational belief that nothing could possibly go that badly wrong because the inimitable "they", the "authorities", will always fix it. It's the same age-old conflict between good and evil, right and wrong, right and left, nature and nurture, reason and emotion, freedom and serfdom, Christ and antichrist. It won't be settled until one side or the other starts dying in greater numbers. That won't be long now, but until then why waste any more energy arguing?

For those, then, who are calmly preparing for the gathering storm, here is my current weather forecast (not to be confused with Cory Hamasaki's Washington Weather Report). Let's begin with the state of Y2K itself.

In my expert professional opinion, the global prognosis is worse today than it was a year ago. Progress on remediation has been slower than even I expected and in many areas we have clearly regressed. In particular, the residual failure rate in "remediated" systems appears to be higher than I originally predicted. IV & V results show that the majority of these ostensibly repaired systems, almost all in fact, still contain potentially disastrous date errors, capable of crashing the system. In large part, this is due to the fixed deadline, the inevitable delays in the code change projects and the consequent lack of time for adequate testing. Even those best prepared, those who have been testing for two years or more, are still finding errors in their systems. These best-of-the-best organizations pull the overall error rate per million LOC down but, obviously, that means the residual error rate for everyone less well prepared is even higher than the average suggests. Sadly, this applies to the great majority of all organizations and most of their so-called compliant systems will enter the new year still containing disastrous date errors.

Closely akin to the residual errors, but far more insidious, are the errors remaining from inadequate and incomplete assessment and remediation, particularly when these tasks were performed by personnel not properly qualified to perform them. I have a good friend, an army Lieutenant, who is responsible for Y2K in the administrative section he commands at a local base. Discussing the issue with him last year, I found that he had checked the real time clocks on all of his PCs and reported himself compliant, honestly thinking that was all he had to do! Even after I pointed out that his real problems were things like software, spreadsheet formulas, database programs, macros and historical data, and that all of these would have to be tested, he still had no intention of doing so. As far as I know he still hasn't properly tested his systems and he's still reporting himself compliant. In my experience, this situation is the rule rather than the exception in small organizations without direct, professional IT support. It makes a mockery of the claims of compliance by many government departments, by small businesses and, to a somewhat lesser extent, even by large, technically proficient corporations. If even we professionals admit we can't fix the Y2K problem, amateurs don't have the slightest glimmer of a chance.

To make matters worse, many of the system replacement projects are also failing. We have seen new system failures at brand name companies like Hershey's, Royal Doulton and W.L. Gore. Businesses have been losing millions in revenue, even before the rollover, because these new systems (which may or may not be compliant) are simply not capable of replacing the functionality and capacity of the old systems they are supposed to be replacing. At least, not without considerable further development, refinement and testing, for which there is no more time. Similarly, many of the new federal, state and local government systems are failing the acid test of real world deployment. As we approach year's end, the reports of these failures are increasing, just as I predicted. But I strongly suspect that the number and scope of these failures is really much larger than their "owners" would have us believe. In retrospect, it has typically taken months for news of the failures to become public. Logically, the most recent failures, which are indeed occurring in greater numbers, have yet to be reported and probably won't be until next year. Ask yourself, how many of these failures have been openly reported at the time (not even one) and how many have been covered up and lied about until the very last moment? How many replacement system project managers are just now reporting the failure of their project to meet the deadline? How many desperate, doomed attempts are currently underway to patch up the old systems they were supposed to replace?

Finally, there are the systems that nobody has even tried to fix. The ones their owners are supposedly going to "fix on failure" -- the most ridiculous, brain dead solution of all. This category includes the vast majority of systems worldwide, probably more than 90% by number and certainly more than 50% by economic activity. Small businesses alone account for half of all our economic activity and yet almost none of them have done any meaningful testing or remediation. Most have done absolutely nothing, some waiting to see what actually fails and others still in total denial that there even is a Y2K problem. Of those that have addressed the problem at all, most have not made use of professional help and are probably in the inadequate or improper remediation group (as are most local governments). Very, very few small businesses are likely to have really fixed their problems in advance. In fact, except for my own, I don't know of a single local business I could honestly pass as ready for Y2K. And sometimes I'm not too sure about my own!

Big business and large government agencies are no better off. Some really have fixed almost all of their problems, because they started early, planned well, financed and then staffed their projects adequately and allowed years for thorough testing in a complete, expensive, real world test environment. All they have left are a few residual errors as described above (although even these could still prove disastrous). However, such organizations are very few and far between -- probably less than 5%. So few that I cannot think of a single organization which I personally know, or accept on their own assurance, to be more than 70% ready, not even the clients for whom I have done Y2K work. The claims of compliance by the great majority of large corporations, and all government agencies (without a single exception), are nothing more than willful, flat-out, bald-faced lies based on the myth of "mission critical" systems and intentional or culpable misrepresentation of the actual remediation data. Let me explain this with a little history, for those who are not familiar with large scale systems development.

Except for Antisocial Insecurity, which got started early, the US Grabit and most of the Fortune 500 didn't wake up to the reality of Y2K until the very beginning of 1997. (Smaller companies, most States and foreign governments and almost all local agencies were even further behind.) At that time I was working for a large beverage company and I know for a fact that their top IT managers, with whom I worked, were largely unaware of the problem and its consequences for the company. At that point, with only three years to go, it was already too late for most large organizations to fix all their systems in time to meet the only truly immovable deadline most computer professionals have ever had to face.

Through the grapevine, those of us who actually understood the true scope of the problem (mostly system software and long range planning specialists) already realized the task was larger than the available time and resources. Regardless of our individual clients and employers, and working back from 2000, we all knew we needed at least a year for integration testing, a minimum of a year or two for all the application software changes and basic testing, another year to set up and test the extra development and test facilities, a year or two to bring the system software up to date, another year or two for system inventory, evaluation and planning. Let alone the minor tasks like finding and training the extra staff or contacting and verifying the compliance of customers and vendors. Add up even the bare minimums and that is five years of sequential work against the three remaining. Even with parallel tasking, the remaining critical path simply could not be completed in time.

For the system replacement approach the situation was even worse since, from initial start to finish, implementing a package like SAP is a minimum five-year process for a large organization. In addition, before we could even start to look at the details of the problem itself, we had to convince non-IT management that the problem was real and extremely critical to business continuity (in Y2K terminology, this phase is known as "awareness"). For many, this phase alone took six months to a year and most large organizations, including the major bank for which I later worked on Y2K, didn't really get started with remediation until well into the spring of 1998. In short, for most large businesses and government agencies, totally fixing the Y2K problem never was a realistic possibility and the vast majority of compliance claims are and always have been a lie. I am certain that, from the very beginning, almost all of the real experts were completely honest about this and recommended to their management that the Y2K effort should be focused on the most critical and profitable business functions and the rest should be dealt with by contingency plans -- including the orderly shutdown of areas which could not reasonably be fixed in time. Those of us with a little more insight even recognized that, spread over the economy as a whole, the controlled and uncontrolled shutdown of unfixed and unfixable business functions would inevitably lead to a minimum of a severe recession and advised an adjustment of the overall business plan to take this into account. I know I did, even though this led to my leaving more than one Y2K project.

No, the decision to lie about Y2K was not ours. It was a political decision made at the highest levels of business management and government. The first lie they told was to themselves -- that the problem could still be fixed, if only enough pressure could be brought to bear on the stupid peons who actually write the software. Quite simply, in arrogance and ignorance, management didn't believe what they were being told by their own experts and they simply dictated an unreasonable schedule to fix everything anyway. In so doing they drastically reduced the effectiveness of Y2K remediation (by spreading time and resources too thinly) and virtually guaranteed there would be no accurate reporting of real project progress, no feedback on what was really happening. On more than one occasion I was personally asked (actually ordered) to falsify my time records to show that I had been working on a later phase of the project when in fact I had still been working on an earlier phase whose completion deadline had already passed. Certain levels of management above me (other than my own immediate manager) were simply too scared (or too ignorant) to report the truth and, as a result, the client's top management never did find out just how far behind their projects already were, even in the beginning.

Anyway, after a while, reality did begin to raise its eyebrows, although it never exactly lifted its ugly head. Most large organizations, including my client and, more especially, the Federal Grabit, began to realize that not all of their systems would, in fact, be ready for the rollover rectifinium. In this respect the Grabit was by far worse off than, for example, the major banks. They had literally billions of lines of old, poorly documented code running on vast numbers of obsolete and non-compliant computers. In addition, they had extremely limited in-house expertise and a pathetic track record of 100% failure to meet deadlines and costs in previous software projects. I seriously doubt that it ever was possible to fix more than 10% of the Grabit's code, a figure which is very roughly borne out by the number of systems they claim to have fixed. (The Fortune 500 were much better off, with perhaps an average 60% fixable, but still nowhere near all of it.)

Obviously this could never be admitted either by the Grabit (scared of the sheeple) or by big business (scared of the shareholders). Ever the leader in lying, it was the Grabit which came up with the first deception -- changing the size of the problem from the experts' meaningful billions of lines of code into the seemingly smaller and utterly meaningless measure of thousands of "systems". It was bureaucratically brilliant, because this vague term can mean a single piece of hardware, a collection of hardware, a single software program, a group of programs on a single piece of hardware or even a group of programs running on many pieces of hardware connected across multiple different networks. It can even be used to describe a manual operation in which computers are not even used. Obviously, such a loose term could easily be adapted to whichever lie of the day they wished to issue. For example, at one point, the US Navy claimed to have made compliant a "system" which consisted of a few guys sitting around drinking coffee in an office, but no computer at all.

The real beauty of this "system" measure, however, is its utter meaninglessness when it comes to estimating the amount of work to be done in remediation. Total lines of code (LOC) is a meaningful measure because it directly correlates to the amount of work which needs to be done in inspection, modification and testing of the code. But the number of LOC varies enormously from one system to another. I have worked on systems with as few as 2,000 lines of code and others with several million. In addition, to be really useful, a valid measure would have to include weighting factors for things like language, complexity, age, documentation quality, etc. By using the "system" as a measuring unit, the Grabit willfully and intentionally made it impossible to accurately measure their progress or, more honestly, their lack of progress in Y2K remediation. It was then easy for an individual agency, such as the FAA, to say that 90% of their systems were fixed with only 10% to go. When Jane Garvey told that particular lie, it is far more likely that the FAA had fixed only its smallest systems, containing perhaps 25% of the total code lines. The remaining 75% of code lines hadn't been fixed, and probably never will be.
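To make the shell game concrete, here is a toy example (my numbers are invented for illustration, not the FAA's actual inventory):

* systems_vs_loc.prg -- illustrative inventory only
STORE 9 * 28000 TO n_fixed         && nine small systems of 28,000 LOC each, all "fixed"
STORE n_fixed + 756000 TO n_total  && plus one unfixed monster of 756,000 LOC
? n_fixed / n_total                && 0.25 -- "90% of systems" is only 25% of the code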

Disgustingly, it didn't stop there. The next great deception was the invention of the mythical "mission critical" system. This was necessary because, given the vast extent of the Grabit's problem, even the use of a dishonest measure could not by itself disguise or cover the magnitude of the impending failure and the monumental incompetence of those who had allowed it to happen. Taking the sound advice of the real experts to concentrate on the most critical functions, the venal vassals of verbal prevarication rapidly spun and twisted this essential, emergency approach into an absurd and totally dishonest concept in which only a few "mission critical" systems needed to be repaired in order for the entire Grabit to continue on its merry way of monstrous maladministration. Needless to say, most of the Fortune 500, desperate not to lose the confidence of their shareholders and customers even before Y2K, were only too happy to follow the Grabit's dishonest lead. This is why most companies continue to hand out meaningless but glowing reports of progress on their mission critical systems, while remaining largely silent about the remaining, unprepared systems.

In fact, on the global scale of Y2K, there is no distinction in failure effect between mission critical and non-mission critical systems. All systems, without exception, perform work that is considered useful by their owners. If that were not the case, none of these "non-critical" systems would ever have been created. In my more than thirty years' experience I have not seen a single business or government system project which was not justified in advance by some kind of cost-benefit analysis. Lose even one of those systems and the global economy as a whole loses the benefits provided by that system. In most cases, the "benefit" we lose is the productivity gain achieved by implementing that system. Lose 50% of all the systems and we lose 50% of the benefits and increased productivity provided by all of the systems we have produced in the last thirty years. Effectively, we lose 50% of that part of the economic growth of the entire planet which is attributable to the computer revolution, beginning around 1970. Calling the other 50% "mission critical" does not change this mathematical fact, even if they continue to work.

In reality, the situation is much worse. In most cases, we no longer have the capacity to replace the failing systems with manual operations. This means that when a system fails we lose not only its productivity gains but also the entire underlying economic activity itself. In addition, because of the interconnected nature of all economic activities, only a fairly small percentage of failures is needed to cause a near total failure of all global economic activity (as described in Charlotte's Web). Exactly what the critical failure rate is nobody knows, including myself. But I am certain it is far lower than the 65% global system failure rate I am expecting (averaged from a 90% failure rate for governments, 30% for big business and 70% for small business; for example, weighting small business at half of all economic activity, as noted above, and splitting the rest evenly gives 0.25 x 90 + 0.25 x 30 + 0.50 x 70 = 65). I suspect the critical rate is about 15%, but that's probably academic now, since such a low rate is no longer attainable.

That's the good news. The bad news is it can't be fixed, not even in thirty years. We have taken one giant step forward and now we are about to take two equally giant steps back to where we came from. In 1970 we started out with a fully functional, even robust economy and it still took thirty years to get where we are now. After the collapse of Charlotte's Web there won't be enough of an economy left to pay for the repairs and replacement systems we need for a full recovery. Without either computer or manual labor to perform their functions in the short term, large numbers of businesses and even entire economic segments will cease to exist (particularly in the service sector). This is a simple fact and emotional appeals to human ingenuity and necessity can never change it. Creating or fixing computer software takes lots of time and lots of money and we won't have much of either. Remember, historically, after major information systems disasters, 50% of medium and large businesses declare bankruptcy within a month and 90% within a year. And that is under ideal economic conditions, with widely available capital and manpower for recovery.

Since so much will be lost, the global economy will no longer be able to support the current population of six billion souls. Millions are going to die for lack of jobs, money, food, water, sewage treatment, medical care, etc. Let alone by violent acts and wars arising from the devastating economic conditions. This again is a simple fact and emotional appeals are not going to change it. As they die, there will be fewer people to sustain even what's left of the global economy and it will shrink still further, leading to more deaths, and so on. This is the beginning of the devolutionary spiral I have long been predicting. We're not just going back to a 1930's depression, we're going back to the Middle Ages, if not worse.

But, just for the sake of argument, let's say that I am wrong and the Good Feelings Fairy gave Santa a whole bunch of perfect Y2K fixes to drop down every chimney, fixing the entire problem just in the Nick of time! Was there one in your stocking? It wouldn't make any difference if there was. Even if everything really were completely fixed and there were no residual errors left, Babylon will still be fallen, be fallen. Ironically, this is because of and in spite of attempts to avoid the consequences of Y2K by our beloved pollyticians and economystics. As usual, they just made the problem worse.

For example, to avoid liquidity problems and potential bank runs before the rollover, Alan Greenspan has printed billions upon billions in new dollar debt. If Y2K doesn't finally burst bubble.com his monopoly money certainly will. Ignoring the immorality of fiat currencies and fractional reserve banking in general, it might be normal to add reserves slowly when the economy is expanding, but over the last few weeks, as we approach year's end, Mr. Greenspan has pumped billions of inflationary dollars per day into the banking system just for Y2K (and to prop up bubble.com). Presumably, his assumption is that these reserves can then just as easily be withdrawn when January passes with no significant Y2K failures. Not only is he wrong about the failures, he is also wrong about withdrawing the reserves. It is much easier to print monopoly dollars than to withdraw them, especially at this time when the dollar currency itself is on the verge of collapse (more about this below). Even if there were no Y2K failures at all, by the time he thinks it safe to withdraw these dollars, he will find they have already cycled into the broader economy, starting a new period of the dreaded inflation and, by inference, finally bursting bubble.com and unleashing all of those related consequences.

You see, it really is the economy, stupid (to quote our faithless leader). It always was, and Y2K itself was never anything more than the trigger, in spite of its own massive potential for disaster. I've often seen Y2K compared to the Titanic, a doomed ship waiting to sink us all. More accurately, Y2K is the deadly iceberg sitting silently in the ocean waiting for its unsuspecting victim (and what we have seen so far really is just the tip of the iceberg). Aboard the good ship U.S.S. Enterprise it's "full speed ahead, Mr. Greenspan, and damn the icebergs, we're unsinkable". While in the first class lounge the bubble.com band plays endless soothing songs for the sheeple, oblivious to their coming shock and terror.

With the iceberg dead ahead it's already too late to change our course. The time and place of the collision are fixed and the outcome is already determined. In spite of the Captain's unfounded optimism, the good ship Enterprise is not so strongly built and it certainly isn't unsinkable. The protecting plates of the hull, and the rivets holding them together, are brittle and ready to break under the slightest unexpected pressure. After they do, the watertight compartments will not work as designed and we'll find there are not enough lifeboats for all of us on board. But the bubble.com band will keep on playing, until they sink with the ship and most of the dancers to their deadly siren song.

In the real economy, the bubble.com insanity is the most obvious immediate danger and its inevitable collapse will have immediate repercussions for everyone on the face of this planet. It is often compared to the "irrational exuberance" which preceded the 1929 crash and the Great Depression. But there's really no comparison at all. Real measures of value, like P/E ratios, earnings and dividends, show that this is by far the most overvalued market since the beginning of the industrial revolution, far more bloated than in 1929. It is quite probably the biggest bubble in recorded history. When it bursts, as all bubbles always do, the effects will be worse than any crash in history, worse than the Great Depression, and they will last for decades if not centuries. This is true even if there were no Y2K problem and there were none of the other economic problems I describe below.

As with Y2K, there are no reasonable chances for a short term recovery. In 1929, the economy was driven mainly by manufacturing and agriculture whose products were still required after the crash. Today, it is driven by services which will be much less in demand. In 1929, we still had good money, backed by gold. Today, we have monopoly money backed by nothing more than promises to pay debts which really never can be repaid. In 1929, we had a very small national debt and no welfare state. Today, we have massive government debt and a millstone welfare state which together suck the majority of the revenues raised by the Grabit. In 1929, taxes were low enough that revenues could be increased by additional taxes (like Antisocial Insecurity) without worsening the problem or starting a revolution -- we could still spend our way out of the depression. Today, total taxes are in excess of 50% of income and cannot be further raised without further damaging the economy and actually reducing revenues.

In 1929, the general population was reasonably well educated in practical matters, were able to maintain their own homes, cars and equipment, and could hunt and gather much of their own food. Today, the masses have been dumbed down by an educational system which teaches little of practical value, very few are now able to perform the most rudimentary maintenance on their vehicles and even fewer know how to hunt or grow their own food. In 1929, the average family was a solid unit, with many children and all able to rely on each other for support. Today, many are single mother units, with few of the traditional male skills, low incomes and a high dependency on the welfare state. In 1929, people were generally peaceable and respected their elected and appointed authorities. Today, Y2K terrorism is a real possibility, children have no qualms about shooting each other in school and there is little love or trust for a Grabit which has grown far beyond control and no longer serves a useful social function. In 1929, the average worker walked to work (or rode a bicycle). Today, he lives miles away from any commercial or industrial center and cannot go to work without a car (and the endangered oil it takes to run it).

Lest you think this is too far in the past, and things are "different" now, let's compare to a more recent example, Japan in 1990. Japan had a positive balance of trade, ours is increasingly negative. Japan had a positive savings rate, ours is negative. The bubble in Japan was relatively small, ours is the largest in history. Japan has an industrial economy whose products are mainly useful, ours is a service economy whose products are largely superfluous in a belt tightening situation. The Japanese people are generally well educated, hard working, respect authority and are unarmed. Ours are generally ignorant, lazy, dependent on Grabit handouts and are likely to use their own or stolen weapons to plunder and murder their better prepared neighbors when the Grabit can no longer bribe them with welfare handouts. The global economy was not dependent on the Yen, it is dependent on the dollar (because of the need to buy oil with dollars, as we shall see). Japan still had a foreign market when their bubble burst. We won't.

With all of these advantages, ten years later, Japan still has not recovered. Even the claimed turnaround earlier this year has fizzled out, with continued contraction of their economy in the third quarter. Their projected, minuscule 1% growth for next year is nothing but pie in the sky nonsense, especially given their extremely late start in the Y2K races. Now consider that after 1929, even with a smaller bubble, and lots of monopoly money inflation, it still took more than twenty five years for stock prices to recover their pre-crash values. But, we are told, this is a new "paradigm" and all we'll see is a tiny bump in the road. Nowadays, we really know how to precisely control and fine tune our economy, even though the intelligent Japanese, using the same bankrupt economic theories, completely failed to save their own economy just ten short years ago. According to our new age counselors, if we just leave our money in the market it won't crash and any "corrections" will be small and rapidly recovered. I think not. The only "pair a dimes" I'd put my trust in were minted in silver before 1964.



-- Andy Ray (andyman633@hotmail.com), September 17, 2000

Answers

from Charlie Reuben, Dallas (home of the Meyerson Symphony Hall, the US' BEST)

RE: All XBASE Problems in the PC and the Cost to Fix (25-50 Billion???)

Part I: Introduction to the PC XBASE Problem:

No one on these Year 2000 study groups is looking at the real problem for the PCs. As they say every 4 years here, "It's the APPLICATIONS, dummies!!!" Not the programs. It is what was done with those programs that has built up a mess that is a scaled-down version of the MAINFRAME business application problems. And... it's worse, because there is some agreement on what you can do to isolate, identify and fix the Big Boys. There has been little said or done that even indicates that ANYONE is looking at the most widespread mess of all, the XBASE problem.

The question of the PC/Mac/small-user problems vis-a-vis the Year 2000 problem has not been quantified properly, because the problems have not been evaluated in depth. Given the magnitude of the global 2000 woes, this is understandable. There has been much press on the Gartner study claiming 300-600 billion for the larger Enterprise fixes. Little has been said of the micro-bombs in the micro computers (the PCs/Macs etc). Using conservative extrapolations from several data sets, I arrive at a minimum cost to Small Business in the range of 25 to 50 BILLION dollars (with a mean of 37.5) to get these houses in order. I "believe" it will be double that range if properly measured, maybe more. Much will be passed off as "upgrading", because for small businesses there are tax advantages that large corporations cannot utilize if they "upgrade" just for Year 2000. (See Yo' 'Countant.)

I define Small Business as businesses with less than 100 employees. The US Census says there are 38 million of these. The Dallas Chamber of Commerce says there are 76,000 in Dallas County alone (2 million souls, mostly all perfect Angels). If ten percent of those companies spend only $10,000 "fixin'" (Texas tech-speak), that be... $76 million. Let's look at the 38 million. Same thin', Lucy... $10,000 x 3.8 million. Let me help you: 3.8 mill x 10 = 38 mill, x 10 = 380 mill, x 10 = 3.8 BILLION, x 10 = 38 BILLION.
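For the skeptical, the same back-of-envelope spelled out (the 10% take-up and the $10,000 price are the assumptions stated above):

* envelope.prg -- assumed 10% take-up at $10,000 each
? 76000 * 0.10 * 10000      && Dallas County alone: $76,000,000
? 38000000 * 0.10 * 10000   && nationally: $38,000,000,000 -- $38 BILLION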

Look at it in the MOST elemental way. Dallas County (Chamber again), 1-4 employees: 41,947 firms. Is it reasonable to assume that a "business person" might be willing to spend $10,000 on computers? If she knew she had to file electronically and the computers made her money??? Forget all the people who "can't afford it" and think only of the 10% who WILL PAY. I think it quite reasonable that a small 1-4 person company would lay out $10,000 to get working in 2000, but I will hedge my bet, knowing that 1/2 of those will be history by 2000, and so I'm really betting that 1/5 of the survivors will pay that kind of money. (Two desktop machines, a laptop, software and some VAR help.) If they already have the machines and software and choose to "modify" the apps, who gwine do it??? The VAR who wrote the air conditioning service software and only charges $5,000 a copy?? Or FTD, who charges every florist X dollars to use their software, etc.?

It's a true "pay me now or pay me later" deal. So, considering ONLY 10% of the very smallest companies in Dallas County, we have: 4,195 x $10K = almost $42 million. Plus TAX. (Dallas County pays for its Year 2000 problem from that alone, I hope.) I will calculate this another way later.

At the heart of the cost will be the modification of, or migration from, any customized software applications. Not the BIOS problem and NOT technically defective software. Recently, some efforts to catalog the Generic Software have been made. They MISS the central ISSUE. For it is in the modification of the Generic Software (whether it be a language or a "type" of software such as XBASE or spreadsheet) that the small, mid-range and even Enterprise-level Year 2000 problems will arise. I do not think that the cost of the XBASE problem has been included in most of the Year 2000 studies, yet the applications infest many large companies and must be fixed. Given the nature of corporate non-disclosure policies in most companies, such proprietary information is not "given out". I can only think that if there are 8 million "seats" for Lotus NOTES there must be 5 times as many customized "things" sitting in the big companies.

In small companies, they don't even have a clue yet. NONE. Zippo.

In XBASE... there is NO GLOBAL FIX. NONE.

I repeat: there is no GLOBAL FIX and... there are zillions of problems. "Zillions" is a technical term for a matrix of millions of application programs multiplied by a matrix of millions of users (note: it's really a TENSOR, but let's not get that technical). The ultimate nightmare of the I.T./I.S. pros came true... "THE USERS GOT LOOSE".

Unlike the "Big Guys", the little players don't know yet. AND... worse, their problem will be more costly and possibly not fixable. WHY?? Simple. The Big Guys' problems are really analogous to a Boeing or Lockheed jet-craft user problem. The airlines have internal experts, external consultants, manufacturer's consultants and expertise. More brains and bodies can be hired. At the other end of the spectrum, there are the car owners of the PCs, with many makes and models all "customized" for the users, with many different kinds of driving styles and paint jobs. The computer is the "Universal Machine". Give it a problem and a way to do something and it will work on it. It is "silly putty" for brains. Soon it will "replicate" and truly make us humble, as it does winning at chess. But... in the PC world you have all these machines and all these programs but NO FACTORY-TRAINED MECHANICS certified for Year 2000. There are NO university courses in Year 2000 and prior experience is not in the job description. It never happened before. What's a toy maker or a chain of dry cleaners to do??? There are only "hints" in the Heloise-like columns of the PC mags. NO real "sink your teeth into the code" solutions. To know the XBASE problem, you have to be able to read the code, and that is not your Mother's bedtime story. AND... you really have to know whether it "can" be fixed. And... you have to know whether or not you can get the "data out" in a meaningful way. It goes without saying that you must back up everything. All financial stuff is needed for the you-know-who anyway. In its original form. (Maybe or maybe not; see your Accountant.)

In the very worst-case scene comes the compiled software for a nice little company whose VAR retired. The source code is gone to the Dumpster in error. The applications all have 2 YY date displays and ALL manipulations in the machine see only 2 YY. These are the Cinderella class: "DEAD DUCKS". Gonzo. They will fail or lock and the data will not be retrievable. Think it can't happen? Ask any computer person what "losing the pointers" means. If the program loses its "reference points", it either stops and locks up, or loops forever, or simply "gives up" (crashes). You boot it up again with your backup and same thin', Lucy... but worse, because now the backup is gone. Can't happen??? Take it from me, the proud owner of $20,000 worth of zeros and ones when a "proprietary accounting package" lost its pointers. On a less stressful note, the BIOS problem of the micro machines themselves is well known. A new retrofit chip or machine may solve that problem. IMHO, a better, safer fix for many will be the software patch written by Tom Becker (formerly of Dallas) while he was here in Dallas. Becker's program is available at www.RightTime.com or by link from my home page sites. Chip upgrades across large applications should be done by professionals. Because no machines were really "Y2K" enabled much before 1996 (save for the claim of Gateway), and there is an installed base of 100 million or so machines, the choice for businesses is simple: install a new BIOS, use Becker's solution, or enter the date in January of 2000 and pray. Let's say that only 1/3 of the installed base choose to use the former two. At a cost of $50 per fix, plus labor at $50 (you clearly can't let Steve Salesman do it, because he spills coffee), that's only some $3.3 billion. Where does the rest of my mean $37.5 billion get spent?
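The BIOS slice, spelled out the same way (the one-third take-up is my assumption, as stated):

* bios_cost.prg -- a third of the installed base, $50 parts + $50 labor
? (100000000 / 3) * (50 + 50)   && roughly $3,333,000,000 -- some $3.3 billion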

Part II

Consider the XBASE problem. There are 8 million copies in the dBASE installed base, per Borland's numbers, all built around dBASE's database "engine". Let's triple that, to arrive at a modest count of copies of database software in the field including "compiled" applications, and double that again to reflect the multiple applications per desk: roughly 48 million.

MOST, if not all, of the "custom" software applications use date field information in some manner, just as COBOL applications do. Since dBASE, FoxBase, SuperBase, Paradox and Clipper, among others, were "easy to work with", readily "adaptable" and had the database at their core, these were often the center of development for VARs and "business partners" (B.P.s), aka "Solution Providers". There was also a great deal of BASIC at first, and then C, and now, of course, C++. (JAVA is next.)

In the "off the rack" application department there were programs like ACT and Quicken that were built one way or the other that were not Year 2000 Compliant (both are now according to reliable reports and migration is easy). Technical flaws in other well known programs have been corrected to implement Leap Year and Year 2000 4 YYYY changes. For all practical purposes, these were probably not mission critical applications (except that a lot of business could be lost if data in a Contact Manager were lost to a sales or Marketing person.). Again, migration and labor might cost $100 (though if I had a site license for something that was defective I personally would want it for free hint hint to Gates and MS). So doubling the 3.3 Billion again is probably in order. Where's the other 30 Billion???

XBASE... and all those wonderful "integrated" customized programs that are literally everywhere. Some even wrap all the business applications of one industry for a company. And the VAR was clever. He made his "solution" scalable, so the company with 1,000 employees is using the same mess that the ones with 50 or 100 use. The big bird paid for all the others and the rest was sheer gravy. Many, many of these were written in dBASE III, IV or FoxBase. In many cases, the programs were implemented in the mid or late 80s and are still working their hearts out. ALL in mm/dd/yy.

The biggest adventure for these was to move them to Windows 3.1 (such a thrill I had to pass on until 1995).

And... note, it is NOT Borland's fault. They built a perfectly wonderful machine (for me, anyway) and it's been "compliant" since when... 1985??? What's the problem??

Consider the following: in ALL the XBASE engines there is a choice of settings for the display of the time and date. The US standard (to our regret now) has been MM/DD/YY. Has anyone you know EVER entered the year 1996 in a program, except for a letter to your Mother in a word processor??? USERS are fools. Programs need to be FOOLproof. Whence the 2 YYs. Why give the fools a chance to enter 1896 or 1886? Besides, in the mid-80s those 2 extra YYs added up, and since they were all 19, why bother? As long as the "engine KNEW", what was the harm?

Enter the problem, stage left. IF you enter any date up to 12/31/99, the machine sees Dec. 31, 1999. Enter 01/01/00 and the MACHINE SEES Jan. 1, 1900. THAT is critical. Proof of this is simple, and proof of what the machine sees is simple.

Enter 02/29/00 and you WILL GET... "invalid date" or some such message, BECAUSE there was no leap year in 1900. Test further and enter leap year 1904 and it goes down smooth. Now go back to the command line of your database (if you can), write {SET CENT ON} and proceed as above. The leap year 2000 will go in and all seems well.
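For the doubters, a minimal session you can try at the dot prompt of most XBASE engines (the exact error messages vary by product):

* century_demo.prg -- what the engine "sees" with 2 YY dates
SET CENTURY OFF
? CTOD("02/29/00")       && rejected/empty: the engine reads 1900, and 1900 was no leap year
? CTOD("02/29/04")       && accepted: 1904 was a leap year
SET CENTURY ON
? CTOD("02/29/2000")     && accepted, now that all four digits can be entered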

So... to the LAYMAN, or even the non-PC computer guys, the next question is: "What's your problem? Do a 'global fix'." AND that is EXACTLY what people on the Net mailing lists think is the end of the XBASE problem. It is not.

Part III: A Case Example:

Consider:

Super Widgets, Inc. makes 1,500 line items of stock widgets and customizes some 5,000 more for special customers, with 500 manufacturer's reps sending in orders by fax, phone and telex. Sometimes one of the more advanced uses email. The process begins. Somewhere along the way, Sam Super decided to "computerize", so at the country club bar he found out from the bartender that I.M.A. Schlocker specialized in small computers for "important business men like Mr. Super". I.M. writes up a full-scale integrated package for all of Sam's departments and pretty much does a bang-up job using Zippy BASE. (I love Borland, so I don't want to say dBASE or the Foxy word.) Many of them look a bit like this:

*--------------------------------------------------------------------
* foolish.prg ... prints a list of foolish things business men
*                 have to do on a given date
* Written in Zippy Base by I.M.A. Schlocker, 4/1/87
* Property of I. Schlocker and Co.
*   Call for special programs at my day job at 7-11 (212-555-1212)
* Modified by E.B. Schlocker II, 4/1/92
*--------------------------------------------------------------------
CLEAR ALL
SET COLOR TO Y+/B
SET CENTURY OFF                 && the fateful default: 2 YY dates
USE foo INDEX foo, foo2, foo3
DO evnmofoo                     && (evnmofoo is a routine defined elsewhere)
@ 20,20 SAY "Printing a list of today's items for you"
REPORT FORM foolist.frm
CLOSE ALL
RETURN

NOW... the part I want to stress is that for each and every little one of these here guys, inVARiably there IS a date field, and inVARiably the neighborhood VAR set the CENT off (just in case) to make sure that some two-fingered typist didn't have the chance to muck things up much.

Never did anyone exit a program and SET CENT ON. So you have, in every possible way, shape and form, applications written "to order" for some "entity" or "market", each with its own peculiar way of doing things, names of fields, methods and procedures which internally depend upon those names and data types to produce whatever was supposed to come out the other end of the machine.

Application TWO (mostly P-code):

order widget
issue ship orders
issue invoice
post A/R
monitor A/R
call for check when late
post A/R paid
post G/L

If we just look at "issue invoice":

ship stuff
shipping notice and/or bill of lading
terms: 2% ten, net 30 days; 1.5% if over 60 days, retro to 30
(pretty standard)
integrated with A/R

Here we see the need for the use of memory variables ("mem vars"). If those mem vars depend on a 2 YY date field and cannot read a 4 YYYY field, we need to rewrite this application. And... we now open up an entire can of worms, especially the more tightly knit the entire "integrated package" was built. Since every name and field may have some use in other programs and modules, tweaking Super Widgets' nose can make his feet hurt or even drop off completely. This is what your Pascal teacher was trying to tell you about "goto"s. Now that we have "objects" and OOP, maybe this kind of thing will be gone forever. I don't wager on such things. Nobody ever lost money betting on the bad judgement of people.
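Here is a sketch, in the same Zippy Base idiom, of what that A/R aging step does to Super Widgets after the rollover (the variable names and dates are mine, for illustration):

* aging_demo.prg -- invoice aging with 2 YY dates (illustrative)
SET CENTURY OFF
STORE CTOD("12/05/99") TO m_invdate   && invoiced December 5, 1999
STORE CTOD("01/10/00") TO m_today     && the engine reads January 10, 1900
STORE m_today - m_invdate TO m_days   && date subtraction yields days elapsed
? m_days                              && -36488: a century in the wrong direction
IF m_days > 60
   ? "1.5% retro to 30"               && the late-charge branch never fires
ENDIF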

If we instead choose to "migrate", we need to know that NOW, because for MOST companies... there is still time to "convert/expand the date fields" so the data can be "pushed into" the new application, tested and fielded. Soon that window will not be there, and for the PC users there is NO TIME, NO HELP and NO precedent for this kind of thing. Who you gonna call??? "Data Busters"????

Part IV: The Costs

Who has to "do this" fixin' or migration to new software??? Well, just like the Big Birds, YOU do. There are two ways to calculate this.

ASSUMPTIONS:

1. The cost of buying new software, modifying it and migrating data and users to it will probably be the same as "fixin'" (Texas tech-speak) the OLD, or MORE.

2. The cost of inventorying, analyzing, "fixin'" and testing/fielding the old increases as the number of users and the size of the company increase. This is probably exponential or sigmoidal. Small companies with fewer than 20 employees and a "modicum" of expertise would do well to consider migration, burying the cost as a "capital improvement" (see your Accountant). This is not an option for large companies, who must expense Y2K "fixin'".

ROUGH GUESSTIMATES:

APPROACH ONE:

Assume all "iron" stays in place. Cost of BIOS, mandatory replacement of obsolete software (old versions of ACT or Quicken or 123), plus modification of code of vertical XBASE software: $1,500-2,000 per machine/user. Stir in "user downtime", "administration costs", the whole bit. I think that's "cheap" if it represents 1/2 hard costs and 1/2 soft. What's it cost to "train" a user??? I'm cheap, because you just give me the stuff and I use it. Joe Stumble in sales needs what??? $500 in training. CompUSA did $350 million in "Corporate Training". They charge to teach you how to use what they sell you. If "the boss" teaches, in a very small company, what's that time worth???? In actual billing or sales time vs. lost time?

Let's use the widely published "# of PCs used in business": 100,000,000 worldwide. IF... only 1/4 do anything about Year 2000, plug in the numbers, and LO and behold... some sort of range: $37.5 to $50 BILLION. (Note: the percentage of people using PCs for business who will do something is greatly elevated from the 10% given at the start of this paper, because for any PCs in ANY serious sort of application in a big or mid-range company there is "no question" that they HAVE TO DO IT or risk downtime in production. Period. That would imply that 90% or more of any PCs in corporate hands would be made Year 2000 compliant. So the only difference here is the ownership of the machines. This further implies that the cost estimates given are probably far too low.)
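Plugged in, with the one-quarter take-up assumed above:

* approach_one.prg -- 100 million business PCs, a quarter acting
? 100000000 * 0.25 * 1500    && low end:  $37,500,000,000
? 100000000 * 0.25 * 2000    && high end: $50,000,000,000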

APPROACH TWO:

New iron and software. New programs. Try... the Sun Micro numbers: a PC costs $10-12K per year for a big company. What is the "incremental" cost if you have to consider Year 2000 because you are "forced to"?? You have the same costs as above in "soft costs", PLUS you have the extra cost of "migrating what has to be migrated" and storing what has to be saved for the IRS and your industry standards. Lawyers may save "forever". Doctors and medical too. So, is the "incremental cost" forced by Year 2000 reasonably the same as above, or more??? Given the trick of "depreciating it" by making it an "investment" instead of a household monthly cost, it probably works out the same (but see Yo' Accountant). (SIDEBAR... I'm real good for a Real Estate person. I send people to Lawyers and Accountants, two groups known as "deal killers" to the unwashed members of my Business Tribe.)

So again, we have the same results, more or less (probably more)... a mere 37.5 to 50 BILLION DOLLARS. All to be spent unwillingly, for something that we can't even moan about or vote on (like Congress).

Best,
Charlie Reuben, Charles P. Reuben BS,MA
Hancock Properties DALLAS
8600 NW Plaza@Hillcrest S:3-C
Dallas,Tx. 75225-4210
214-369-9502, or f:369-9337

E-ME: texasrltr@aol.com
Year 2000 Links http://members.aol.com/texasrltr/buytexas/index.htm
(temp.pages)
Member: GDAR,TAR,NAR, and
..the Gt.Dallas Ch.of Commerce


-- (xxx@xxx.xxx), September 18, 2000.


NEAT.

MY 1996 to 1997 PC Xbase web page. I guess EVERY LAST OWNER **DID** SOMETHING.

LOLOL

-- cpr (buytexas@swbell.net), September 18, 2000.


Moderation questions? read the FAQ