Paging 'a': Your Read On Testing Status


a -- knowing your expertise in testing, can you give us a detailed heads-up on your current read of Y2K remediation status with testing progress in view?

This could include your take on when/how/whether apps will be frozen and what will happen (vis-a-vis remediation/testing) in the early months post-rollover.

Much appreciated.

-- BigDog (BigDog@duffer.com), July 07, 1999

Answers

Well let's see. As of July 1999 we have

So I would say that just as 30 years of IT metrics on software development don't fly out the window just because y2k is an important project, 30 years of lax testing procedures and protocols don't either.

As far as freezing baselines goes, I think most are between a rock and a hard place. If not enough of the fixes have been put in place, a freeze is pointless. On the other hand, if you fail to allocate enough time for proper regression testing before putting code back into production, you're screwed. For those shops that started too late, there is no happy ending to this story. Which is probably why the emphasis is shifting to contingency plans, as Cory has been recommending since day 500, and which large firms are now seriously undertaking.

As to how it will unfold, I don't know. I'm still at 50% chance of a Yourdon 10-year depression, 20% chance of a full blown Milne, and 1% chance of Infomagic devolutionary spiral. This prediction from coprolith on another thread is what I would envision:

July 1: not much happens. business as usual.

Jan 1: despite noble efforts, lots of mainframes croak. The electricity is flickery in most places but not blacked out. no cash in the cash machines, little if any food in the stores. lots of hangovers. in big cities there are armored personnel carriers manned by people wondering what the heck the big deal is about. Most decide that y2k in the States was much ado about nothing, but eye problems overseas with a worrisome glance.

Jan. 3: big huge critical errors cripple 10-20% of all businesses. In 50% of businesses, errors slow but do not stop the work. Ripple effects are speculated about, and the stock market goes down 25-30% by the end of the day. Air traffic grinds to a near halt with gridlocked flights. It becomes obvious that we're going to run out of gasoline and refined petroleum products.

Jan. Ripple effects take hold, and businesses begin to fail. The stock market is half what it is now by month's end. Gas is rationed and too expensive to support all but essential auto use.

Jan - March: Solar flares, viruses, or sabotage cause a massive blackout, difficult to control or contain after previous y2k failures were swept under the rug. Civil disturbance ignites.

April: things are really beginning to spiral downward. Chaos. Tax collection is a joke. Electric power is rationed so industry gets it in the day and homes get it at night.

June-Dec '00: Things bottom out. Food becomes scarce. Unemployment is at 25-30%, but over half of the people have lost the jobs they held in 1999. A state of emergency is declared. A major war breaks out somewhere in the world but the US is too sick to bother with it now.

2001-2006: This time is known as the 2nd Great Depression. Recovery is slow (at first) and happens in fits and starts before taking off like a rocket. Necessity enables some brilliant new technology to be invented which fuels a juggernaut recovery.

After that, who knows?

coprolith (coprolith@rocketship.com), June 13, 1999.



-- a (a@a.a), July 07, 1999.


Okay, a, thanks much (groan, more beans and ammo coming up) but here is a bit of an odd-creative question, from one techie doomer to another:

if we escape the 71% bad scenario, for what rational SYSTEMS-based reason would it be escaped?

That is, would it mean more had been fixed than we anticipated, more tested than described, the systems were more resilient or whaa?

And how does a GOOD result mesh or not with your own convictions about the complexity curves we are reaching with software in general? That is, describe a technical trajectory in 2000 and beyond based on a "good" result.

Again, much appreciated (hey, Cory, why don't you include some of a's testing expertise in an upcoming WRP?).

-- BigDog (BigDog@duffer.com), July 07, 1999.


Thank you for sharing, guys.

Will

-- Will (sibola@hotmail.com), July 08, 1999.


BD: I think due to my pessimistic nature, I may be incapable of answering your question :)

However, if we have a "GOOD" result, all I can say is that I was wrong, and my nightly prayers to God were answered. Hopefully things are gonna be a lot more resilient than I expect, but the odds are heavily stacked against us.

Here is a reprint of my earlier post on complexity curves for newbie reference:

In the software development world, it is common knowledge that the first casualty of shrinking budgets and diminishing schedules, after Training and Documentation, is Testing. In my profession, I have seen the complexity of systems (function points, lines of code, number of interfaces, etc.) rise exponentially, without a corresponding increase in testing. The result has been buggier code, more severe slippage, and higher costs (sound familiar, Mr. Gates?). Although I started warning my customers and managers about this phenomenon about five years ago, I was always greeted with disbelief, ridicule and laughter. They aren't laughing anymore, but they still haven't grasped the concept.

                                          O    
C                                      O       
O      O=ACTUAL                      O         
M      x=PERCEPTION OF MANAGERS    O     
P                                 O            
L                                O |           
E                               O  |           
X                              O   |           
I                             O    |         x 
T                           O      |    x      
Y                         O        x           
                       O      x    |           
                    O    x         |           
                O   x              |           
            O  x                   |           
       O  x                        |           
  O  x                             |           
 
1990             1995             2000
Figure 1. System complexity over time, projected without Y2K factor

Figure 1 depicts the way I expected this effect to manifest itself before I understood the systemic nature of y2k. I expected a continued rise in complexity (with a continued decline in testing) until a certain threshold was reached, at which point the curve would start to plateau.

                                               
C                                              
O                                              
M    O=ACTUAL                      O O         
P    x=PERCEPTION OF MANAGERS     O    O       
L                                O |      O    
E                               O  |         O 
X                              O   |           
I                             O    |           
T                           O      |    x      
Y                         O        x           
                       O      x    |           
                    O    x         |           
                O   x              |           
            O  x                   |           
       O  x                        |           
  O  x                             |           
 
1990             1995             2000
Figure 2. System complexity over time, projected with Y2K factor

Figure 2 is my revised projected complexity graph which takes the y2k effect into consideration. The breakdown of technical infrastructure corresponds to the cusp in the graph around 2000.

My point in relating all this is that most people (and I'm talking engineers here at work) could not see the effect I described in Figure 1 coming, and are still not sure what it is they are experiencing. So the fact that there are engineers who do not understand the consequences depicted in the second graph is not so surprising.
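For anyone who wants to play with the shape of these curves, here is a minimal sketch of the idea behind the two figures. The growth rates, the mid-90s divergence of management perception, the plateau threshold, and the post-2000 cusp are all illustrative assumptions chosen only to reproduce the general shapes above, not measured data.

import numpy as np
import matplotlib.pyplot as plt

years = np.linspace(1990, 2003, 200)

# Actual complexity: roughly exponential growth through the 1990s
actual = np.exp(0.35 * (years - 1990))

# Managers' perception: lags behind, growing only linearly
perception = 1 + 0.45 * (years - 1990)

# Figure 1 shape: without the Y2K factor, complexity plateaus past a threshold
threshold = actual[years <= 2000].max()
fig1 = np.minimum(actual, threshold)

# Figure 2 shape: with the Y2K factor, complexity breaks down (cusp) around 2000
fig2 = np.where(years < 2000, actual, threshold * np.exp(-0.8 * (years - 2000)))

plt.plot(years, fig1, label="Figure 1: projected without Y2K factor")
plt.plot(years, fig2, label="Figure 2: projected with Y2K factor")
plt.plot(years, perception, "--", label="Perception of managers")
plt.xlabel("Year")
plt.ylabel("System complexity (arbitrary units)")
plt.legend()
plt.show()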



-- a (a@a.a), July 08, 1999.


I hope we can get back on track within 6 years.

a, is the increase in complexity shown on your graphs compressed for ease of viewing? I would have expected the managers' expectations to increase in a linear manner, while the actual complexity increases in a much faster, exponential or logarithmic fashion.

Just curious.

-- Jon Williamson (jwilliamson003@sprintmail.com), July 08, 1999.



In theory, and your post-Y2K graph reflects this, breakdowns might have the unintended positive effect of reducing the number/type of working systems and/or their complexity. This is a variant on the "we didn't really need all those systems anyway" argument.

It would be ironic if Y2K is akin to nature's way of halting non-survivable ecologies. That is, we begin from near ground-floor with a "new" post-Y2K depression information infrastructure ....

One reason I pressed you to think of a "good" result is that, otherwise, your own 29% positive scenario is just wishful thinking, TECHNICALLY speaking. Of course, it falls out by exception from the other scenarios, but I had been sincerely hoping you had a theory to back the positive stuff up.

-- BigDog (BigDog@duffer.com), July 08, 1999.


Jon: yes that is what it is supposed to depict. Not scientifically, just from my observations.

BD: I agree with what you say re. y2k being a "safety brake" for the planet. But I'm confused about where you get the 71% chance of "bad" effects / 29% "good"? IMO, we're looking at 99% bad / 1% good; the only question is how bad.

-- a (a@a.a), July 08, 1999.


a --- Yourdon+Milne+Info added up to 71%, I thought? By contrast, I figure 29% is the "good stuff"!
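(For newer readers keeping score, that split is just the sum of a's three scenario weights from the July 07 post; a quick back-of-the-envelope check, with the scenario labels paraphrased:)

# a's stated scenario probabilities, as given in the July 07 post
scenarios = {
    "Yourdon 10-year depression": 0.50,
    "full-blown Milne": 0.20,
    "Infomagic devolutionary spiral": 0.01,
}

bad = sum(scenarios.values())   # 0.71 -> the "71% bad" figure
good = 1.0 - bad                # 0.29 -> the "29% good stuff"
print(f"bad: {bad:.0%}, good: {good:.0%}")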

-- BigDog (BigDog@duffer.com), July 08, 1999.

Oh, I gotcha. I guess it depends on whether it's a good good :)

-- a (a@a.a), July 08, 1999.
