NERC Redux - the saga continues


From the CPSR Y2k mail list comes some independent validation of my NERC analysis:

Hello CPSR -

I have been on your distribution list for about 10 months and have very much appreciated the insights offered by your members. I am an independent IS Consultant with 20+ years in the trade, mostly in the management of large projects, having run a 300+ staff consulting group at one point in my career with contracts up to $25 million.

A recent assignment for a local hospital involved researching the "facts" about what has been coming out of NERC, and Washington in general, regarding the electric industry. Of particular interest to this client is the likelihood of disruption to the Northeast power industry. After spending about a week crawling inside the NERC website, reports and spreadsheets, I have come to some very disturbing conclusions. The "percent complete" methodology followed by NERC matches no standard project accounting method I've experienced. It is misleading to the point of absurdity.

To excerpt from my report for this hospital client:

Percent Complete vs. Percent Done

The NERC reports rely on an average of the percent complete reported by the individual participants and the average estimated completion date. As of November 30, 1998, these averages for the three major tasks NERC has chosen to report are:

Task                   Avg. Estimated Completion Date   Avg. Percent Complete
Inventory              8/25/98                          96
Assessment             11/16/98                         82
Remediation/Testing    6/6/99                           44

The averages are calculated as the total percent complete or date divided by the number of respondents. This number also combines all components of the electric industry across all geographic regions, so that a generating plant in Idaho that is ahead of schedule will offset a distribution company in Maine that is behind. As discussed earlier in this report, there are fewer alternatives in the transmission and distribution components, and progress in one component does not replace slippage in another.
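
A minimal sketch of that averaging effect, using hypothetical participants and percent-complete figures (none of the numbers below are taken from the NERC data):

    # Simple (unweighted) averaging of percent-complete figures across
    # participants, as the NERC reports do. All figures are hypothetical.
    reports = {
        "Idaho generating plant":   95,   # well ahead of schedule
        "Midwest transmission co.": 60,
        "Maine distribution co.":   10,   # badly behind, invisible in the mean
    }

    simple_average = sum(reports.values()) / len(reports)
    worst_case = min(reports.values())

    print(f"Average percent complete: {simple_average:.0f}%")   # 55%
    print(f"Least-complete participant: {worst_case}%")         # 10%

The headline average looks respectable even though one participant has barely started, which is exactly the offsetting problem described above.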

The average dates reported are also misleading. If the goal is remediation of all mission-critical systems by a certain date, then tracking the average completion date is an inconclusive measure of progress toward that goal. Every day ahead of schedule on one component should not be used to offset slippage in another component.

The meaning of these averages is further confused by some specific instructions from NERC. The spreadsheet states:

% Complete - Report as amount of work completed in each phase divided by total amount of work to do in that phase. If no remediation and testing is required in an area that was inventoried and assessed, then show remediation and testing as 100% complete.

This instruction has the effect of overstating the percent complete of a participant in the remediation/testing task. Percent complete as used in the NERC reports is the percentage of systems that have been tested, not the percentage of the Y2K work that has been accomplished. There is a major difference between the two. For example, if a company has 20 systems and 10 of them did not require any remediation, they would report 50 percent complete. This is extremely misleading, as it implies that the 50 percent remaining requires the same effort as the work completed. In actuality, no inference about the size, scope or schedule of the remaining work can be made. Although the NERC November report states that 44 percent (on average) of the systems have been tested, it is not necessarily true that the remaining 56 percent will require the same effort or can be completed at the same pace.
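
To put numbers on the 20-system example above, here is a minimal sketch; the assumption that none of the 10 affected systems has been remediated yet is mine, for illustration only:

    # The hypothetical 20-system company: 10 systems need no remediation and,
    # per the spreadsheet instruction, are counted as 100% complete; assume
    # remediation has not yet started on the other 10.
    systems_total = 20
    systems_clean = 10   # compliant as found
    systems_fixed = 0    # remediation not yet started on the rest

    # Percent complete as the reports effectively count it (systems closed out):
    pct_by_system = (systems_clean + systems_fixed) / systems_total * 100   # 50%

    # Percent of the actual remediation work that has been done:
    pct_by_work = systems_fixed / (systems_total - systems_clean) * 100     # 0%

    print(pct_by_system, pct_by_work)   # 50.0 0.0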

It is likely that the remaining work will require more effort and resources than what has been reported to date, as many of the completed systems required little or no remediation.

The NERC spreadsheets give no indication of the number of systems that will require remediation, or of the number of components that have been assessed, found non-compliant, and must be replaced.

There is no estimate of the scope of the work left to be done in any of the NERC reports. Without this information the likelihood of the completion dates being met cannot be directly determined or even estimated.

(End snip)

I would very much appreciate comments from others in CPSR. Has anyone else looked at this??? It looks like political spin at its worst. Richardson, Secretary of Energy, was quoted in the NY Times saying "We can be cautiously optimistic about the prospects for the industry meeting its Y2K challenge" and that "tests and repairs are now more than half done".

More than 1/2 done??? The details in the NERC report simply do not support this.

Rich Hawkins Kingston Consulting, Inc. www.kingstonconsulting.com

-- Anonymous, February 06, 1999

Answers

Excellent commentary by Rich Hawkins. And after hearing the cop-out, play-the-game, hope-this-helps-my-court-case statements by Bill Gates this week, I needed some inspiration. With regard to NERC, I seem to remember Bonnie questioning the same practice of averaging percentages to arrive at the amount of work completed in the industry as a whole. You know what they say: numbers don't lie, but accountants do.

By the way, I tried to get into the URL listed above for Kingston Consulting and couldn't get through. Is this for sure correct...www.kingstonconsulting.com?

-- Anonymous, February 06, 1999


Meg,

That URL worked for me, and led to

http://www.kingstonconsulting.com/

Jerry

P.S. Yes, excellent commentary by RH. And thanks, Rick, for posting it here!

-- Anonymous, February 07, 1999


I would point out this post confirms what Paul Milne and I were at pains to argue here and on the Yourdon thread about a month ago:

"For example, if a company has 20 systems and 10 of them did not require any remediation, they would report 50 percent complete."

I will say we were lambasted a bit, as I recall (!), though I don't say that in a spirit of complaint, seriously. I remain as dismayed today as I was a month ago by the NERC "methodology." I am sorry to say that this was the proximate cause of my own realization that Y2K compliance percentages, in general, are mainly bogus.

The bogus figures don't establish (indeed, they cannot because they are bogus) that the utility industry will fail, but they provide no ground for optimism.

The fact that the mass media and the government continually cite these NERC figures and similar unaudited "we feel good in the industry" statements as "evidence" of success is quite gross ... to be blunt about it.

-- Anonymous, February 07, 1999


Responded to this over on the EY board, but since it's being propagated here as well:

The numbers are not "bogus"; the analysis is. The example of including already compliant systems in the percent complete directly contradicts the explicit instructions from NERC:

http://www.nerc.com/~y2k/assessment.html

"A standard method for determining per cent complete follows. It is preferred that your responses be geared to % work done compared to total amount of work to be done. This would account for various activities having different amounts of effort. For example: if your inventory shows 100 devices with possible Y2K problems, and your assessment shows that only 2 have Y2K problems and one device has been replaced with a Y2K ready device and the other still needs remediation, the per cent complete to report would be Inventory 100%, Assessment 100%, Remediation and Testing 50%."

AG

-- Anonymous, February 08, 1999


Well, there we have it--an apparent contradiction within NERC's instructions. The spreadsheet Mr. Hawkins mentions is at: ftp://ftp.nerc.com/pub/sys/all_updl/docs/y2k/august1998.xls.

A better instruction would be to mark "Non-Applicable" in the Remediation/Testing column, instead of "100%". Then it would not be averaged with the bona-fide percentages. Of course, the averaging method is itself faulty, because no weight is given to the amount of work in each of the 15 areas listed.
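
A minimal sketch of that weighting problem, using hypothetical device counts and percentages per area:

    # An unweighted mean across areas treats a small, finished area the same as
    # a large, barely started one. Counts and percentages are hypothetical.
    areas = [
        # (devices needing work in the area, percent complete in the area)
        (5,   100),
        (5,   100),
        (490, 10),
    ]

    unweighted = sum(pct for _, pct in areas) / len(areas)
    weighted   = sum(n * pct for n, pct in areas) / sum(n for n, _ in areas)

    print(f"Unweighted average of the 3 areas: {unweighted:.0f}%")   # 70%
    print(f"Weighted by amount of work:        {weighted:.0f}%")     # ~12%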

The effect of the zero-remediation areas on the total data may not be too great, however. This instruction only applies if there are *no* Y2K-faulty devices in any of the 15 categories of equipment listed. How often would that happen?

On the other hand, the instruction leaves an opportunity for misinterpretation, in spite of the other instruction cited by AG. Hence Mr. Hawkins' 20-system example. How do we know how these spreadsheets were actually interpreted?

-- Anonymous, February 08, 1999



I'm sorry, I just don't see how the instructions could be misinterpreted.

The spreadsheet says to state a percentage of the total amount of *work* to do. Nowhere does it say to use the total number of systems.

Combined with the actual instructions, with the specific example given, I just don't see any misinterpretation.

If the argument were based solely on the few (any?) participants that had *no* remediation work to do skewing the averages, that would be one thing. But that was *not* the basis of the analysis; it went into great detail describing a scenario that directly contradicts the NERC instructions.

AG

-- Anonymous, February 08, 1999


Rick -- since you reposted this analysis here, would you comment on the points made by AG above? I am going to make some time to go look at the NERC data myself, which I have in fact been taking at second hand.

I stand by my arguments about the fallacious nature of compliance percentages generally, based on 25 years of IT experience with "metrics", but I posted this on Yourdon last night and want to correct it if AG is himself correct.

-- Anonymous, February 08, 1999


For additional evidence of the flawed nature of the averaging methodology used in the NERC report, see Gary North's post:

http://www.garynorth.com/y2k/detail_.cfm/3825

-- Anonymous, February 09, 1999

