Gartner report


Very informative (though not very encouraging) article on "current global status" by the Gartner Group to the US Senate subcommittee at:

http://gartner5.gartnerweb.com/public/static/aboutgg/pressrel/testimony1098.html#top

-- Anonymous, October 12, 1998

Answers

Response to GartnerGroup report

On page 10 of the report, they state "Embedded systems will have limited effect on Year 2000 problems, and we will see a minimal number of failures from these devices. Only 1 in 100,000 free-standing microcontroller chips are likely to fail due to Year 2000."

Perhaps Rick would like to comment?

-- Anonymous, October 13, 1998


Mike, I don't know about Rick, but I'd like to comment. First, I would like to know how the Gartner Group defines "freestanding" microcontroller chips. If we do the math, then of the estimated 25 billion microprocessors which will be in operation on 1/1/2000, Gartner is saying only 250,000 will fail. That's .00001 of the total. Every other report from those businesses directly involved in the chip remediation process has said that they are encountering (overall average) a 3% failure rate. Three percent of 25 billion is 7.5 million chips - a far cry from 250,000. Not to mention that in some specific industries the percentage of problem chips has been found to be as great as 20%. Also, since the majority of respondents in the Gartner poll have not even gotten to the point of checking their embedded systems, I am puzzled about how Gartner was able to make this prediction in the first place. The poll respondents certainly couldn't have delineated a specific percentage of chip failures, since so few of them have addressed the problem in the first place! There is no factual substantiation given for the 1-in-100,000 statement, and unless Gartner has a crystal ball that can look into the programs of all the embedded chips and tell whether they're good or bad, the number they have come up with contradicts what has been reported by other "in the trenches" remediators. Until I discover the factual basis upon which Gartner made this prediction, I'll stick with the 3% number which has been documented by those who have actually pulled the darn chips for testing.

Of course, I suppose it doesn't matter all that much anyway, since Gartner predicted 50% of companies in Chemical Processing, Transportation, POWER, Natural Gas, Water, Oil, Law Practices, Medical Practices, Construction, Pulp & Paper, Ocean Shipping, Hospitality, Broadcast News, Television and Law Enforcement would have AT LEAST ONE MISSION CRITICAL SYSTEM FAILURE. I don't even want to go into the areas where they predicted a 66% chance of a mission critical failure. It's gotten to the point of "What do you want first? The bad news, or the REALLY bad news?"

Prepare for the worst. Hope for the best. Just don't hold your breath waiting for the good news.

-- Anonymous, October 13, 1998


Response to GartnerGroup Report

Bonnie, thanks for the reply! Just out of curiosity, where did you find the supporting statistical information used in your reply? Remember Ronald Reagan's words: "Trust, but verify."

-- Anonymous, October 13, 1998

I congratulate the Gartner Group for their outstanding contribution to the world's understanding of the Y2K problems via the presentation by Lou Marcoccio to the US Senate on October 7, 1998.

I, like many, am very grateful for this fine research.

There is one aspect of the presentation that I do not understand, and I suspect there are many who would also appreciate some clarification. So, if anyone has thoughts, please share them. My confusion pertains to the relationship in the report between "COMPARE" level (specifically, percent completion of Level IV) on the one hand and the likelihood of a "failure" on the other. This relationship is critical to much of the report's conclusions about failure potentials and infrastructure interruptions.

To me these relationships seem wildly optimistic. What am I missing? For example, a key fact to me is the conclusion on page 15 that half of large companies in Category I countries will be 80% complete with Phase IV by 2000. (Where Phase IV means "complete compliance of 80% of critical items," and Category I countries are the twelve countries making the best progress toward Y2K compliance.)

Gartner concludes that these Category I countries will see only isolated and minor disruptions in most aspects of the economy and society.

My confusion is this: how can large numbers of huge firms with less than 80% of mission critical systems compliant be expected to run with only occasional and isolated failures? In organizations that I am familiar with, an outage of more than 20% of critical systems would seem to me to preclude running at all. Why doesn't this projection about the best-prepared group of companies in the world suggest a much more sobering and worrisome outlook?

-- Anonymous, October 14, 1998


Bonnie, thank you for the response to the Gartner Group report. I read it last night and could not sleep thinking about their misleading report on embedded systems. I too would like to know how they define the term "freestanding" microcontroller chips, and how they can say, "In the U.S., we predict that general infrastructure, power, non-wireless telephones, and critical services will continue mostly uninterrupted, with potential for relatively minor problems and some inconveniences." As for "trust," I don't totally trust the Gartner Group. As for "verify," I think you're well on the way!

-- Anonymous, October 14, 1998


RE: 1-4% Chip Rate

Bonnie & Mike, I commonly see estimates of 1-4% of chips needing remediation. A recent example was in a discussion hosted by Ed Yardeni, which can be found at: www.yardeni.com/hall1.pdf. All seven pages are interesting reading. Below is a key exchange from page 2.

E. Yardeni: Well, how about this 1 to 4 percent failure rate number? Is that something that you folks at your organization have seen to be the case? Is it higher, lower?

D. Hall: That's about a good average. We're running anywhere from 1 to 4. I've run into a few health care organizations and a few manufacturing plants that have possibly up to 10 percent, but those are the extremely integrated, automated type systems. So, the more you use PCs, basically, for running your systems, the more problems you run into.

-- Anonymous, October 14, 1998


Mike (Roman), I admire your prudent skepticism, as I have quite a bit of it myself. That's why I have been researching the Year 2000 problem for months, and either printing out hard copies of pertinent articles, testimonies, and reports, or copying them to Word documents. Initially, I bookmarked the site links but discovered that most links are not open for very long, thus the copies. The statistical info I used was a mid-line extrapolation of all the statistics on the subject that I have run across. As you'll see from the excerpts below, there are both higher and lower estimates of the number of embedded systems and of the percentage which have problems. One consulting firm may find a 40% chip failure problem, while another may find less than 1%. All of these reports may very well be accurate, as the number of "bad" chips depends on the application they are being used in, where they were originally purchased from, how old they are, and whether they are "generic" chips or custom-programmed ones. Like Pat Harmon, I am also grateful for the Gartner Group's reports, because data from any source widens the scope of the whole picture, and it's that "whole picture" which I am interested in. I can't post all the sources and information, because this will be very long as it is, and also because of the time constraints upon me in having to search through the hundreds of pages for specifics. However, I hope the following are representative examples:

From the Giga Information Group in Cambridge Mass., as reported in a Money.com article: "The good news is that probably only 5% of these embedded chips are date sensitive and thus subject to Y2K failures. The bad news: There will be an estimated 25 billion chips at work on the planet on Jan.1, 2000."

From a report commissioned by The Health and Safety Executive as reported on BBC June, 1998: "..there is a 10- 15% chance of embedded chips in safety systems in engineering processes failing in year 2000 unless action is taken to rectify the problem."

From PCWeek online, July, 1998: "Finding and fixing the date and time stamped into embedded systems is like finding a needle in a haystack, according to Martha Daniel, president and CEO of Information Management Resources Inc. in Costa Mesa, Ca. Last year, 7 billion microchips were shipped in the United States. Of those, 10% are unable to recognize the year 2000 date, Daniel says."

From the Los Angeles Times, Aug. 1998: Eric Trapp, head of the year 2000 program for Southern California Edison, said from 25 to 40 engineers worked for four months to pare down a list of 190,000 devices at the San Onofre Nuclear Generating Station to 32,000 electronic components and finally to 450 items that had some potential date connection. It will take the company another year to analyze those devices and fix the ones that will fail in the year 2000. . . .

The actual number of failing devices, according to many year 2000 consultants, is small--perhaps no more than 3% to 5%. The reason is that the vast majority of chips exist in a sort of timeless state where the particular date has no meaning to their tasks. . . . But even in chips that seem to have no outward connection with time, there can be hidden problems, since generic chips often include a time function whether it is used or not.

"Computers are like anything else," said Jerry Kilpatrick, project manager of the Central Illinois Manufacturing Extension Center at Bradley University. "They have accessories that not everyone needs. If one of the accessories is a real-time clock, the programmer may never use it, but its still there ticking away."

Even a tiny rate of failure can have an impact, given that there are an estimated 35 billion to 40 billion logic chips floating around. The high level of integration between devices in modern plants and factories opens the possibility of cascading failures, where one faulty device starts a chain reaction by feeding confusing information to the next device. . . .

From the testimony of the Seattle-Tacoma Airport representative at the Senate Transportation Hearings: "..identified 115 embedded systems..For about a third of the systems, the equipment vendor claims to have a problem. For another third, the vendor says there is no problem, and for the final third, the vendor doesn't know or hasn't told the airport."

From the Computer Weekly News (Oct. 2): Anthony Parrish, director-general of the Federation of Electronic Industries, said, "For every 1,000 embedded chips you look at, you'll find two or three that need correction. The problem is finding those two or three that are not compliant."

From Peter de Jager's site, by Professor Leon Kappelman: "What can be shared is on the SIM Year 2000 Working Group's online conference website. Most of the embedded systems I am personally aware of are in oil & gas, petrochemical, nuclear, communications, and power generation.. non-disclosures preclude my saying anything more specific than this. The significant problem rate is about 5%, although sometimes as high as 50%. Many of the vendors are not able to offer much information, and tests are usually required."

From "The Year 2000 Crisis: An Enormous Challenge That Must Be Addressed", Strategic Analysis Report From the Gartner Group: (Bonnies note- I thought youd get a kick out of another Gartner Report) "More than 50 million embedded-systems devices worldwide will exhibit year 2000 date anomalies in 1999."

From the Chicago Tribune (March 2): "Industry experts predict that of the 25 billion chips installed in electronic components, only about 2% will fail...but they don't know which 2%."

From the National Radio Astronomy Observatory: "At a special briefing for Capitol Hill staffers set up by ITAA and the House & Senate IT Congressional Working Groups, Cara Corporation Embedded Systems Specialist David C. Hall stated that there are over 40 billion microprocessors worldwide, and anywhere from one to ten percent may be impacted by the date change."

From Kevin Lister, on year2000-discuss@year2000.com: "My organization runs a safety critical operation that has tasked an army of people to assess compliancy status of embedded systems. We have so far had a non-compliance level of approximately 3%."

From the U.K. bug2000 site: "From interviews with the top logic chip manufacturers, Harden estimates that approximately 5 billion of the 70 billion chips produced since 1972 are subject to Y2K problems. Andrew Bochman of the Aberdeen Group thinks the problem rate is much higher than Harden puts it: 'From what my clients are saying, I'm looking at a trouble rate of about 20% of all devices containing embedded systems.' And Aberdeen's (a Boston-based IT company) manufacturing clients report spending three to four times as much on their embedded systems remediation efforts compared with their computer systems."

From the U.S. Coast Guard as reported in Datamation: "In a recent survey, the U.S. Coast Guard surveyed marine manufacturers and discovered 20 percent of the embedded chips tested were non-Y2K compliant."

From "Investigating Embedded Systems for Y2K Compliance" by Robert J. Hammell, July, 1998: "Just because a vendor states that their equipment is millennium-compliant does not make it so. A conservative estimate from our consultants is that 40% of stated compliant equipment is not compliant. Embedded systems rates are much higher. Establish a minimum of an isolation test to check the veracity of all vendors."

From Robin Guenier: "Smith Kline Beecham bought two machines for monitoring and recording the performance of drug production. When they tested one, it handled January 2000 very well, and they were very happy. But when they tested the other (same machine, identical chips) it didn't. The scary explanation for the anomaly, when the firm checked serial numbers with the manufacturer, was that the chips had come from different makers, one of whom had made them year 2000 compliant, while the other hadn't. Documentation down to this level of detail is often not specified in the world of embedded systems. And these were machines that had been made last year."

From Gary North: "If the percentage of noncompliant chips is the same as the percentage of noncompliant lines of COBOL programs, then between 1% and 5% of the chips are noncompliant."

From the World Bank Group: "A specific and important area of concern as the Year 2000 draws nearer is the question of embedded computer chips. 25 billion of them are in use.."

-- Anonymous, October 14, 1998


With all due certainty, we can debate the percentage of non-compliant chips to the same degree that we debate the number of stars in the sky. Neither the number nor the percentage is known.

What needs to be known is the number of non-compliant chips within each operating system that is "mission-critical", especially within the realm of power generation and distribution. Therefore testing of devices must be pursued with more than all due diligence.

If the system is vulnerable to the anomaly presented by one faulty chip, it makes little difference whether 1 chip in a thousand is found to have that fault or 20 in a thousand, except maybe in the replacement cost factor. I personally would not sleep better at night knowing engineers were looking through 'haystacks for one needle instead of 20.'

What I do hope is going on is a lot of sharing of information between electric companies about what is found and the corrective measures that are used. We also need to know that systems are being corrected sequentially, so that some generating units function on 1/1/2000.

It is an age of deregulation in many parts of the country, but the date change cannot be used as a competitive issue in the electrical supply market. Hopefully we are past that by now, but I urge all to be in contact with your local public service commissions, urging them to 'stay on top of this.' Mr. Cowles is right about the need for pressure on local utilities. It may very well be the threat of a 'thousand cuts' that prompts power companies to quicken the pace of Y2K readiness.

-- Anonymous, October 15, 1998


Bonnie, those are good thoughts on the Gartner Report. However, you need to be aware of a math error in your initial response, which -- as it turns out -- makes your argument even more compelling. Three percent of 25 billion is 750 million, not 7.5 million. I don't know how many of the estimated 25 billion chips have been identified and tested as compliant, but from what I keep reading the percentage appears to be small. If only half of the estimated 25 billion have been checked thus far, that means that in the next 442 days, another 12.5 billion chips must be identified -- and if critical -- tested, replaced and retested. That works out to be approximately 28.3 million chips per day. Of course not all chips are in critical systems. I am sure once you subtract those chips in microwave ovens, toys, answering machines, etc., the 25 billion figure would be substantially reduced. Still, there seems to be a consensus that we simply cannot get to all the chips before Y2K. I am curious as to how many chips will have to be replaced before and, particularly, after failure when and if power goes down affecting manufacturing, distribution, transportation, etc. Which brings me to my question: Is anyone aware of any estimates of the amount of chips in "critical" systems throughout the US?
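Spelling the arithmetic out, in case anyone wants to plug in their own estimates (a sketch only -- the 25 billion total, the half-checked guess, and the 442 days are the rough figures from this thread, not hard data):

    #include <stdio.h>

    int main(void)
    {
        double total_chips = 25e9;              /* estimated chips in service on 1/1/2000 */
        double unchecked   = total_chips / 2.0; /* if only half have been checked so far  */
        double days_left   = 442.0;             /* mid-October 1998 to 1/1/2000           */

        /* The corrected figure: 3% of 25 billion is 750 million, not 7.5 million. */
        printf("3%% of 25 billion = %.0f million chips\n", total_chips * 0.03 / 1e6);

        /* The pace required to get through the remaining chips in time. */
        printf("Required pace = %.1f million chips/day\n", unchecked / days_left / 1e6);

        return 0;
    }

Compiled and run, it prints 750 million chips and roughly 28.3 million chips per day.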

-- Anonymous, October 15, 1998

PS: I know we can define critical a hundred different ways, but for purposes of my questions, let's define it as a system that would cause power generation to go down, or water, gas, waste treatment, telecommunication etc.

-- Anonymous, October 15, 1998


Ralph,

My understanding is that one in every five embedded chips is found in 'mission critical' systems. So taking your number of 25 billion, one-fifth, or 5 billion, would have to be examined. However, I believe the consensus on embedded chips currently in use is in the 40 billion range. That would translate into 8 billion chips serving in a critical capacity.

Since 'mission critical' systems are receiving the brunt of remediation attention at this juncture, one would hope time is not being spent checking the other 32 billion chips for compliance. One would hope, anyway.

-- Anonymous, October 15, 1998


Ralph (Perales), thanks for pointing out my computing error. That's what I get for not using a calculator! (grin)

As to your question about critical systems, the only news articles I've run across which addressed any specifics in that area both applied to power companies.

"Brad Pence of the Omaha Public Power District (OPPD) has identified 2300 suspected systems that might have Y2K-related problems. He then states that 10% of them are 'show stoppers'. Show stoppers means you shut down the plant. The power stops, period."

Washington Water Power, a Spokane, Wash.-based electric and gas utility, tested 540,000 embedded components and found only 1,800 that contained year 2000 date dependencies, said year 2000 communications liaison Jay Hopkins. "Of those, only 234 have needed to be remediated," he said.

Hopkins said the Bonneville Power Administration, which controls about 80% of generating capacity in the Pacific Northwest, is "finding the same thing." But Hopkins conceded that there may be more year 2000 problems lurking in nuclear generating plants than in the older, relatively simple hydroelectric plants his utility has checked. That's partly because many hydro plants are old enough that they don't rely heavily on computers at all.

So Ralph, if my math is correct this time, then 234 fixes out of 1,800 date problems is 13%. Since the article states they found those 1,800 "date dependencies" but only "have needed" to remediate the 234, I'd say it might be assumed that those 234 were "critical". This percentage is in line with the 10% of show-stoppers at the OPPD.

I do agree with Charles (Register), however, that whether it's one or 20 per thousand, they need to be found and fixed. I would take issue with his suggestion that the replacement cost factor is the only difference, though. The time necessary to do those replacements (ordering parts, installing them, and testing them once they're in) is also an integral part of remediation. Thus the number of replacements needed (the number of bad chips) does affect the overall process, and is pertinent in trying to determine if the job can be done in the fewer than 300 actual working days left before 1/1/2000.

-- Anonymous, October 15, 1998


Charles,

With all due certainty, I can tell you that prior to roughly June 1996, >95% of all IBM PC-based embedded systems from all manufacturers (and there are lots of them) have the Y2K bug in hardware. The IBM PC/AT real-time clock (RTC) does not automatically increment the century value. The century stays at "19" and the year rolls over to "00". The really sad part is that the century bytes are stored in the RTC CMOS RAM; they're just not automatically updated! Go figure.

Anyhow, not many (any?) of the embedded/realtime OS manufacturers were aware of the bug and hence didn't have it fixed. I personally was the first to notify one realtime OS vendor that their RTC utility miscalculated the year and rolled it forward to "2106" (yes, "2106") after Jan 1, 2000. They immediately fixed it, but the point is that pretty much 100% of all PC-based embedded systems installed before 6/1996 have the potential for failure.
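For the technically inclined, here is a minimal C simulation of the mechanics (the CMOS values below are hard-coded stand-ins -- on real hardware they would be read from the RTC through ports 0x70/0x71, which needs privileged I/O):

    #include <stdio.h>

    /* Convert a BCD byte (e.g. 0x99) to its decimal value (99). */
    static int bcd_to_int(unsigned char bcd)
    {
        return (bcd >> 4) * 10 + (bcd & 0x0F);
    }

    int main(void)
    {
        /* Simulated CMOS contents just after midnight, Jan 1, 2000: the
         * year register (0x09) has rolled 0x99 -> 0x00, but the century
         * byte (conventionally register 0x32) is still 0x19, because
         * nothing in the RTC hardware ever updates it. */
        unsigned char cmos_year    = 0x00; /* auto-incremented by the chip */
        unsigned char cmos_century = 0x19; /* NOT incremented by the chip  */

        /* Naive software that trusts both bytes computes the year as 1900. */
        int year = bcd_to_int(cmos_century) * 100 + bcd_to_int(cmos_year);
        printf("RTC reports the year as %d\n", year);
        return 0;
    }

Run it and it prints 1900: the chip rolled the year, but nobody rolled the century.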

--AJ

-- Anonymous, October 15, 1998


P.S. That's why control systems that use a lot of IBM PC/AT compatibles, for both desktop front ends and/or for embedded systems, have a much higher incidence of failure (20%-50%) than the more global figure (1%-5%).

If you are involved in the assessment of systems, I would treat any IBM PC/AT-based control system as suspect until proven innocent.

If the system is sophisticated enough to require the horsepower of a PC, then it's more likely to be using time/date. Also, just because the box isn't importing or exporting time/date doesn't mean it won't fail. Never underestimate the ability of a programmer to make something complex and wonderful.

A final note: it is crucial in testing PC-based control systems to power the system down before midnight 12/31/1999 and then power it back up after the rollover. There is nothing wrong with the PC _software_ clock; it will happily roll over to 1/1/2000 just fine. But the hardware clock has rolled over to 1900, and the two are _not_ automatically synchronized by the OS.

What does all this mean? It means that any system based on IBM PC-compatible technology that has not been remediated with the _correct_ tests has the potential for failure. But the kicker is this: it won't fail at the 1/1/2000 rollover, it will fail the next time it is rebooted, which typically will be on the next power cycle of the system. So imagine a power plant that relies on one or more IBM PC-compatible control systems. If those systems were not tested with the proper power-cycle sequence test, then they won't fail at midnight 12/31/1999; they will fail on the next power cycle. This may of course happen at the same time, but it may not. It may not happen until a week later, or a month, or a year, etc.
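Here is a toy illustration of that failure mode, using the same simulated RTC as the sketch in my earlier post (a real OS reseeds its software clock from the hardware RTC in its boot code):

    #include <stdio.h>

    /* Simulated hardware RTC read: century byte frozen at 19, year rolled
     * over to 00, so a naive read yields 1900 (see the earlier sketch). */
    static int rtc_read_year(void)
    {
        return 1900;
    }

    int main(void)
    {
        /* While the box stays up, the OS just ticks its own software
         * clock, so the rollover itself looks perfectly fine... */
        int software_clock_year = 1999;
        software_clock_year++; /* midnight, 12/31/1999 */
        printf("Still running:     %d\n", software_clock_year); /* 2000 */

        /* ...the fault only surfaces at the next power cycle, when the
         * software clock is reseeded from the hardware RTC. */
        software_clock_year = rtc_read_year();
        printf("After power cycle: %d\n", software_clock_year); /* 1900 */
        return 0;
    }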

More and more PLC manufacturers are building PC-based systems. I sure hope they are aware of this problem.

--AJ

-- Anonymous, October 15, 1998


Andrew:

I defer to your judgment regarding IBM PC-based systems. I claim no particular monopoly of knowledge in this area. My question, however, is: are these IBM PC-based systems used in the electrical industry, and if so, where? It is my understanding that embedded systems are not frequently updated within the electrical industry, so I'm curious to know what you think the percentage of pre-1996 IBM PC-based systems might be. Thank you.

-- Anonymous, October 15, 1998



Please forgive me, Andrew. I did not see your second post prior to my most recent response. My apologies to everyone.

-- Anonymous, October 15, 1998

No harm, no foul. Yes, PC-based systems are used _everywhere_. Every time you use your Visa card it goes through a network of IBM PCs running the QNX RTOS. A lot of telephone equipment is based on IBM PC embedded systems.

For a small sampling of where PC based embedded systems are used check here: http://www.qnx.com/realworld/index.html

and here: http://www.qnx.com/company/compover.html#Customers

Eye opening?

Notice the reference to Atomic Energy Canada.

Remember, this is just _one_ OS vendor. There are also VxWorks, VRTX, pSOS, iRMX, uCOS, DOS(!), and many others, and the bulk of embedded OSes are custom proprietary kernels that are not commercially supported.

Yes, they are widely used in the electric industry. Remember, they don't look like a PC; they look like any other embedded board with chips on it. (Take a look at www.ampro.com, click on "products", and then click on one of the "CoreModule" products. That's a full-blown PC-compatible CPU board that measures 3"x3".) You can stick them all over the place, and people do. Again, that's just one vendor.

A friend of mine is a program manager at Maritime Nuclear up in Canada. They use PCs in the monitoring of the CANDU reactors (the control is mostly analog). They have investigated every PC in all power plants, and they have been remediated. He is pretty comfortable that the Canadian CANDU reactor plants will be OK. He is not so comfortable with the SCADA system and the EMS. He is also not so confident about the US nuclear plants. But maybe that's just good ol' Canadian engineering prejudice. ;-)

So between the Canadian nuclear plants and the Canadian hydro plants, Canada's looking like a nice cozy place to be...

--AJ

-- Anonymous, October 15, 1998


A.J. - Canada may not be such a cozy place to be after all. Check out this article that hit the web yesterday. I certainly hope there aren't many Y2K electric utility problems in Canada, because it appears they have enough problems without Y2K.

October 14, 1998

Alberta consumers told to save power to avoid blackouts

CALGARY (CP) -- As Prairie provinces scramble to find more electricity, the power industry in energy-rich Alberta told homeowners Wednesday to conserve to avoid blackouts this winter. Albertans are being urged to avoid using major appliances between 4 p.m. and 8 p.m., including block and water heaters, washers, dryers, dishwashers -- even Christmas lights.

"I don't want to be the Grinch who calls off Christmas," said David Lewin of EPCOR, which distributes power in Edmonton. "But if lighting could be switched on at eight o'clock in the evening that would help greatly," Lewin added.

Senior representatives of Alberta's electricity industry held a news conference Wednesday to forecast electricity shortages this winter that will cause blackouts lasting up to two hours. Power is also in short supply in both neighbouring Saskatchewan and Manitoba. Saskatchewan, which buys the majority of its imported power from Manitoba, is looking to purchase energy from the United States this winter. And Manitoba Hydro says unless it gets rain soon to fill up its hydro-electric dams it may need to seek alternative supplies.

The shortage in Alberta has been blamed on the province's strong economy and increasing population. "The situation is difficult, but we believe it's manageable," said Dale McMaster, chief operations officer of the Power Pool of Alberta. "We have contingency plans in place and we believe this is a manageable situation for the winter," he added.

One contingency is to buy more power from British Columbia, raising the daily purchase from 600 megawatts to 800 megawatts, McMaster said. But B.C. Hydro claims it's impossible to transmit more power to Alberta because the system between the two provinces is already saturated. And it would take years to erect new transmission towers. "It involves having to build additional capacity on that line, which means it's not a short-term solution," said Stephen Bruyneel of B.C. Hydro. "I don't think it's a B.C. problem," said Rob Pellatt, secretary of the B.C. Utilities Commission. "It's an Alberta problem in terms of getting the power into Alberta."

To curb blackouts, industrial companies that gobble large amounts of electricity are being offered rate advantages to go off the system at peak periods, Lewin said.

Alberta's extreme situation seems strange to industry analysts. "To say that you're going to have blackouts is surprising to me," said Mark Jaccard, a Simon Fraser University economics professor. "There are ways of shutting down certain major customers in order to balance supply and demand," he said.

Even with conservation and more imported power this winter, Alberta's power industry can't promise the province won't suffer blackouts beyond this winter. "There's no guarantees there," said Jim Beckett of Alberta Power.

-- Anonymous, October 15, 1998


At the risk of extending this thread, I would like to go back to the original topic and try to link all these discussions together.

In any type of research, continuity is paramount, whether it comes through research thoroughness or sufficient documentation from which to draw a studied conclusion. To the best of their ability, I believe the Gartner Group was thorough in its reporting at this stage.

Ms. Camp pointed out her concerns over the embedded chip issue by stating that assessment in that area by surveyed companies was not far enough along to warrant the "one in a hundred thousand" failure rate in freestanding chips. I believe that's fair. I'll go on record, with my limited knowledge of the power industry, and reject this figure out of hand as it applies to the electrical industry. I don't think 'freestanding' chips play much of a role in power generation, transmission or distribution. (Feel free to correct me, though.)

But to borrow a Paul Harvey phrase, "Here's a strange."

Lou Marcoccio, the orchestrator of the Gartner Report, testified before Sen. Bennett's Year 2000 committee on June 12, 1998 regarding his view of the progress of utilities. At the time, Marcoccio was consulting for large IOUs such as Duke Energy, Washington Gas and Baltimore Power & Gas. His assessment of the power industry as a whole was quite vitriolic. In his conclusion, he said:

"There are pockets of successes that exist within the industry. But, when I take a pragmatic look at the information I have seen, things I have experienced and people I have talked to in this Industry I can only conclude that the readiness of the Utility Industry in not acceptable."

It is evident from his testimony,

http://www.senate.gov/%7Ey2k/index.html (look under Committee Priorities, then Utilities),

that he has a command of the difficulties utilities face in remediation. In a later letter to the committee, he indicated that he had spent the last two years consulting for utilities.

However, in the span of five months he authors a statement that includes the prediction that there will be minor and isolated disturbances in electricity within the United States.

I offer no way to reconcile these statements. I see no advantage for him or his company to now downplay the impact Y2K will have on domestic utilities, since they are also in the remediation business.

Finally, and forgive me, I am still having problems reconciling the numbers in reference to Mr. Edgar's statements regarding the 95% failure rate of pre-1996 IBM PC chips with the current failure statistics offered by Ms. Camp. I do not doubt either post, but if Mr. Edgar's figure is accurate and the IBM PC chips are used throughout the power industry, would that not account for a higher rate of failure than what Ms. Camp's post suggests?

I think it would be informative if Mr. Edgar could find out from his Canadian friend working at the nuclear facility just how many IBM PC chips were replaced in relation to the total number of mission critical systems addressed during their Y2K upgrade. Thank you for your time.

-- Anonymous, October 16, 1998


Thanks for the summary, Mr. Register.

My colleague is going to try and find out from his coworker what the actual numbers were (how many PC based systems, how many had time/date dependencies, and how many required fixing in order to prevent a failure).

At this point in time I can tell you that there definitely were systems that had to be fixed. I just don't know what the numbers were. He also confirmed my rule of thumb: they are very suspicious of any PC system installed prior to 1997.

As far as mission critical is concerned, it really depends on the interpretation of the regulations. None of the control systems in the actual nuclear power plants have PCs in the loop. They're all based on "ancient" Varian equipment. However, if the monitoring system goes down because of a Y2K problem in a PC, do you keep the plant running or not?

--AJ

-- Anonymous, October 16, 1998


Thank you, Mr. Edgar. I appreciate your expertise and input in this matter. I, for one, am just a concerned civilian trying to put this all together.

In reference to my previous statement regarding Mr. Marcoccio's apparently changing testimony, Rick Cowles has a post in the most recent Gartner thread that might shed some light on the issue, for those who might be interested.

-- Anonymous, October 16, 1998

