Embedded System Problems - Statistically Insignificant??

greenspun.com : LUSENET : TimeBomb 2000 (Y2000) : One Thread

Let's see. Lately, we've had results from both electric utility companies and oil and gas firms indicating that the embedded system problem, while real, is much less severe than anticipated. Now we have this quote from GIGA:


Some parts of the Y2K problem, such as its effect on embedded microprocessors--critical components that control automated processes in power plants and hospital equipment--have turned out to be far less pervasive than previously believed.

Giga Information Group was one of many companies that sounded the alarm, proposing that embedded microprocessors could be the most serious aspect of the Y2K problem because of their widespread use in critical industries.

But in January, the company released a new report, titled "It May Rain, but the Sky Won't Fall," stating that problems with embedded microprocessors "will not have the crippling effect as originally thought."

Alistair Stewart, a senior advisor for Giga on embedded systems, said that only about 3% of chips have been found to have minor problems, typically requiring resetting the date or restarting a device. The percentage of chips that experience outright failures is "so small as to be statistically insignificant," he said.

"There won't be a systemic shutdown," Stewart said. "You will have some localized inconveniences with some localized failures."

-- Hoffmeister (hoff_meister@my-dejanews.com), March 12, 1999


Hmmm...that's not what the ACS says.

American Chemical Society says Chemical Industries Underestimated Embedded Chip Problem

From: www.sciencedaily.com Source: American Chemical Society www.acs.org 1-22-99

Chemical industry efforts to keep the so-called Y2K computer problem from shutting down safety controls may be further behind than previously thought -- particularly at smaller chemical companies around the nation -- according to a report in this week's issue of Chemical & Engineering News, newsmagazine of the world's largest scientific society, the American Chemical Society.

The report quotes Gerald V. Poje, board member of the Chemical Safety and Hazard Investigation Board (CSHIB), who notes that "one to three percent of some 50 billion embedded chips worldwide will be subject to Y2K problems and some 25 million mission critical systems may have problems." At chemical plants, such chips automate devices, including the control pumps and valves that prevent spills and other hazardous accidents from occurring.

Y2K consultants report that small and medium-sized chemical companies are at greatest risk, depending on their ability to spend enough time and resources to address the problem. Angela E. Summers, director of Premier Consulting and Engineering in LaMarque, Texas, says that a system using embedded chips could "fail dangerously." A safety system might not respond adequately, or it could "fail safely," she says, resulting in a costly shutdown and startup, but without incident.

Experts have found that "even chemical companies that have actively addressed the Y2K problem may have underestimated its depth," according to the article in C&EN. Consultants hired by Occidental Chemical found "10 times more systems with potential Y2K problems than the company's own engineers had found."

At a recent CSHIB meeting in Washington, D.C., more than 50 experts from around the U.S. discussed possible solutions for the chemical industry's Y2K problem. One option that was discussed was to temporarily shut down computer systems at midnight on December 31, 1999, and then restart them later, hoping systems would come back online without incident. Views of the CSHIB will be included in a report to a special Senate Y2K committee later this month.

The Environmental Protection Agency (EPA) has been given "no special authority to encourage companies to make Y2K investigations," according to Don Flattery, the EPA's Y2K project team coordinator. The agency has created a "tool kit" which provides advice to chemical companies and examples of other companies' approaches to the problem. "The CSHIB panel's consensus was that the country's focus should be on helping the smaller companies," but the help they get is more likely to come from the chemical industry rather than the U.S. government, according to the C&EN article.

-- a (a@a.a), March 12, 1999.

Here's another for your critical reading enjoyment Hoff:

February 13, 1999, 08:10 p.m.
Y2K petrochemical warnings sounded
Houston-area plants race computer-driven clock to prevent disaster
By BILL DAWSON
Copyright 1999 Houston Chronicle Environment Writer

As the nation's petrochemical capital, Houston faces a unique array of potential problems, ranging from the catastrophic to the merely troublesome, because oil and chemical plants are controlled with thousands of computer chips that may be vulnerable to the much-publicized Year 2000 bug.

Industry officials are racing the clock to identify and correct plant systems containing date-sensitive chips that won't read 2000 properly. At the same time, companies are reviewing and refining their contingency plans in case they don't find all the problem chips and the computer glitch causes an emergency.

With a flood of recent reports on the Y2K bug's threat in other computerized areas of modern life, the additional specter of fires, explosions and toxic clouds at petrochemical plants might seem like premillennial jitters or technophobia. In this case, however, the warnings are coming from people and groups more noted for their expertise in the industry's complex workings than for any tendency toward doomsaying, and who are taking care to distinguish their concern from alarm.

"It's not a hoax," said Ray Skinner, area director of the U.S. Occupational Safety and Health Administration's Houston South office. "It's a real issue and something that's very, very important."

Link and the rest of the story at: http://www.chron.com/content/story.html/page1/195526

-- a (a@a.a), March 12, 1999.

" The report quotes Gerald V. Poje, board member of the Chemical Safety and Hazard Investigation Board (CSHIB), who notes that "one to three percent of some 50 billion embedded chips worldwide will be subject to Y2K problems and some 25 million mission critical systems may have problems.""

This guy is obviously quoting the Gartner Group. So, you have a quote of a guy quoting somebody else without saying who he is quoting.

Where I work we have 2 Chem. labs with lots of equipment. We checked it all. Embedded system problems? None. Zero. Zilch. Not even 1%. Software problems? Yes. Solution: upgrade software.

-- Buddy (buddydc@go.com), March 12, 1999.

I can't cite a source at the moment, but I have read -- and it seems reasonable -- that the percentage of embedded chips subject to the Y2K problem (which was always pretty small anyway, though when you are dealing with ~50 billion of the buggers, it still can get pretty pervasive) will vary depending on the industry. Thus, medical equipment might have a higher percentage of embedded systems problems than devices used in some other application, say.
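For scale, it's worth working out what those oft-quoted percentages mean in absolute terms. A minimal sketch (assuming the widely cited 50 billion figure is roughly right):

```python
# Back-of-the-envelope arithmetic on the commonly quoted estimate that
# "one to three percent of some 50 billion embedded chips" are affected.
TOTAL_CHIPS = 50_000_000_000

low = int(TOTAL_CHIPS * 0.01)   # 1% of 50 billion
high = int(TOTAL_CHIPS * 0.03)  # 3% of 50 billion

print(f"Affected chips, low estimate:  {low:,}")   # 500 million
print(f"Affected chips, high estimate: {high:,}")  # 1.5 billion
```

Even at the bottom of the range, a "small" percentage still means hundreds of millions of devices to sort through.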

Just something to think about....

-- Jack (jspratr@eld.net), March 12, 1999.

Yep, you're right Hoff, we've all been worrying unnecessarily about these chips. DOPE SLAP TO THE HEAD. Thanks for putting my mind at ease.

-- Andy (2000EOD@prodigy.net), March 12, 1999.


I commented on Giga Information Group's "study" when it first came out. It's on this thread:


[begin January 19, 1999 post]

This could end up being a long post, but I trust you'll find it interesting.

Giga Information Group, a firm with connections to the GartnerGroup, has recently made an announcement downplaying the embedded systems problem. Certain phrases quoted in the article on this announcement, such as "embedded systems Armageddon", "the four horsemen galloping off in the distance", and "talk about the Great God Teotwawki" are used to call into question the level-headedness of those of us concerned about embedded systems.

What is the connection between Giga Information Group and the GartnerGroup? Here's a quote from the management profile page of Giga Information group...


"Prior to founding Giga in 1995, Gideon I. Gartner founded the $400 million GartnerGroup where he served as president, chairman, and chief executive officer until April, 1991 and remained chairman until April, 1992."

Of course, it was a special report from the GartnerGroup downplaying the seriousness of Y2K in the fall of 1998 that made many of us wonder just what was going on. In an article in November, I found out that the GartnerGroup report was released for public consumption, and not the GartnerGroup's usual business and government clients. Here's a link and quote on that...


"The GartnerGroup report is an unusual departure for the firm, which aims most of its work toward large businesses or industry groups. This report, a compilation of comments from 18 GartnerGroup analysts, is aimed at consumers.

"Cassell said the myths and fallacies circulating around the year 2000 problem led to the 15-page document.

"'We've been reading an awful lot of what we think are irresponsible statements that could lead people to do some unnatural acts,' Cassell said. 'People suggesting that everyone take money out of banks and convert it into gold or liquidate stocks.'"

Now we have an announcement from Giga Information Group that the embedded system problem is over-rated as well, which Web publisher Newsbytes calls a "study." The article's title is "Y2K Disaster Potential Overhyped - Study". Here's the link and four quotes from this...


"According [to] market watcher Giga Information Group, the year 2000 computer date problem will not cause an 'embedded systems Armageddon' as some have heard. 'It may rain, but the sky won't fall,' Giga said.

"The firm, in an announcement Friday, called for a 'common-sense approach' to the Y2K threat."

[skipping ahead...]

"Newsbytes notes small special-purchase processors have played a large role in some scenarios in which Y2K failures topple civilization. For example, in once [sic] sequence of falling dominoes, embedded track switching controls will cause railroads [to] fail to deliver coal to power generation plants. As a result, electric utilities -- which have Y2K problems of their own -- will shut down. As the power grid goes dead, telephones will stop working. Without communications the interlocked banks and international financial structures begin to fall and, ultimately, so does civilization.

"'Can you hear the four horseman galloping off in the distance?' remarked Alistair Stewart, a senior Year 2000 advisor with Giga's information technology (IT) practices service. 'It's easy to scare people with talk about the Great God Teotwawki.'"

[skipping ahead...]

"'Many industries, especially those that are heavily regulated have done significant work in preparing for the year 2000,' said Stewart in an announcement. 'They have prioritized what problems to handle first, and these definitely include essential systems that businesses, and their customers, depend on.'"

[skipping ahead...]

"'In the case of regulated industries -- medical devices, telecommunications, power generation, transportation, etc -- companies are accountable for delivering products and services that work.'"

Why does this article in Newsbytes bother me so much? First, it implies that because they are regulated, public utilities are surely well on their way to having Y2K remediation done. True, the financial and banking system is the best prepared of all sectors of the economy to deal with Y2K. But it is commonly known (you can verify this at Dr. Ed Yardeni's site) that public utilities are the LEAST prepared of all of the sectors of the economy.

Plus, I had read an article just a few minutes earlier on how seriously the U.S. Coast Guard takes its embedded systems problem. You can see the article "Be Prepared for Y2K Surprises" about the Coast Guard at...


I had also recently seen this article on Chicago's embedded system problems...

http://chicagotribune.com/version1/article/0,1575,SAV-9901010066,00.html

Bonnie Camp's analysis of why the latest NERC report may not be as much of a cause for hope as it seems...


and of course John Koskinen's quote from this article...


"'We are deeply concerned about the railroads,' Koskinen says. 'We have no indication that they are going to make it.'"

-- Kevin (mixesmusic@worldnet.att.net), January 19, 1999

[end of January 19, 1999 post]

-- Kevin (mixesmusic@worldnet.att.net), March 12, 1999.

Kevin, you beat me to the punch with that Giga tie-in to Gartner.

An interesting closing comment in Hoff's post:

"There won't be a systemic shutdown," Stewart said. "You will have some localized inconveniences with some localized failures."

This is another version of current fedspeak -- "The whole country won't go dark, just part of it."

I think most people will concede that the entire grid isn't going to go down and stay down. Every embedded system in a chemical plant won't simultaneously fail; every valve in a refinery won't stick closed.

But statements like Stewart's are designed specifically to be misinterpreted. Disinformation. At least the more forthright fedspeakers are honest enough to emphasize the fact that outages are anticipated, rather than making blanket "feel good" statements.

-- De (dealton@concentric.net), March 12, 1999.

I would really like to know how the chips were tested. There are many ways to test chips. Was it real-world testing? Was it visual inspection? Was it date-forward testing? Different tests give different answers.
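To make the "date forward" approach concrete, here is a hedged sketch of what such a test harness does: set the device clock just before the boundary, let it tick, and compare every reading against a known-good reference. The `set_clock`/`tick`/`read_clock` functions are stand-ins for whatever interface a real device vendor exposes; the simulated device below is an assumption for illustration only.

```python
from datetime import datetime, timedelta

STEP = timedelta(minutes=1)

def date_forward_test(set_clock, tick, read_clock, start, steps):
    """Advance a device clock across a boundary, one step at a time,
    and record every reading that disagrees with a correct reference.
    Returns a list of (expected, observed) mismatches."""
    set_clock(start)
    expected = start
    failures = []
    for _ in range(steps):
        tick()                       # let the device advance itself
        expected += STEP             # what a correct clock would show
        observed = read_clock()
        if observed != expected:
            failures.append((expected, observed))
    return failures

# A simulated device that stores full datetimes, so it rolls over cleanly.
_clock = {"t": None}
failures = date_forward_test(
    set_clock=lambda t: _clock.update(t=t),
    tick=lambda: _clock.update(t=_clock["t"] + STEP),
    read_clock=lambda: _clock["t"],
    start=datetime(1999, 12, 31, 23, 58),
    steps=5,                         # crosses midnight into 2000-01-01
)
print(failures)   # -> [] for this compliant simulation
```

A real-world test differs mainly in that the device ticks on its own hardware clock, and in that setting the date forward can itself trigger side effects (expired maintenance intervals, for instance), which is one reason different test methods give different answers.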

-- Scotty (BLehman202@aol.com), March 12, 1999.

I would like to know which chips have been tested. Will someone please post a list? (A partial list will do. I'm not asking for an upload of I.C. Master, just a few examples to get me going). Thanks.

-- Robert Neely (robert_neely@ncsu.edu), March 12, 1999.

For a specific example of an embedded system failure in the petrochemical industry, check out the current issue of Wired magazine. In the article called "This Is Not A Test", the author describes a Texaco embedded chip test that resulted in a data failure in a Remote Terminal Unit, a device that measures oil flow in a refinery. According to Texaco, they're finding problems in 5% of its embedded systems and are spending $75 million to fix them. That's a lot of cash if this problem isn't serious.

-- cody (cody@y2ksurvive.com), March 12, 1999.

I'd take any embedded system statistics with a big dose of salt.

For one thing, these systems tend to be layered. It isn't easy to point to one independent item and say, 'this is an embedded system' in many cases. As an example, the ABS braking system in your car consists of a microcontroller at each wheel. These control braking pulses. They also report feedback (on locking, skidding, overheating, etc.) to a higher-level microcontroller. This higher-level controller manages different braking rates at different wheels to control swerving when the coefficients of friction vary among the wheels. In turn, this 'master' controller reports error conditions and maintenance information (among other things) to a microprocessor which handles multiple inputs from multiple subsystems. There is a hierarchy involved.

In general, the error rates in embedded systems being reported from remediation efforts depend on where the boundaries are drawn in defining where one system stops and the next starts, or defining which is 'the' system and which are only subsystems. Of course, the wider the boundaries, the higher the reported error rate (because there is more logic, more functionality, more software and firmware involved).

To remediators, the 'system' is generally considered to be the minimum replaceable unit. Depending on configuration, this can be as small as a single chip, and as large as an entire assembly line containing whole computers and thousands (or tens of thousands) of chips. We've changed out 30-ton machines, which needed to be considered as single 'black boxes'.

Highly automated facilities are a bitch to test. I guarantee there will be some very unpleasant, totally unexpected surprises toward the beginning of next year. But I couldn't begin to guess how many.

-- Flint (flintc@mindspring.com), March 12, 1999.

This is one of the best, all-around basic articles I've seen on embedded "chips"...



Problems lurk in more than just computers

By Douglas Armstrong of the Journal Sentinel staff

February 14, 1999

Embedded chips are the wild cards of Y2K.

Only a tiny percentage of them are expected to fail when the calendar rolls over into the next century after 11:59:59 Dec. 31. But there are literally tens of billions of these dedicated processors out there in everything from microwave ovens to airliner cockpit controls (a Boeing 777 has 1,000).

Some, obviously, perform critical duties. And, according to many experts, there isn't time to check them all and tell which are bad and which are not by the time the new millennium ticks ominously in.

One reason is that the programming in embedded chips is not always readily accessible for inspection. And there are hundreds of different varieties. It's like looking for burned out light bulbs in Las Vegas -- with the power switch turned off.

"Most of the failures will be nuisance issues," says Bill Thompson, senior analyst with Automation Research Corp., a consulting firm in Dedham, Mass.

Not everyone is so sanguine.

"The embedded systems problem is still a black hole," says Harlan Smith, a Y2K analyst who moderates an online forum on the issue at y2knews.com.

"Identifying the devices that are not compliant and assessing the effect of them on the environment in which they operate is complicated."

Corporations spent a lot of time and money bug-checking the front office software code on their mainframe computers for Y2K compliance before realizing an even bigger problem existed on the plant floor in automation controls and other systems running on embedded chips.

A massive catch-up effort is under way, at least in the United States. How big is the job? Experts can only estimate.

Tava Technologies, a Colorado software and consulting firm that specializes in assessment and repair of plant Y2K problems, says that in its experience at more than 400 sites, it has "yet to find a single site that did not require some degree of remediation (repairs)."

At a pharmaceutical firm with operations in 39 countries, for example, Tava found 4,457 embedded processors in the laboratory equipment and manufacturing facilities of one location.

Based on an inventory it conducted, 18% of the items were not Y2K compliant and 17% could cause a plant shutdown or affect production.

"The chance of these systems failing was 70% for the lab and 80% for manufacturing and facilities," says Bill Heerman of Tava's Denver office.

Tava estimated that it would take 39 weeks to inventory and analyze the firm's 125 plants at a cost of $11.5 million. The fix would take another 31 weeks and cost $54.8 million.

Is there time to fix it all?

"There is little reasonable prospect of timely correction of all Y2K exposures that exist," says a report from Manufacturers Alliance/MAPI Inc. "The effort to achieve compliance is one of damage mitigation."

The effects of a maverick embedded processor are unpredictable. It depends on where it exists in the chain and what is connected to it. Typically, these chips gather a lot of information to make limited decisions.

If a single temperature sensor tied to an embedded chip in a complex chain of measuring instruments used in manufacturing were to go haywire because of a Y2K problem, for example, the manufacturer could end up with a product with different ingredients -- if the product came out at all.

The stakes involved in locating and repairing these chips are huge, given the dependence of our systems on them. The size of the chore is every bit as large, given the proliferation of embedded chips in number and design.

"They are everywhere," says Steve Barnicki, an associate professor of electrical engineering and computer science at Milwaukee School of Engineering.


"They are cheaper and more trouble-free than mechanical systems," says Barnicki. As a result, they have played a pivotal role in powering productivity improvements everywhere since first introduced in the 1970s.

Fortunately, many (like the one in your portable CD player) couldn't care less about dates.

"There are embedded systems that don't have the faintest idea what year it is," Barnicki says.

So why not hunt down those that compute dates and fool them by turning back the year to play it safe, you ask?

The answer lies in the sheer number of chips and the independent way many have been programmed. These processors also work in tandem with chips and systems that would experience their own set of problems if a false date turned up.

The issue is made more difficult by ubiquitous quirks, such as chips that have the ability to disguise that they have date capabilities and escape detection until they fail. Or those that can have a delayed reaction.

"We encountered a controller on a process line recently that rolled over to Jan. 1, 2000, just fine," says Kurt Schmidt of Tava Technologies' Denver office.

"And it kept working just fine until it went to Jan. 32, then Jan. 33, Jan. 34 and so on all the way up to Jan. 54. Some of these systems won't show the date problems immediately."

Embedded chips come in a number of varieties from a host of manufacturers.

On the low end are ROM (read only memory) chips that contain basic instructions that cannot be changed. If these have a Y2K problem, they cannot be saved. The machine they are attached to may have to go as well, if a compatible substitute chip cannot be found.

Next are PROM (programmable read only memory) chips, which typically can be reprogrammed only once, according to Barnicki.

EPROM chips (erasable programmable) can be reprogrammed thousands of times after they are exposed to ultraviolet light. Finally, EEPROM (electrically erasable programmable) chips and similar Flash ROM chips have the potential to be reprogrammed tens of thousands of times.

Rockwell Automation, based in Milwaukee, is a leading maker of programmable logic controllers (which use embedded chips) to run factory automation configurations. The brand name is Allen-Bradley.

The company lists 17 different known year 2000 issues with its controllers on its Y2K Web site.

In addition, it outlines a procedure to test its controllers for other potential problem dates, such as Feb. 29, 2000 (leap year), Jan. 10, 2000 (1/10/2000 -- first seven character date) and Sept. 9, 1999 (the "9999" date field matches an end-of-data "9999" input signal in some computer programming codes).
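Two of those test dates are worth unpacking. Feb. 29, 2000 trips a common shortcut in old code that treats century years as never being leap years (2000, divisible by 400, is one), and 9/9/99 collides with the old convention of using "9999" as an end-of-data sentinel. A minimal sketch of both:

```python
def is_leap_naive(year):
    """A shortcut found in old code: 'centuries are never leap years'."""
    return year % 4 == 0 and year % 100 != 0

def is_leap_correct(year):
    """Full Gregorian rule: centuries divisible by 400 ARE leap years."""
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

print(is_leap_naive(2000), is_leap_correct(2000))   # False True
# A device using the shortcut would reject Feb. 29, 2000 -- a real date.

# The Sept. 9, 1999 hazard: "9999" doubling as an end-of-data marker.
END_OF_DATA = "9999"
record_date = "9999"       # 9/9/99 packed into a four-character field
print(record_date == END_OF_DATA)   # True: a live date reads as "no more data"
```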

Rockwell/Allen-Bradley's programmable logic controller issues are a microcosm of the complexity of the problem. They have:

Processors that won't roll over on their own and must manually be set to 2000.

Processors that roll over to a new century only if the power is on at the time of century change. (Jan. 1, 2000, falls on a Saturday in a holiday weekend when many plants would ordinarily be dark.)

Processors that won't roll over without new software or bug fixes.

Processors that are dependent on the compliance of the system they are connected to.

Processors that are totally dependent on systems that are not prepared for 2000 at all, such as 286 and 386 computers.

Many programmable logic controllers don't have clocks.

"You don't put a date in there unless you need it because it wastes power," Barnicki says. "Embedded processors are stripped down to fit the application."

Although a vast database of embedded chip compliance has been assembled by Tava Technologies and others, manufacturer assessments of the chips can only help so much.

"They can test all they want," says Automation Research Corp.'s Thompson, "but it's really up to the end user with the local application to test out the system. (The processor) might work in a vacuum.

"Once it's installed with custom add-ons and special report functions that have been locally written, there is no way for suppliers to help the users predict what will happen."

Says Tava's Schmidt, "There are going to be hiccups."

And some hiccups may occur in places that cause more than just a nuisance or harm to a negligent company.

The American Chemical Society has warned that chips automating control pumps and valves to prevent spills and other hazards may have problems that have not been addressed by small to medium-size firms.

"Even chemical companies that have actively addressed the Y2K problem may have underestimated its depth," says an article in the society's Chemical and Engineering News.

"Consultants hired by Occidental Chemical found 10 times more systems with potential Y2K problems than the company's own engineers found."

The new assessment of Y2K progress by larger American companies from Manufacturers Alliance/MAPI Inc., on the other hand, found cause for "cautious optimism" among big companies, given the level of awareness and the amount of effort.

Larger companies surveyed said they were on track to be compliant by 2000, while smaller firms were having trouble finding technical help that was affordable and competent.

"In the final analysis, the Y2K issue is an annoying, resource- intensive exercise in triage and damage mitigation," the report concludes. "Time is short and the stakes are high.

"The century rollover could be a nuisance or a calamity depending on the diligence with which Y2K correction is pursued."


-- Kevin (mixesmusic@worldnet.att.net), March 12, 1999.

I'm seeing a lot of generic references to certain types of embedded systems. Can someone post a list (or link to a list) of specific chips (make and model, please) that may need remediation/replacement? It would really help. Thanks.

-- Robert Neely (robert_neely@ncsu.edu), March 12, 1999.


What good would that serve anyone on this forum? The data is available to users through vendor information sources and usually requires customer registration. If you had any use for this data, you would have known that and never asked the question.

It's not a matter of secrecy, it's a matter of common sense. Manufacturers of these products have no time to be answering questions from people with idle curiosity.

This is serious business and vendor-user communications require specific knowledge of system, application and customization to make informed and correct recommendations.

Sorry if this sounded curt...too much coffee.

-- PNG (png@gol.com), March 12, 1999.

Nobody is arguing there is no problem. Andy, in a typical tactic, attempts to polarize the discussion to "no problem" vs "problem". I agree. Yes, there is a problem. The size and severity of the problem are what's in question.

I don't doubt any of the links provided. Kevin, a fairly good case could be made for .05% being statistically insignificant.

And no, I guess I haven't been blessed with the ability of some here to see the dark conspiracies behind everything related to Y2k. Probably one of the reasons I just "Don't Get It".

But, the recent findings do lead into another point. I'm not an Engineer (though my title once was "Software Engineer"; doesn't count). I have no direct experience with Embedded Systems, and have to rely on what little research I can do, and on the findings of others.

I've always had a problem with the application of metrics derived from software development projects to Y2k remediation projects. Even Capers Jones modifies the metrics, prior to applying them to Y2k.

It would seem to me that applying these metrics to embedded systems is doubly misleading. My guess is that the Inventory and Assessment phase of these projects will in reality be a much larger percentage of the project than originally thought. The problem seems to be finding these failure points, not fixing or testing them. In addition, it seems that for the vast majority of these embedded system problems, the actual remediation is fairly straightforward: replace it. I realize that is not the case across the board, and that exceptions can be found, but it does appear to apply to a very high percentage.

-- Hoffmeister (hoff_meister@my-dejanews.com), March 13, 1999.

Oops, Italics off.

-- Hoffmeister (hoff_meister@my-dejanews.com), March 13, 1999.

Don't fall down dead, but here is a situation where I agree to a large degree with both Hoff (about the diff between Y2K remediation and embedded systems) and Flint (about the difficulty of specifying boundaries and understanding how/where this is being applied when we read about embedded systems).

At the risk of being considered "negative": I worked for a while for the Meta Group. While Gartner/Giga/Meta is filled with shrewd, smart people, there is barely an iota of original thought or research going on. Everything is geared to teasing, titillating and, above all else, keeping the client base (gov/Fortune 1000) happy. Understandably. They are great for trying to understand what the "establishment" is thinking/spinning; useless for understanding what is actually happening with Y2K. I wouldn't put stock in their figures whether they claimed 20% of embedded systems were at risk or .00002%.

IMO, the true exposure with embedded systems is the degree of difference in operation between supposedly identically produced systems (since it is transparently infeasible to assess, test, and replace each system instance). If that proves insignificant, we can endure the likely scattered Bhopals, however tragic.

If it proves inconsistently significant (lots of problems and, by definition, very hard to find and replace), Y2K could be TEOTWAWKI even if software remediation reached 99%. I don't trouble my little head about it except to keep buying beans.

-- BigDog (BigDog@duffer.com), March 13, 1999.


What did you think of Meta Group?

Off and on, they've tried recruiting me. Just wondering what your experience with them was like.

-- Hoffmeister (hoff_meister@my-dejanews.com), March 13, 1999.

Big Dog:

I really don't think differences between apparently identical devices will prove to be a real operational problem. My major worry involves incomplete analysis of the ramifications of specific problems. I know that Dizzy Dean's career was cut short by arm trouble, which in turn was caused by a sore toe! (which caused him to change his delivery to something unnatural so as not to land on that toe so hard). Nobody thought to expect any connection there. I worry that too many embedded assessors will see the sore toe and decide it's no big deal. Trivial problem, won't affect anything important.

So I think there will be some immediate problems (Bhopals down to midnight explosions). But there will also be some long-latency problems as little things nobody thought to connect together percolate into big things later. I anticipate continuing downtime in production facilities for months, for reasons that would amaze Rube Goldberg.

-- Flint (flintc@mindspring.com), March 13, 1999.

Hoff --- Keeping in mind I obviously didn't fit into the culture!!

There is a kind of adrenaline buzz to the job from the constantly churning hunt for inside dope in one's specialty area, which is culled from a mix of research, gossip and sheer chutzpah. Ridiculous amounts of travel. Intense, sometimes vicious competition internally to be at the top of the heap (backbiting). Lots of disdain for the working press.

Lots of opportunity to make big bucks, if you're very good at it.

Email me if you want to discuss further.

-- BigDog (BigDog@duffer.com), March 13, 1999.

Robert: Per your request:

Motorola SPS encourages its customers to review the Year 2000 information on semiconductor devices:

[snip!]

The Motorola semiconductor products referenced within this notification contain a real time clock function (see attached list). This function is extensively described in the Motorola data sheet and specifications for these products. This function enables a customer to track time by seconds, minutes, days, day of week, months, and years. The register that records years contains a two-digit field. Thus, as we transition from 1999 to 2000 the register may roll from 99 to 00. The impact of this will depend on the programming you have performed and on your applications. In addition, due to the change in the implementation of Daylight Saving Time by the U.S. Congress, these clocks may no longer provide desired time changes at the newly prescribed dates to begin and end Daylight Saving Time. You should also review these clocks in terms of the recognition of leap years.

[snip!]

Scroll about 80% of the way down the page and you will find 70 affected devices listed from Motorola alone. They are just one of the many semiconductor manufacturers throughout the world....

Link: http://www.mot-sps.com/y2k/mailing.html
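To make the two-digit-year register concrete, here is a hedged sketch of the usual software-side fix, "windowing." Decoding packed BCD is standard RTC practice, but the pivot value here is an application choice for illustration, not anything the Motorola notice prescribes -- it's exactly the per-application judgment the notice says customers must make:

```python
def bcd_to_int(b):
    """Decode one packed-BCD byte (how many RTCs store each time field)."""
    return (b >> 4) * 10 + (b & 0x0F)

def expand_year(year2, pivot=72):
    """'Window' a two-digit year: values below the pivot map to 20xx,
    the rest to 19xx. The pivot (72 here) is a hypothetical choice."""
    return 2000 + year2 if year2 < pivot else 1900 + year2

raw = 0x99                   # RTC year register on 31 Dec 1999 (BCD '99')
y2 = bcd_to_int(raw)         # 99
print(expand_year(y2))       # -> 1999
print(expand_year(0))        # register rolls 99 -> 00; read back as 2000
```

The hardware register itself is fine either way; whether the rollover matters depends entirely on what the application layered on top does with the two digits.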

-- Dan (DanTCC@Yahoo.com), March 13, 1999.
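The failure mode Motorola describes is easy to demonstrate. The sketch below is my illustration, not anything from the Motorola notice: the BCD encoding, the 1970 pivot, and the function names are assumptions. It shows a two-digit year register rolling from 99 to 00, the "windowing" correction applied in application software, and the century leap-year rule the notice asks customers to review.

```python
# Illustrative sketch of a two-digit RTC year register (assumed BCD
# encoding; not taken from any Motorola data sheet).

def bcd_to_bin(bcd):
    """Convert a two-digit BCD register value (0x00-0x99) to 0-99."""
    return (bcd >> 4) * 10 + (bcd & 0x0F)

def full_year(yy, pivot=70):
    """Windowing fix: map two-digit years into the window 1970-2069."""
    return 1900 + yy if yy >= pivot else 2000 + yy

def is_leap(year):
    """Full Gregorian rule; 2000 IS a leap year (divisible by 400)."""
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

# Before rollover the register reads 0x99; after rollover, 0x00.
print(full_year(bcd_to_bin(0x99)))   # 1999
print(full_year(bcd_to_bin(0x00)))   # 2000, not 1900
print(is_leap(2000), is_leap(1900))  # True False
```

Whether a given device needs a correction like this depends, as the notice itself says, on the programming the customer has performed.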



From Motorola:

" There are a limited number of Motorola semiconductor products that have a two digit register to record years. These are primarily semiconductors that contain a real time clock function. We have issued an alert on these products and we urge you to review any of your applications that contain these products. "

******************** Can you say "Real Time Clock"? 90,000 products, and the only ones with Y2K problems are 70 Real Time Clock semiconductor devices. Nothing else leaves Motorola with a Y2K problem. No mention of chips with a hidden clock feature that will cause havoc during or after the rollover, because there is no such thing as a hidden clock function. The ONLY way an RTC keeps track of "time" is through the use of a crystal oscillator with a battery backup. A quartz crystal is ground to certain dimensions, which determines the desired frequency. The frequency can then be adjusted electronically or digitally. Basically you have a ground-down piece of ROCK which gives you the clock function. Any embedded "chip" or "system" without this piece of ROCK will have NO Y2K failure problem OTHER than what may be caused by software. PERIOD.

So other than the known problem with RTCs, there are NO other problems with semiconductor devices leaving the factory. Now, the ones that are programmed by the buyer can have problems, and since a "program" is usually used to create the software to be embedded, there should be documentation of what was written on them. Reading that documentation should be all that is needed to determine whether there is a Y2K problem with them.
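For what it's worth, the "caused by software" case allowed for above is easy to sketch. This example is mine, not from the thread; the function names and the hours-per-year figure are illustrative assumptions. It shows the classic elapsed-time bug in which subtracting two-digit years goes negative at the rollover, and the modulo-100 repair.

```python
# Hypothetical sketch of a software-caused Y2K failure in device
# firmware: an elapsed-time check that subtracts two-digit years.

HOURS_PER_YEAR = 8760  # rough: 365 * 24

def years_elapsed_buggy(last_yy, now_yy):
    """Naive: assumes now_yy >= last_yy, so 00 - 99 goes negative."""
    return now_yy - last_yy

def years_elapsed_fixed(last_yy, now_yy):
    """Treat two-digit years as wrapping modulo 100."""
    return (now_yy - last_yy) % 100

# Device last serviced in '99, checked in '00:
print(years_elapsed_buggy(99, 0) * HOURS_PER_YEAR)  # -867240: alarm never fires
print(years_elapsed_fixed(99, 0) * HOURS_PER_YEAR)  # 8760
```

The point is that this bug lives entirely in the programmed logic, which is exactly why reviewing the documentation (or source) of buyer-programmed devices is the right place to look.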

The hype over chips in your hairdryer, washing machine, or Tickle Me Elmo failing before, on, or after the century rollover is a scam. Ask why anyone would propagate such a scam. How about the big consulting firms who made so much money "remediating" mainframes for the Fortune 500, etc., running out of "fresh fish" (oops), customers to gouge? Easy: come up with a new, even lesser-known area of concern, and blow the problem out of proportion so their services would be needed not only to find and fix the problems, but also to test everything to prove there aren't problems. It was known that RTCs had Y2K problems, and they are chips, so every chip in existence could have a problem!!! Suddenly the only way to make sure they had no problem was to check each individual chip! It doesn't matter if that type of chip has been checked and found to have no problems; (we are told) they could be different anyway!!! Suddenly the companies who were running out of customers for their mainframe work were "hardware" experts. Embedded chip and systems experts!

******************** Read the post below by Dave Hall posted in July 1998.

Reply-To: From: "Dave Hall" To: Subject: Re: Embedded Systems Date: Tue, 14 Jul 1998 21:18:31 -0500

John, Only one comment on the info you wrote - None of the people working the embedded chip problems has ever said that we must find, test, and fix some billions of microprocessors (chips).

What we are saying is that somewhere among the billions (pick your own number, I'm tired of arguing about a number that is really impossible to get and prove) there are maybe 10% (again, pick your own number, it's different in each industry sector and in each type of facility) that will be affected by the "00" transition or "00" dates.

Now, find those 10% and fix them. FIND THOSE 10% AND FIX THEM. How do you suppose we can find them? Well, by testing every one of the billions of chips out there.

If you have another way to PROVE to us that there is no risk of an unknown failure, then please provide it to the mailing list. SNIP

Dave Hall My beliefs and opinions only, of course Year 2000 Infrastructure and Embedded Systems Engineering dhall@enteract.com

******************* Do any of you believe testing every one of the billions of chips is necessary? I hope not. Another fallacy about "chips" is that the same kind of chip, when made by different manufacturers, can "work differently," causing possible Y2K problems. That is totally FALSE. It also shows that a person who would say such a thing has absolutely no idea about the subject! It shows that the person is assuming hardware "chips" work the same way as software, in that there are different ways to achieve the same result. That is pure unknowledgeable rambling. Just as in your own profession you know what is real and can tell when someone with no experience comes in and makes false assumptions, those who know digital electronics know that these and most of the newly proclaimed "experts" in "embedded" anything don't have a clue what they are talking about. You can go to the websites of these "experts" and see the same common statements residing there. They feed each other's misconceptions.

So "news reports" that get their information from these "false" experts should no more be believed than the people they received their information from. It's hard to find the truth in all of the speculation going around.

-- Cherri (sams@brigadoon.com), March 14, 1999.
