A little bit of logic vs. 50 billion embedded controllers

greenspun.com : LUSENET : TimeBomb 2000 (Y2000) : One Thread

The 50 billion embedded chips number has been cropping up again. I am looking for some hard numbers on how many control systems have actually been manufactured for industry (who cares about your TV remote vs. TEOTW?), but they have been elusive. Will post them when I find them.

Meantime - here is a bit of logical analysis to think about. A year has 365.25 (approx.) days. Therefore ten years have 3652.5 days, and twenty years have 7305 days. Looking back 20 years - early January 1979. Middle of the Carter recession. Companies aren't spending money on anything. Several years before the introduction of the IBM PC. Almost all control systems are simple analog feedback controls. At this time, there is no significant embedded control problem. Dividing 50 billion by 7305 gives 6,844,627. So we would have to average nearly 7 million controls installed per day, working 7/24. Since there would have been nearly zero per day in 1979, assuming a simple linear relationship gives a ramp up to twice that, or 13,689,254 controls per day, today. We aren't even making that many TV remotes per day - not by orders of magnitude. To put it another way, if we built TV sets that fast, every one of the 50 million households in the US would have to buy more than one new set per week to keep up.
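The arithmetic above can be checked in a few lines. This is just a sketch of the post's own assumptions (the 50-billion figure, the 20-year window, and the linear ramp from zero), not an independent estimate:

```python
# Rough check of the installation-rate argument: how many controls per
# day would 50 billion chips over 20 years require?
DAYS_PER_YEAR = 365.25
days_in_20_years = 20 * DAYS_PER_YEAR       # 7305.0 days
total_chips = 50_000_000_000                # the disputed headline figure

avg_per_day = total_chips / days_in_20_years
print(round(avg_per_day))                   # 6844627 - nearly 7 million/day

# Assuming a linear ramp from zero in 1979, today's rate would be
# twice the 20-year average:
peak_per_day = 2 * avg_per_day
print(round(peak_per_day))                  # 13689254 - nearly 14 million/day
```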

I am not even going to discuss the numbers of engineers, technicians and factories required to produce such a massive number of control systems.

So squash that 50 billion. It is just a very, very bad number. I wonder who started that one?

-- Paul Davis (davisp1953@yahoo.com), January 26, 1999


This is truly "Paul-yanna Think" at its very best. Shoot, there probably are not all that many computer systems in use either. Heck, the whole y2k thing can probably be fixed in a month, no problemo.

Hey, I think that I have now learned Pollyannaspeak. It's actually kind of fun. It makes any problem, no matter how severe, just melt away. If you don't like a particular number, just come up with some BS and decrease it -- or increase it, whatever -- to what you want. No matter how well established the number is or who is using it, just change it and march on from there.

Let's hear it for Pollyannaspeak!

-- King of Spain (madrid@aol.com), January 26, 1999.

Paul --- Are there 100 billion "chips" involved? 75 billion? 50 billion? 5 billion?

Are there 1 billion embedded systems? 100 million? 10 million?

Are there 100,000 major enterprise systems that "must be fixed" to avoid depression-level scenarios or worse? 10,000? 1,000?

If there are 1,000, are we home free because the smart guys have identified them and they will be repaired? Or have we not even identified them as the critical-of-the-mission-critical because we don't know which they are, except on an ad hoc basis? We get lucky because someone addressed them?

If these 1,000 break, are we screwed? Or are we okay even if 10,000 or 1M break?

One of the legitimate scariest things about Y2K is that the lack of world industry standardization (within industries and across industry sectors), even on definitions of systems, makes rational Y2K assessment, remediation, testing and reintroduction of systems a nightmare. This is why compliance percentages are mainly a psychological PR exercise.

I'll concede that lots of uninformed people throw around big chip numbers as though big numbers are scary. Here is a big number: there are 100B galaxies (I haven't checked that, BTW). But so what?

The small numbers are equally or more scary until someone can tell me why I shouldn't be scared (I'm talking technically, not emotionally scared) because we're flying blind into a worldwide hurricane and it's the "worldwideness" that is the singularity of this compared to "historical hurricanes."

And, BTW, some pretty so-called knowledgeable dudes (idiots) like Gartner, GIGA and the like came up originally with the big, raw embedded system numbers. They also came up with some small numbers ($300-600B to fix, including litigation estimates) that are now at the $1-2T figure. Who cares? That's another stupid estimate, if you ask me. It could be $20B, $300-600B, $1-2T ... depending upon "assumptions." Who cares?

I make only one assumption: there is no room for optimism about anything having to do with Y2K so far. SO FAR. It started as a fool's problem and has been handled like a fool's problem to date. Sure, it is possible (may it be so) that progress is actually much further along. But we have no way of knowing that rationally, see, none, so no grounds for optimism at all. None.

-- BigDog (BigDog@duffer.com), January 26, 1999.

Paul, I had this handy, from an estimate given last year from ITAA (Information Technology Association of America):

1998 May 18: Oil industry embedded chip example. At a special briefing for Capitol Hill staffers set up by ITAA and the House & Senate IT Congressional Working Groups, Cara Corporation Embedded Systems Specialist David C. Hall stated that there are over 40 billion microprocessors worldwide, and anywhere from one to ten percent may be impacted by the date change. Embedded systems process information, monitor and control system functions, and are integrated into everything from bank vaults to bottling plants. Hall said that embedded system failures will cause one of three outcomes: systems may i) shut down, ii) produce large, observable errors, or iii) produce small, less noticeable errors. He said 80 percent of the total Y2K effort may be expended fixing automated control and embedded systems.

In demonstrating the ripple effect of the problem, Hall described an oil company that has determined the need to replace thousands of chips controlling an oil dispensation system. The chips, he said, do not fit on the existing motherboards, and new motherboards do not fit into existing valves. As a result, the valves themselves will have to be replaced, Hall said. Hall also claimed that no plant or factory tested to date has been found free of all Y2K related problems.

You can find this at:


(Since this estimate is only for 40 billion, rather than 50 billion, will you accept my I.O.U. for 10 billion???)

-- Jack (jsprat@eld.net), January 26, 1999.


It's easy to understand the confusion. If a person isn't involved in the design and engineering end of embedded controllers (as I have been), one might assume (as you did) that each chip is its own "system".

Let's look at a completely different and unrelated object for a moment .. a typical automobile. How many fasteners are used to assemble the car? Each nut, bolt, and screw isn't its own "system", it's a part of a whole. The same applies to embedded controllers.

Embedded controllers are generally entire little computer systems in and of themselves, and frequently are manufactured as a single unit. They may contain anywhere from just a handful of chips (20-50 or so) .. up to some very complex models with 10,000 or more "chips" within.

If just one component in the date-chain works with two-digit years, it doesn't matter how many of the rest are four-digit-year capable. It's the proverbial "weakest link" theory at work.

Let's briefly apply this analogy to a desktop PC. Inside the computer, you have a battery operated day-and-date clock. Yet another part of the computer is called the "BIOS" ("Basic Input Output System" for non-techies). If the day-date clock generates a four-digit year, but the BIOS only works with two digits, it doesn't matter that the clock itself generates four digits, as only two are passed on to the rest of the computer by the BIOS. Conversely, if the BIOS section works with four-digit years, and the clock generates only two digits, the same still applies.

Moving up from the day-date clock and BIOS, the next layer is the software. If in our over-simplified example both the day-date clock and BIOS work with four-digit years and the programmer wrote the software/firmware to use only two digits, we're back to square one again .. so to speak.
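The layered "weakest link" idea above can be sketched as a toy pipeline. The stage names here are illustrative only - this is not a real BIOS interface, just an assumption-laden model of a clock, a BIOS, and application software passing a year along:

```python
# Toy model of the date chain: clock -> BIOS -> software. If any one
# stage keeps only two digits, four-digit capability elsewhere is moot.
def clock_4digit():
    return 2000                 # the RTC reports a full four-digit year

def bios_2digit(year):
    return year % 100           # the BIOS passes on only the last two digits

def software_4digit(yy):
    # Downstream code never sees the century again; a common assumption
    # was "19xx", which turns the year 2000 into 1900.
    return 1900 + yy

year = software_4digit(bios_2digit(clock_4digit()))
print(year)   # 1900, even though two of the three stages handle four digits
```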

The fact is .. the computer isn't just one chip or device. It's many. The same applies to embedded controllers. As a part of my educational effort (I conduct Y2K seminars for public officials as well as the general public), I've managed to procure two embedded controllers (neither of which were capable of functioning properly in the year 2000). One is small enough that it barely covers the palm of one hand. The other is over a foot square. The smaller unit has fewer than 30 chips. The larger unit has several hundred. Aside from RAM and a few other subsections, a number of other aspects of the embedded controller are suspect with regard to dealing with two-digit years.

For instance, two-digit years can be accurately represented as an eight-bit integer (a whole number without a fraction, for the non-math people). An eight-bit unsigned integer has a valid range from zero to 255 (no negative numbers allowed) - more than enough for 00 through 99. A 16-bit integer could have been used to represent a four-digit year (values would then range from zero to 65,535), but the problem was (and is) that this is twice the amount of memory for each number. When embedded systems were first designed, RAM was *VERY* expensive (I can recall a time when 1K of memory was over $500.00).

In any event, as controllers became more sophisticated and the cost of memory (and other chips) came down, the fact remained that the software/firmware we'd written "wasn't broke" .. so to speak .. so there was no need nor incentive to make changes. Time, day, and date routines were often copied from an older controller to a newer one to save time. Why? We didn't wish to reinvent the wheel when we already had the routine written. Subsequently, even in the most modern of embedded controllers, we would often find proven circuit designs AND software that were copies of their 20+ year-old predecessors. After all .. if it "wasn't broke" .. there was no need to "fix" it.

Since we often ran on a low-bid basis, as well as under some serious time constraints ("hurry up .. they can't get started on the next part of the system until you get this part finished!") .. we did whatever was deemed necessary to keep the cost down and finish sooner.
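The one-byte storage trade-off described above can be made concrete. This is a sketch of the general technique, not any particular controller's firmware; the function names are made up for illustration:

```python
import struct

# Store a year in a single byte (00-99): cheap when RAM cost real money,
# but the century is discarded and must be assumed on the way back out.
def store_year_2digit(year):
    return struct.pack("B", year % 100)   # "B" = one unsigned byte, 0-255

def load_year_2digit(raw):
    yy = struct.unpack("B", raw)[0]
    return 1900 + yy                      # the hard-coded century assumption

print(load_year_2digit(store_year_2digit(1999)))  # 1999 - round-trips fine
print(load_year_2digit(store_year_2digit(2000)))  # 1900 - the rollover bug
```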

Ultimately, in testing, it was discovered that it was more practical to replace these two units rather than try to discover which of the particular chips and/or firmware in each was the two-digit bottleneck (remember .. it just takes one two-digit bottleneck to mess up the works .. and it's quite probable that more than one device and/or firmware segment in each was/were limited to two digits).

To add insult to injury, controllers were often "potted" .. covered with a special dipped-plastic layer to prevent humidity and/or corrosion problems. This is the case with the small embedded controller I use in my seminars (it was used in a date-sensitive temperature sensing application in a boiler). To find the affected device(s) in the little controller alone would have required considerable time, thus it was easier to replace after it was determined to be non-compliant. Finding a compatible replacement was an entirely different issue. The manufacturer of the old controller is no longer in business. Once a replacement was located, the entire boiler had to be shut down to effect the change. In this application, the cool-down required 48 hours. The new controller was then installed and tested .. and the boiler was put back on line. Total downtime for this one device: five working days.

Anyway, between the two controllers, there are roughly 380 chips overall. Multiply the average number of chips per unit by the number of embedded controllers in service, and it's easy to see how a count of potentially affected *chips* can reach 50 billion even though the number of controllers is far smaller.
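Working backwards from the two example units above (roughly 380 chips between them, so about 190 per controller on average - an assumption drawn only from this post), a 50-billion-chip headline implies a much smaller controller count:

```python
# If "50 billion" counts individual chips, how many whole controllers
# does that imply at ~190 chips apiece?
total_chips = 50_000_000_000
avg_chips_per_controller = 380 / 2          # from the two sample units above

implied_controllers = total_chips / avg_chips_per_controller
print(round(implied_controllers))           # ~263 million controllers
```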

Hope this helps.


-- Dan (DanTCC@Yahoo.com), January 26, 1999.

Paul, no offense, but I think you've expended an extraordinary amount of effort on an argument that doesn't mean much at all. It's sort of like my attitude about money: beyond a low-seven figures salary, I wouldn't care how much I made because it would be more than I could reasonably spend. (I know, high standards.) Beyond a certain point, it's all "a frickin' lot."


-- Scott Johnson (scojo@yahoo.com), January 26, 1999.

Dan - I understand what you are saying very well. My problem lies with the people who are confusing systems with controllers, microprocessors with SSI chips, and various other elements of confusion which make the embedded controller problem sound about 1000 times worse than it actually is. Just looking at your examples proves that - if one system contains thousands of chips, then knowing whether or not that system handles dates answers the question for those thousands of chips at once! It is not a one-at-a-time process, except for orphan or one-of-a-kind devices. And most of the large industrial control manufacturers are still around and quite willing to tell you whether or not a certain PLC or control panel has issues. Nor is any allowance being made for old equipment that has been replaced (another 10,000 chips on the scrap heap) or for old factories and mines closing - vast quantities of the specialized control equipment counted in these silly numbers are mouldering in a junkyard somewhere. I have a Texas Nuclear control panel at home that I bought to take transistors off of to use in electronics projects - don't know and don't care if the old thing was Y2K aware or not - and it does have a RTC chip on board. My point is that the size of the problem has been blown out of all proportion to the actuality.
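The per-model (rather than per-chip) assessment argument above can be sketched with a toy inventory. The model names and counts here are entirely hypothetical:

```python
# Hypothetical plant inventory: thousands of installed devices, but only
# a handful of distinct models. Compliance is settled once per model.
inventory = (["PLC-500"] * 1200
             + ["TempCtrl-9"] * 800
             + ["FlowPanel-2"] * 500)

models_to_check = set(inventory)   # deduplicate to unique models

print(len(inventory))        # 2500 devices installed
print(len(models_to_check))  # only 3 compliance questions to answer
```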

-- Paul Davis (davisp1953@yahoo.com), January 26, 1999.

You may be right Scott. I know the embedded control problem is much smaller than most think - but I don't know how to prove it to you. It is nasty, and it will cause plenty of problems - but if I believed in the implications of the silly numbers being tossed around I would think nothing whatsoever would work on 1/1/2000. And I know that is not what is going to happen - too many outfits have already had successful 'time machine' tests to believe that.

-- Paul Davis (davisp1953@yahoo.com), January 26, 1999.


This is just so frustrating that I'm about to go into Milne mode and start calling you names! We have cited how many -- four, five, six -- sources now citing similar figures about the number of microprocessors (not just chips, all you dopes out there, MICROPROCESSORS!!!!!!!!) shipped since their invention. In the source I cited, the figures came from INTERVIEWS WITH THE MANUFACTURERS!!!

Sorry to shout. I have appreciated Paul's contributions in this forum. But this is just ridiculous. Not only does he pick on a completely meaningless statistic (as BigDog pointed out above) but HE'S COMPLETELY WRONG.

Wake up and smell the coffee, Paul.

-- Franklin Journier (ready4y2k@yahoo.com), January 26, 1999.

"The 50 billion embedded chips number has been cropping up again. I am looking for some hard numbers on how many control systems have actually been manufactured for industry."

Paul, several expert sources have said around 25 million systems. Is this the number that you are still not happy with? Why did you start another thread rehashing that same goop? There's already one you were very active in, with the same 50-billion-chip/actual-number-of-systems question.

-- Chris (catsy@pond.com), January 26, 1999.

You want to know why I go ballistic? Because of assholes like Davis. A completely specious goofball.

At best he is a disingenuous troll. There is not even a modicum of merit to his post.

In a few more months the hammer will come down and because of people like Davis, other people will die.

This is NOT a game. This is life and death.

There is NOT one single scrap of evidence that anywhere near enough is being done. Now eleven months till the rollover. NOT ONE Fortune 1000 company done. Not one percent of any industry done.

Not here and not anyplace else in the world. And Davis blathers on like the ASS that he is. I won't mince words. It is too late for that. People are going to die because they did not prepare, and Davis and Flint help dissuade them.

The day that either of them proposes evidence to the contrary, I will show them some respect. Until then they have not only NOT earned any, they are deserving ONLY of scorn, ridicule and derision.

-- Paul Milne (feinfo@halifax.com), January 26, 1999.

That's not an entirely true statement there, Paul. I know of 1 Fortune 500 company that's completed their Y2K work......successfully too. But for some reason, I've got a funny feeling you're going to have a very hard time with that.

Logic tells me that we cannot possibly be the only one that is ready. There are hundreds of companies that are ready out there. Really and truly, Paul.


-- Deano (deano@luvthebeach.com), January 26, 1999.

Now this *is* getting ridiculous. There are not 50 billion microprocessors and/or controllers out there. You have not posted any such evidence. All the evidence posted says 50 billion chips, *not microprocessors*. And the quote posted by Jack above isn't even from the source. You can't even start to analyze the problem if you don't understand the terms.

-- you're idiots (.@...), January 26, 1999.

Billions and billions.

Where is Carl Sagan when we really need him?

-- (jor-el@krypton.com), January 27, 1999.

Paul Davis started this thread because he "skimmed" (his word) the report called "The smoking guns of Y2k" wherein the author stated there are approx. 50 billion chips worldwide, and then went on to say approx. 1% may be date sensitive - which would give us around 500 million problem chips. I'm too lazy to check the math closely, but it doesn't matter. My understanding is that these chips could be in hard-to-reach places, and cause breakdowns in some mission critical systems, i.e., phones, utilities, banking, etc. If and/or when they malfunction, it seems that it won't matter if 10 hard-to-find chips did it or 10,000 easy-to-see chips did it.

-- King of Free Estimates (Won@theracetrack.once), January 28, 1999.
