Millennium Bug's Life

greenspun.com : LUSENET : TimeBomb 2000 (Y2000) : One Thread

http://www.embedded.com/1999/9901/9901inc.htm

Millennium Bug's Life

by Lindsey Vereen

What do Frankenstein and A Bug's Life have in common? They both depict fear of technology, a theme that abounds in literature because it plays on the widespread belief that machines could get out of hand and do us all in.

The latest iteration of this anxiety has settled on embedded systems, thanks to the millennium crisis. The San Francisco Chronicle quoted year 2000 watchdogs Rep. Stephen Horn (R-CA) and Rep. Tom Davis (R-VA) as saying that the millennium crisis would affect chips buried in VCRs, microwaves, fax machines, and other devices that aren't programmed to recognize the year 2000. Suspicion has since spread to global positioning systems, telephone networks, elevators, traffic signals, and a host of other embedded systems.
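The failure mode behind these claims is easy to demonstrate. Here is a minimal, hypothetical Python sketch; the function name and the maintenance-interval scenario are invented for illustration and are not taken from any real device:

```python
# Minimal sketch of the classic two-digit-year bug (illustrative only;
# real embedded firmware varies widely).
def years_since_service(last_service_yy: int, current_yy: int) -> int:
    """Naive elapsed-time calculation using two-digit years."""
    return current_yy - last_service_yy

# In 1999, a device serviced in 1997 computes a sensible interval:
print(years_since_service(97, 99))   # 2

# After rollover, "00" makes the same device believe time ran backward:
print(years_since_service(97, 0))    # -97
```

A device that acts on such a negative interval (flagging itself overdue for service, for instance) could misbehave even though its clock hardware is working perfectly.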

Ten months ago in this space I asked those of you who have encountered year 2000 (Y2K) problems in embedded systems you're developing to let me know. In all of the e-mails I received, no one cited a verifiable problem.

Assessing the seriousness of the Y2K problem is hindered by several barriers to its credibility. The first is that the Y2K problem plays on the fear of technology. People will leap at the chance to perceive a problem whether it's real or not.

Second, the examples offered by Y2K experts are usually hypothetical and often ludicrous. Rep. Horn notwithstanding, I suspect my microwave oven will continue to reheat my coffee after the millennium.

Third, it's difficult to prove a negative (try proving you're not a witch). Just because there has never been a verified sighting of a flying saucer doesn't prove that one won't be hovering over your house tonight.

Fourth, many of the people who have raised concerns about the millennium crisis have a vested interest in it. For example, The Financial Times quoted Gary Easterbrook as saying that the millennium could affect infusion systems and shut down, the implication being that patients would die. Easterbrook is the operations director of Millennium, a company established to deal with (read: profit from) Y2K problems.

Fifth, the mainstream press has had difficulty understanding and accurately presenting assessments of technologists whom they have interviewed. I know this from having been egregiously misquoted on this topic more than once. A reasoned position on the Year 2000 issue doesn't lend itself to sound bites.

And last is the thorny issue of Y2K compliance. Is it a problem if a product has not been certified to be Y2K compliant? Sometimes companies feel compelled to go through compliance certification even when compliance is patently unnecessary. I suspect that fear of litigation, rather than millennium bugs, has led companies to respond so vigorously to the compliance question.

The most credible assessment of the millennium crisis I've heard came from Michael Dertouzos, Director of the MIT Laboratory for Computer Science. He said on 60 Minutes that the Y2K problem (in its entirety, not just for embedded systems) would be serious but not nearly the catastrophe that some people predict. Because technology does hold its share of dangers (5,900 people are seriously injured each year in escalator accidents), it's wise to assess this issue thoughtfully without being swayed by doomsayers and falling prey to panic.

Lindsey Vereen lvereen@mfi.com

-- Cherri (sams@brigadoon.com), June 06, 1999

Answers

Lindsey Vereen is quick to make a valuable point: no need to panic, but be concerned...she says she believes she will still be able to "heat her coffee in the microwave" after the New Year (speaking of embedded chips, not of any possible electric problem)...not without power she's not!!!

-- NSmith (nitnat3@aol.com), June 06, 1999.

NSmith, what evidence can you give that Lindsey will not have power?

-- Joe Six-Pack (Average@Joe.Blow), June 06, 1999.

"The most credible assessment of the millennium crisis Ive heard came from Michael Dertouzos, Director of the MIT Laboratory for Computer Science. He said on 60 Minutes that the Y2K problem (in its entiretynot just for embedded systems) would be serious but not nearly the catastrophe that some people predict."

I've heard this kind of assessment from academia before. You have got to understand that academia does not deal in reality. They study languages, operating systems, topologies, algorithms, constructs, theories, etc. This is useful stuff for evolving the "state of the art" and educating future practitioners in the field, but it isn't "real systems". Intellectually, they understand how "the bug" exists very well. Operationally (and viscerally), they typically have no experience with developing and deploying multi-site, multi-user mission-critical systems.

Worse, I have NEVER encountered an academic with ANY experience in systems maintenance. This field is just not interesting and exciting enough for anyone with the capacity to obtain a doctorate in computer science, mathematics, electrical engineering, or the like. (I don't like it either, but I've done it.)

If you want to understand how Y2K is evolving and predict the future, I recommend you read "The Logic of Failure" by Dietrich Dorner, the winner of Germany's highest prize for science. If you must find an academic with the background for predicting the future, you should look not in computer science or engineering, but in complexity and chaos theory. "Complexity" by M. Mitchell Waldrop is a good introduction. (Neither book, by the way, discusses Y2K directly, but both will evolve your view of the "way the world works".)

Finally, on panic, let's suppose that a year before Desert Storm, the government of Iraq and the mothers of infants in Iraq knew that there would be an infant formula shortage. (Really happened.) It would have been prudent for any mother to stock up on a year's worth of formula. It therefore would have been prudent for all of the mothers to stock up on formula. The government, of course, would have considered this panic and discouraged it. Now look at what would have happened to the underlying supply systems. A year before Desert Storm, the increased demand would start to create "stock out" conditions in the stores. Stores would order more product. Producers would up production. When the war interrupted the supply system, there would have been more formula in the country to feed the babies till supply returned to normal.

Governments' understanding of what to do, and their fear of panic, is not based on a situation like Y2K, which is predictable in advance. It is based on events like hurricanes and earthquakes, where panic starts immediately before or after an event which causes outages. Unfortunately, politicians don't understand technology and don't understand logistics; they understand how to control behavior. "When the only tool you have is a hammer, every problem looks like a nail." So, they are trying to control behavior, but this time, they are screwing up badly. I can't wait till the next election, so we can throw the "spinners" out. Of course they will be spinning about how it is someone else's fault.

Noel

-- noel goyette (ngoyette@csc.com), June 06, 1999.


Thank you Noel for your logic. Unfortunately it tends to escape too many. Your comment that we can throw the "spinners" out of office in 2000 only makes me pray we will be in a position to hold elections in 2000! Scary thought, eh?

-- Will continue (farming@home.com), June 06, 1999.

Here's a link to an informative article on embedded systems:

http://www.jsonline.com/bym/tech/0214chips.asp

"Problems lurk in more than just computers"

-- Kevin (link@librarian.edu), June 06, 1999.



A Bug's Life had nothing to do with fear of technology (an excellent flick BTW). In fact, and ironically for the context that the nitwit Lindsey Vereen used it in, it was based on the Ant and the Grasshopper parable. You remember that one right? Store food and survive, or cop the "no problem" attitude and perish.

-- a (a@a.a), June 06, 1999.

Let us assume she is at a university: what are her qualifications to discuss software testing, industrial controls, transportation, refineries, manufacturing, distribution, or any other industrial processes?

The sad part of it all is: the "academics" refuse to test, or acknowledge results of system-wide tests. They mentally "refuse" to look at links and unexpected failures "between" systems. Here, for example, without testing anything, and while assuming she will have power, water, and heat, she says she "fully expects" to use her microwave early next year.

Yes, she's right - the microwaves aren't the problem - aren't any part of the problem really. How will she teach if there is no power, or no heat, or no lights, or no phones? Will she? Will she be paid? How? Are the university's systems ready? If not, why doesn't she "just" arrange for them to be set ahead and fully tested?

But we will hear no apology, no words, no solutions from her if problems do occur. Just crying in the dark, sitting cold by an empty faucet, huddling hungry as she waits for systems to recover.

-- Robert A. Cook, PE (Kennesaw, GA) (cook.r@csaatl.com), June 06, 1999.


http://www.greenspun.com/bboard/q-and-a-fetch-msg.tcl?msg_id=000Mce

"a Y99 Embedded Systems Failure - Employees Locked Out"

-- Linkmeister (link@librarian.edu), June 06, 1999.


Kevin, why are you using Linkmeister's e-mail address?

-- someone (you@cant.haveit), June 06, 1999.

"Be Prepared For Y2K Surprises"

http://www.govexec.com/tech/articles/0199mantech2.htm

[snip]

A single ship can have hundreds of microprocessors ("chips" to most of us) working unseen in systems that control functions such as ventilation, ballast, navigation, communications, detection of fires and other hazards, and so on. Operators of one cruise ship thought they had brought it into full Y2K compliance, Naccara says, but when they turned the ship's clocks forward to Jan. 1, 2000, in a test, the stateroom doors all locked automatically and stayed that way, because of an overlooked chip.

[snip]

-- Linkmeister (link@librarian.edu), June 06, 1999.



Noel, you say;

I've heard this kind of assessment from academia before. You have got to understand that academia does not deal in reality.

they typically have no experience with developing and deploying multi-site, multi-user mission-critical systems.

Worse, I have NEVER encountered an academic with ANY experience in systems maintenance. This field is just not interesting and exciting enough for anyone with the capacity to obtain a doctorate in computer science, mathematics, electrical engineering, or the like. (I don't like it either, but I've done it.)

You assume this came out of academia. You assumed wrong.

A Systematic Approach

by Lindsey Vereen

System is a term that we bandy about pretty loosely. My dictionary defines system as "a regularly interacting or interdependent group of items forming a unified whole." One characteristic of a system is that it's usually a subsystem of some larger system. In the context of electronics, we speak of software systems and hardware systems. Also systems on chips, board-level systems, and systems as diverse and complex as the Iridium project. Despite the difficulty of defining just exactly what a system is, electronics design is definitely moving toward a systems level approach.

The concept of system design is vague. Does it refer to the design of a chip? A board? A card cage? What was a box full of cards last year might be a chip next year. I offer MPEG-2 encoders as an example. Despite the shrinking size of electronics products, in many respects the design problem remains the same size, with obvious exceptions like packaging, thermal considerations, and, most significantly, the increase in software content.

What constitutes an electronics system has expanded over the years. The more complex the design problem, the more necessary it becomes to find ways to automate the process. The days of debugging a piece of electronics with a meter and a probe are long gone. In this month's "Break Points," Jack Ganssle bemoans the growing software debug problem that arises from the lack of visibility in smaller, faster products. As electronics systems become more complex, we need new ways of accelerating the development process. That means shrinking the design cycle, performing more rigorous analyses of design requirements, improving simulation and debug capabilities, and solving integration problems with no finger pointing.

Last month I mentioned the peculiar state of affairs in which embedded software tool companies are so much smaller than the hardware companies they support. This peculiarity is evident on the hardware side of design as well. Electronic design automation (EDA) companies, the outfits that develop the design tools for hardware, generate revenues that are only one percent of those of the semiconductor companies that couldn't produce products without their support.

Developing hardware and software design tools is an expensive process, and the market is limited. Tool companies, except for the very large and lucky ones, often wrestle with the specter of viability. EDA companies have recently begun to cast a collective eye toward embedded systems development including software. Interest in hardware/software co-design has been on the rise for the past couple of years, and this interest is manifesting itself in the emergence of exotic tools to facilitate the design of entire electronic systems. There is a long tradition in the EDA industry for the large companies to acquire small point-tool companies and thus broaden their product offerings. To gain a foothold in the software camp, EDA companies are looking covetously at embedded tool companies. Mentor acquired Microtec a year or so ago, and Viewlogic just recently purchased Eagle Design. More such acquisitions are likely to be just around the corner.

The fruits of such acquisitions will eventually be tool suites that integrate multiple facets of the design process and enhance communication among design team members. By merging hardware and software design support, tool companies will be able to address a larger part of the design space in a more coherent way. And if there is strength in numbers, they may even wield a little more collective clout.

If you have to solve a system-level problem, you need a system-level solution.

Lindsey Vereen

Will continue (farming@home.com), June 06, 1999, says: Thank you Noel for your logic. Unfortunately it tends to escape too many.

I would hope the bad logic escapes as many as possible.

-- Robert A. Cook, PE (Kennesaw, GA) (cook.r@csaatl.com), June 06, 1999. Goes on to elaborate on the academia assumption:

Let us assume she is at a university: what are her qualifications to discuss software testing, industrial controls, transportation, refineries, manufacturing, distribution, or any other industrial processes?

Let's assume she is Editor-in-Chief of Embedded Systems Programming (http://www.embedded.com/), shall we? That would be a fact, not an assumption.

http://www.embedded.com/gifs/em9804.gif

THEN what are her qualifications to discuss software testing, industrial controls, transportation, refineries, manufacturing, distribution, or any other industrial processes?

The only problem here is that you did the assuming.

Now see how asinine your uninformed ranting below makes you look?

The sad part of it all is: the "academics" refuse to test, or acknowledge results of system-wide tests. They mentally "refuse" to look at links and unexpected failures "between" systems. Here, for example, without testing anything, and while assuming she will have power, water, and heat, she says she "fully expects" to use her microwave early next year.

Yes, she's right - the microwaves aren't the problem - aren't any part of the problem really. How will she teach if there is no power, or no heat, or no lights, or no phones? Will she? Will she be paid? How? Are the university's systems ready? If not, why doesn't she "just" arrange for them to be set ahead and fully tested?

But we will hear no apology, no words, no solutions from her if problems do occur. Just crying in the dark, sitting cold by an empty faucet, huddling hungry as she waits for systems to recover.

-- Robert A. Cook, PE (Kennesaw, GA) (cook.r@csaatl.com), June 06, 1999.

Why don't you ask her yourself? Lindsey Vereen lvereen@mfi.com

Let's look at a little more of her work, shall we?

Farmers and Cowpersons

by Lindsey Vereen

Out on the plains of the Oklahoma territory in the latter part of the nineteenth century, farmers and cowboys were not friends but were urged to be, according to Richard Rodgers and Oscar Hammerstein II in their musical Oklahoma. "Territory folks should stick together, territory folks should all be pals," were Mr. Hammerstein's words on the subject. That song came to mind recently as I was talking to a friend at a company where I once worked. During the time I worked there, the company was bringing in a raft of software engineers to develop a fairly complex embedded system. This company, which had been hardware to the bone, became a real textbook case of the barriers between software and hardware engineering. Apparently, the dichotomy between the two groups hasn't gone away. Even after all these years, those farmers and cowboys still aren't friends. A pity, considering how the line between hardware and software is blurring.

Here's what I mean. As Glenn Chagnot points out this month in "Using DSPs to Replace Dedicated Hardware," the algorithms you can implement using microcontrollers and DSPs can also be implemented in dedicated hardware by dedicated hardware engineers. Look also at the way complex digital logic is described in high-end systems nowadays. Traditional symbolic hardware schematics are no longer adequate to accommodate digital designs made up of many thousands of logic gates. Can you imagine the number of pages of schematics it would take to describe the Pentium chip? Over the past several years, engineers have been moving away from logic symbols and toward hardware description languages (HDLs), such as Verilog and VHDL, to describe circuitry. Verilog is characterized as being similar to C. VHDL is based on Ada, and was part of the government's very high-speed integrated circuit (VHSIC) program. VHSIC is the "V" in VHDL. These languages are used to describe and simulate complex circuits. Ideally, you can also synthesize your logic from these descriptions.

Companies are now springing up that offer software descriptions of hardware functions, rather than the chips themselves. They'll sell you software models of everything from microcontrollers to DSPs to PCI cores. You can use these models for simulation and synthesis. Some vendors even offer you source code so you can modify them to fit your needs.

The same issues arise with HDLs as with other programming languages: programming style, increasing complexity, bugs. The move toward hardware description languages has been a source of consternation in the hardware community because it represents a new way of working. Engineering students don't learn how to use Verilog and VHDL in undergraduate courses. Engineers who have become adept at using these languages have developed a cottage industry of training and consulting firms. A lot of people say engineers will take this shift in design methodology in stride. Most engineers will become adept at using HDLs. The rest, they say, will become managers.

Software and hardware engineers aren't going to be taking over each other's jobs anytime soon. Hardware engineers typically have a better grasp of the nuances of hardware architecture, while software engineers write better code, even VHDL and Verilog code. But engineers represent an example of territory folk who ought to stick together. After all, as the line between software and hardware blurs, everybody is working in the same territory.



-- Cherri (sams@brigadoon.com), June 06, 1999.


Bold off.

-- Linkmeister (link@librarian.edu), June 06, 1999.

Might as well post this too. This is what the National Guard has to say about embeddeds:

http://www.ngb.dtic.mil/y2k/closer.htm

-- Linkmeister (link@librarian.edu), June 06, 1999.


Just another case of cut and paste from Dave Hall's "report," which even he now admits was a false estimate.

http://www.ngb.dtic.mil/y2k/closer.htm

Doing the Math on the Y2K Embedded Systems Problem:

10 to 25 billion embedded systems in existence
An estimated 0.2 to 1 percent are not Y2K compliant
20 to 250 million embedded system failures, due to Y2K, could occur
Small failures could have major impacts
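The quoted estimate is just range arithmetic, and it can be checked directly. A minimal Python sketch; the population and non-compliance figures are the National Guard page's estimates, reproduced here without verification:

```python
# Back-of-envelope check of the quoted figures. All inputs are the
# estimates from the list above, not independently verified.
systems_low, systems_high = 10e9, 25e9    # 10 to 25 billion embedded systems
rate_low, rate_high = 0.002, 0.01         # 0.2 to 1 percent non-compliant

failures_low = systems_low * rate_low     # smallest population, lowest rate
failures_high = systems_high * rate_high  # largest population, highest rate

print(f"{failures_low / 1e6:.0f} to {failures_high / 1e6:.0f} million potential failures")
# prints: 20 to 250 million potential failures
```

The arithmetic checks out against the claim, but note how much work the inputs are doing: the 25x spread in population and the 5x spread in the non-compliance rate compound into a 12.5x spread in the conclusion.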

-- Cherri (sams@brigadoon.com), June 06, 1999.

