Embedded Chips' Clocks

Hypothetical question that maybe one of you engineering/chip guys or gals can answer for me:

How well do embeddeds keep time? I'm a PC guy, and I don't blink an eye when I sit down at someone's PC in my office and their clock has drifted 5 or 10 minutes since my last visit. I haven't noticed any trend of them running slow or fast.

My hypothetical question is: say in 1987 I installed a chip that will fail on the century date change (CDC) rollover. Is it possible that this chip currently thinks it is 1999-12-28 because it has been running a little slow for the past 13 years? Follow-up: if the chips don't keep perfect time, have you noticed any tendency (i.e., do they usually run fast or slow)?

-- Think It (Through@Pollies.Duh), January 03, 2000

Answers

I am an embedded hardware/firmware engineer. As we know, date information usually comes from a real-time clock (RTC), which maintains the date even when the system is powered down. This is accomplished by running the RTC from a battery or alternate power source, exactly as is done on your PC. These circuits derive their timebase from a crystal oscillator, much like your watch. These crystals have frequency tolerances measured in parts per million, so in theory there should be very little time drift. However, an improperly designed oscillator circuit or a low battery can cause problems. (My personal belief about PCs is that manufacturers cut every fraction of a cent they can because the margins are thin. This is why you get a machine that won't work properly on a hot, humid day, or a clock that runs at a strange rate.) There is no preference for running fast or slow, at least as far as the inherent tolerance of the crystal is concerned.
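
To put numbers on the parts-per-million point, here is a minimal sketch (Python; the set-date and the crystal tolerances are my own illustrative assumptions, not figures from the thread) of how large a frequency error the original scenario would actually require:

    from datetime import date

    # Hypothetical scenario from the question: RTC set in early 1987,
    # real date is 2000-01-03, but the clock reads 1999-12-28.
    set_on     = date(1987, 1, 3)
    real_today = date(2000, 1, 3)
    clock_says = date(1999, 12, 28)

    elapsed_days = (real_today - set_on).days      # ~4748 days running
    slow_by_days = (real_today - clock_says).days  # 6 days behind

    # Frequency error needed to lose that much time, in parts per million.
    required_ppm = slow_by_days / elapsed_days * 1e6
    print(f"required error: {required_ppm:.0f} ppm")        # ~1264 ppm

    # Drift a typical RTC crystal would actually accumulate over the same
    # span, at assumed watch-crystal tolerances (Chris gives no figure).
    for ppm in (20, 100):
        drift_hours = ppm * 1e-6 * elapsed_days * 24
        print(f"{ppm:4d} ppm crystal drifts ~{drift_hours:.1f} hours")

Being six days slow after thirteen years would take a crystal more than ten times worse than even the loose end of an ordinary tolerance band.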

Anyway, there is some validity to your question. But the counterpoint I would raise is: if the date was important for some purpose in the system, don't you think somebody would have noticed it running that slow? There is the possibility that an internal, non-user-visible function uses the date (e.g., the BIOS), but that may not matter much either. Somewhere, some system may exhibit a problem of the kind you describe, but I can't see this failure mode being anything more than a statistical anomaly.

-- Chris Tisone (c_tisone@hotmail.com), January 03, 2000.


Great news, thanks!

-- Think It (Through@Pollies.Duh), January 03, 2000.

Follow-up question: if a chip was made December 1, 1987, but not installed into a system with a power source until January 15, 1988, would it believe today's date to be November 18, 1999 (or thereabouts)?
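
For what it's worth, the arithmetic behind this follow-up is easy to check. A minimal sketch (Python), assuming, as the question does, that the RTC was set to the true date at manufacture and simply stopped counting while unpowered:

    from datetime import date

    made      = date(1987, 12, 1)   # RTC hypothetically set at manufacture
    installed = date(1988, 1, 15)   # first receives battery/system power
    today     = date(2000, 1, 3)    # real date when this thread was posted

    unpowered = installed - made    # 45 days the clock never counted
    clock_reads = today - unpowered
    print(clock_reads)              # 1999-11-19, "or thereabouts" indeed

The shaky part is the premise, not the arithmetic: as Flint points out below, an RTC that nobody ever set could be reading anything at all.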

-- Duke1983 (Duke1983@aol.com), January 03, 2000.

I'm afraid Chris got this one wrong. He is talking about the precision of the clock, not about the accuracy.

Precision refers to the constant duration of individual ticks -- every tick must last for precisely the same amount of time, within very close tolerances. Longer or quicker ticks (relative to all the others) are Very Bad, and the state machines inside the silicon will barf real fast if the precision is not extremely high.

Accuracy refers to whether the duration of all these identical ticks is the correct duration. The silicon is pretty damn indifferent to this -- your 200 MHz CPU might really be running anywhere from 190 to 210 MHz (and can often be run up to 266 MHz with a proper heat sink, and many people do this) and it won't care. So long as the crystal is extremely precise, the accuracy is not very important.
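
The distinction is easy to put in numbers. A minimal sketch (Python; the frequencies are illustrative, not from any real part):

    NOMINAL_HZ = 200e6   # frequency the design assumes
    ACTUAL_HZ  = 195e6   # accuracy error: 2.5% slow, but every tick identical

    # The logic doesn't care: timing margins are met on every tick, since
    # each tick has exactly the same (slightly long) period.  A timekeeper
    # that counts NOMINAL_HZ ticks per "second", though, drifts badly:
    real_seconds_per_day = 86400
    counted = ACTUAL_HZ * real_seconds_per_day / NOMINAL_HZ
    print(f"clock loses {(real_seconds_per_day - counted) / 60:.0f} min/day")

    # Precision error is different in kind: if individual tick periods
    # varied, the flip-flops would see timing violations no matter how
    # good the long-run average was -- that is what makes the state
    # machines "barf".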

And of course, cost savings on crystal oscillators take this into account, and these chips aren't any too accurate. A gain or loss of a minute or two a day is very common. So any system that needs to remain synchronized with the outside world needs periodic correction from some more accurate time source. For PCs, this is YOU, the user. But it must come from somewhere.
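
These days that "more accurate time source" is usually an NTP server rather than a human. A minimal sketch using the third-party ntplib package (my choice of library, not something from the thread) to measure how far the local clock has wandered:

    # pip install ntplib
    import ntplib

    client = ntplib.NTPClient()
    response = client.request("pool.ntp.org", version=3)

    # Positive offset means the local clock is behind the reference.
    print(f"local clock is off by {response.offset:+.3f} seconds")

    # A crude corrector would periodically step or slew the clock by
    # this offset -- the automated equivalent of "YOU, the user".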

And where this synchronization isn't required, the original date was never set by anyone, and could be anything. Who cares?

-- Flint (flintc@mindspring.com), January 03, 2000.

