So it was a computer space problem that created this mess?


COMPUTERS IN CRISIS: HOW TO AVERT THE COMING COMPUTER SYSTEMS COLLAPSE (ISBN 089433-223-6). And on the cover of the jacket of this book:

"Warning, logical and maintenance flaws have been discovered in the design of worldwide programming code and databases. These flaws will result in the collapse of the world's computer systems on or before 2000 AD. "How did it all begin? On November 1, 1968, the National Bureau of Standards issued a Federal Information Processing Standards Publication ( FIPS PUB#4 ) where it specified the use of 6 digit dates for all information exchange between Federal agencies." [end of jacket quote]

http://www.amazon.com/exec/obidos/ts/book-customer-reviews/0894332236/002-9156844-4945411
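For reference, the FIPS PUB 4 format was a six-digit YYMMDD string. A minimal sketch (in modern Python, standing in for the systems of the day) of why that representation stops working at the century boundary:

    # FIPS PUB 4 dates: six digits, YYMMDD, no century.
    dates = ["681101", "991231", "000101"]   # Nov 1 1968; Dec 31 1999; Jan 1 2000

    # A plain text sort is a chronological sort -- until 2000 arrives
    # and sorts ahead of everything else.
    print(sorted(dates))   # ['000101', '681101', '991231']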

The question arises: "How could so many VERY BRIGHT and LOGICALLY minded design analysts and programmers overlook the very LOGICAL REALITY that the year 2000 would come whether they liked it or not? Overlooking a future date just does not make sense for such ultra-high-IQ programmers. How could they succumb to such stupidity?"

Makes one wonder...was it really a computer space problem or was it by design????

-- (mass@delusions.com), December 13, 1998

Answers

Yeah, we did it on purpose. Decided we needed to thin the herd a little. The primary target was those using unscrupulous advertising and marketing practices.

MVI

-- MVI (vtoc@aol.com), December 13, 1998.


Given "state of the art" computer systems of 1968, it was a lack of additional colunms on IBM punchcards available to show what century it was. "Heck, everybody knows what century it is." was the thought and so those two colunm were used for some other purpose.

In fact, the two-digit rule was adopted to prevent people from using a one-digit year field. No kidding: there were those who thought that bigger and better would replace their work in less than ten years. And certainly nobody could see any 1960s computer or program lasting into the eighties or nineties, much less rolling into the twenty-first century.
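To make the failure concrete, here is a minimal sketch (in Python, purely illustrative; the real code of the era was COBOL, assembler, and the like) of the arithmetic that goes wrong when "00" arrives:

    # The implicit assumption of the era: a bigger YY always means a later year.
    def years_since(event_yy, current_yy):
        return current_yy - event_yy

    print(years_since(68, 98))   # 30 -- fine in 1998
    print(years_since(68, 0))    # -68 -- in "00", a 1968 record is in the future

Every age calculation, expiration check, and sort built on that assumption inherits the same flaw.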

Thus the two-digit mindset became accepted as gospel. While in the military doing a computer system acquisition in the late eighties, we had to fight to change the "etched in stone" specifications requiring IBM cards for a system to be delivered in the 1990-91 timeframe.

Was two digits a conspiracy? NO! It was just coincidental managerial short-sightedness and bureaucratic blundering coming together at the exact moment in time to start this chain of events. Unfortunately, the results of this screw-up are going to prove very costly.

Check six! WW

-- wildweasel (vtmldm@epix.net), December 13, 1998.


Read the Vanity Fair Y2K article. Excellent research on "why."

Why? Short-term thinking, space saving, the need for standards, and DoD insistence.

Diane

-- Diane J. Squire (sacredspaces@yahoo.com), December 13, 1998.


What are we doing today that will have unforeseen ramifications? We don't complain until TSHTF in any scenario. Air pollution, water pollution, exhausting natural resources, government spending, socialist security, paving over the best farmland, inflating the stock market, moral decline, etc. People were no different in their lack of foresight when the 6-digit date was established. The same questions will be asked when any of the above scenarios develops. To err is human.

-- HERBERT JOHNSON (HERB87@JUNO.COM), December 13, 1998.

"Indeed I tremble for my country when I reflect that GOD is just." Thomas Jefferson

-- flierdude (mkessler0101@sprynet.com), December 13, 1998.


The Year 2000 snuck up on us, just like Christmas.

-- Amy (leoneamy@aol.com), December 13, 1998.

...the book was published in 1984 and is out of print. It can be special ordered, but no promises. So much for a marketing scam.

The book reference was made to prod those who think this was all an innocent blunder into waking up and taking a look at the other side of the coin.

In the mid-to-late '60s the government mandated standards to which businesses doing commerce with the government would have to adhere. It may not make some of you suspicious, but in addition to quite a few other references of this sort, it makes me very suspicious. You can think what you want and I will think what I want. We will know in a little over a year who is right and who is wrong.

-- (mass@delusions.com), December 13, 1998.


"It may not make some of you suspicious, but in addition to quite a few other references of this sort, it makes me very suspicious."

The authors of the 1984 book also have one published in 1996 (The Year 2000 Computing Crisis: A Millennium Date Conversion Plan) still in print, available in 24 hours from amazon.com for a discounted price of $31.96, which is $7.99 off the original asking price. A discount of 20%!

"You can think what you want and I will think what I want. We will know in a little over a year who is right and who is wrong."

In a little over a year it won't matter who is right and who is wrong.

MVI

-- MVI (vtoc@aol.com), December 13, 1998.


To: mass@delusions.com

Oh, come on! As I've pointed out in another thread ("Fellow 'Fivers'"), the use of two-digit abbreviations of year numbers started long before computer programming.

I've seen a number of historical documents, from both the early 1900s and earlier centuries, in which all or almost all year numbers were abbreviated to the final two digits. Writing two-digit years is a natural time-and-space-saver that predated the invention of digital computers. Early programmers simply carried an established standard practice over into their programs.

Let me ask: Can you, or anyone you know, honestly say that you always write all four digits of the year whenever you handwrite the current date (not counting cases in which you're filling in a preformatted date field on a form)?

Please read my entry in the thread linked above to see the result of an experiment I ran for a few months.

>was it really a computer space problem or was it by design?

It was an unintended result of a combination of multiple aspects of human psychology, business practices, and the characteristics of man-machine interfaces.

-- No Spam Please (anon@ymous.com), December 13, 1998.


To: mass@delusions.com (continued)

Please also check the thread "I can't believe that dd-mm-YY was a total oversight." I posted a rather long entry there about various points of view of programmers in the 1950s-1980s.

-- No Spam Please (anon@ymous.com), December 13, 1998.



I'm going to borrow Cory's words from a post on c.s.y2k

(fair use policy, etc.)

I'm looking for an IBM 3330-11 drive and a control unit. This thing is the size and weight of a small car tilted on its side.

25 years ago, these things were all over the place. Big corps had hundreds of them. Now the drives are gone, but the data (with two-digit years) and the software are still running.

The 3330-11 stored a huge 200 meg of data on 19 recording surfaces across 11 ferrite-coated, 19-inch-diameter aluminum platters. The disk packs were about $1,000 and the drives were (possibly) fifty or a hundred grand each.

Kinda embarrassing compared to a $99 5.4 gig Maxtor. This might explain why we used to squeeze out the "19".

If anyone knows of a real live 3330-11 and control unit, I'm still looking for one.

I don't know why I get these assignments.

cory hamasaki 387 Days, 9,307 Hours.

This should help in the explanation.
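A rough back-of-the-envelope using Cory's figures (the record and date counts below are invented for illustration) shows what those two bytes were worth:

    # Storage economics, using Cory's rough numbers.
    drive_bytes = 200 * 10**6    # 3330-11: ~200 MB per drive
    drive_cost = 75_000          # "fifty or a hundred grand" -- split the difference
    dollars_per_byte_1973 = drive_cost / drive_bytes    # ~$0.000375

    dollars_per_byte_1998 = 99 / (5.4 * 10**9)          # $99 Maxtor, ~$0.00000002

    # Two bytes ("19") per date, three dates per record, fifty million records:
    bytes_saved = 50_000_000 * 3 * 2
    print(f"Saved then: ${bytes_saved * dollars_per_byte_1973:,.0f}")   # ~$112,500
    print(f"Worth now:  ${bytes_saved * dollars_per_byte_1998:.2f}")    # ~$5.50

Six figures then, pocket change now. That's the whole "squeeze out the 19" story in two numbers.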

Chuck

-- Chuck a night driver (rienzoo@en.com), December 13, 1998.


Chuck, LOL. I just pulled this little tidbit up.

An IBM 370-158 mainframe (circa 1971) consisted of a memory subsystem containing 1 million bytes of storage and 10 IBM model 3330 disk drives, with an annual operating budget of over 150 million dollars.

MVI

-- MVI (vtoc@aol.com), December 13, 1998.


"Indeed I tremble for my country when I reflect that God is just." Jefferson didn't stop there -- he went on to finish the sentence with and his mercy is not infinite. Questionable theology, but it sure makes the point.

-- Tom Carey (tomcarey@mindspring.com), December 14, 1998.

The first computer I ever programmed on had three 2 MB disk drives. Yup, that's three two-megabyte disk drives. (And 6-bit bytes, not 8-bit; each disk was basically a fourteen-inch-diameter metal non-floppy!)

I'm not actually an old-timer. I started in 1978; the computer in question dated from maybe six years earlier.

It was doing science, not commerce, but I very much doubt that anyone in that era would have thought that storing a lot of "19" fields that wouldn't change for 20-odd years was money well spent.

I became aware of Y2K in the early '90s, as did many programmers. I wasn't directly involved, but I told many students and friends about the problem that would arrive in a good few years' time. I don't know anyone who managed to persuade their then-managements to do anything about it. I do know that in a few cases those in charge made comments like "don't be silly, we'll deal with that nearer the time." Had the fixing started in 1990 or even 1995, there would be no major problem today.

Attributing blame is rarely productive, but the cause of the forthcoming Y2K mess is procrastination, and the procrastinators were managers, not programmers. Unfortunately the one thing that bad managers are usually good at is persuading the sh-one-t to land in someone else's lap.

-- Nigel Arnot (nra@maxwell.ph.kcl.ac.uk), December 14, 1998.


Actually, this has always been a cost issue. We've already discussed to death the substantial savings of time and space on the early, prohibitively expensive computers.

Unfortunately, the 2-digit-year protocol that developed was already viewed as too expensive to fix even before the space those two digits saved stopped mattering. To repair the year fields, of course, you needed to modify most of your programs, most of your data, and most of your communications protocols -- and so did everyone you communicated with!

This task was very expensive even 15 years ago, and very risky due to introduced bugs, and added no value at the time or for many years to come. Truly a lose-lose-lose proposition all around. And as a direct offshoot of Moore's law, the cost of fixing it continued to double every 18 months, as did the probable number of introduced bugs, and it STILL added nothing of immediate value.

If you'd decided to bite the bullet in 1990, you'd have needed to invent all the tools and techniques from scratch, and negotiate new protocols with everyone you communicated with. Pioneers get arrows in the back -- much better to let someone else blaze this trail and do the heavy lifting and invent the magic bullets on *their* nickel. Then you can tap into an existing, inexpensive supply of tools and expertise at your competitor's expense. Very logical, and certainly not *your* fault that everyone else figured exactly the same thing and everyone did nothing.

Besides, how hard can it be? It's just a couple of digits, the problem is well understood, and who can't find dates in the code anyway? All true, except that the extent of the problem wasn't understood -- which is easy to understand, because nobody examines a problem they think they already know all about. 01/01/00 falls over a long weekend; we'll just fix it then; it's trivial. Meanwhile: we have a 2-year backlog of IT requests, they won't give us the budget to hire the people to keep up with the requests we already have, our computers are hopelessly obsolete, all our programmers are working 50 hours a week as it is, and IT is always the first target for downsizing. Hey, you want us to do WHAT?
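When the fixing finally did start, much of the remediation leaned on windowing, which sidesteps the data and protocol changes entirely by reinterpreting the stored two-digit years in code. A minimal sketch, with the pivot value chosen arbitrarily for illustration:

    # Windowing: leave the stored two-digit years alone; pick a pivot
    # and infer the century from it.
    PIVOT = 50   # arbitrary illustrative choice; every shop picked its own

    def window_year(yy):
        # YY at or above the pivot reads as 19YY; below it, as 20YY.
        return 1900 + yy if yy >= PIVOT else 2000 + yy

    print(window_year(99))   # 1999
    print(window_year(4))    # 2004
    print(window_year(49))   # 2049 -- a date truly from 1949 now comes out wrong

Windowing touches only programs, not data files or the partners you exchange them with, which is why it was so popular. The catch is that it defers the problem to the far edge of the chosen window rather than solving it.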

-- Flint (flintc@mindspring.com), December 14, 1998.


