When (Date) did the Y2K issue arise?


When exactly was the Y2K issue discovered, and by whom, where, and how?

-- Dan Salas (dan_salas@hotmail.com), November 14, 1999

Answers

I can only address my own awareness. I first became aware of the problem in Oct 1979 while attending my first class on software design principles. We all had a great laugh at the time and all was forgotten for most of the following 20 years.

-- Arnie Rimmer (Arnie_Rimmer@usa.net), November 14, 1999.

April 1985. My boss wanted to start converting all programs and make a rule that any new programs follow a 4-digit date. He left the company a year later due to family problems and we abandoned the effort. I left that company in 1988, so I don't know if they are Y2K ready.

-- fortysomething (fortysomething@here.com), November 14, 1999.

Here is a clip of an article from the Washington Post. It is an interview with Bob Bemer, inventor of ASCII (who also invented the ESC key, bless his heart).

He describes the precise moment of no return: when the US government sanctioned the two-digit practice.

Snip...

In the late 1950s, Bemer helped write COBOL, the Esperanto of computer languages. It was designed to combine and universalize the various dialects of programming. It also was designed to open up the exploding field to the average person, allowing people who weren't mathematicians or engineers to communicate with machines and tell them what to do. COBOL's commands were in plain English. You could instruct a computer to MOVE, ADD, SEARCH or MULTIPLY, just like that.

It was a needed step, but it opened the field of programming, Bemer says, to "any jerk."

"I thought it would open up a tremendous source of energy," he says. "It did. But what we got was arson."

There was no licensing agency for programmers. No apprenticeship system. "Even in medieval times," Bemer notes dryly, "there were guilds." When he was an executive at IBM, he said, he sometimes hired people based on whether they could play chess.

There was nothing in COBOL requiring or even encouraging a two-digit year. It was up to the programmers. If they had been better trained, Bemer says, they might have known it was unwise. He knew.

He blames the programmers, but he blames their bosses more, for caving in to shortsighted client demands for cost-saving.

"What can I say?" he laughs. "We're a lousy profession." . . . .

The longer a program is used, the larger the database and supporting material that grow around it. If, say, a program records and cross-references the personnel records in the military, and if the program itself abbreviates years with two digits, then all stored data, all files, all paper questionnaires that servicemen fill out, will have two-digit years. The cost of changing this system goes way beyond the cost of merely changing the computer program.

It's like losing your wallet. Replacing the money is no sweat. Replacing your credit cards and ATM card and driver's license and business-travel receipts can be a living nightmare.

And so, even after computer memory became cheaper, and data storage became less cumbersome, there was still a powerful cost incentive to retain a two-digit year. Some famously prudent people programmed with a two-digit date, including Federal Reserve Chairman Alan Greenspan, who did it when he was an economics consultant in the 1960s. Greenspan sheepishly confessed his complicity to a congressional committee last year. He said he considered himself very clever at the time. . . .

A group did adopt a written standard for how to express dates in computers.

We are looking at it now.

It is a six-page document. It is so stultifying that it is virtually impossible to read. It is titled "Federal Information Processing Standards Publication 4: Specifications for Calendar Date." It is dated Nov. 1, 1968, and took effect on Jan. 1, 1970, precisely when Brooks says the lines on the graph crossed, precisely when a guiding hand might have helped.

On Page 3, a new federal standard for dates is promulgated. . . .

Federal Information Processing Standards Publication 4, Paragraph 4 and Subparagraph 4.1, is another of those statements. Here it is, in its entirety:

Calendar Date is represented by a numeric code of six consecutive positions that represent (from left to right, in high to low order sequence) the Year, the Month and the Day, as identified by the Gregorian Calendar. The first two positions represent the units and tens identification of the Year. For example, the Year 1914 is represented as 14, and the Year 1915 is represented as 15.

Ah.

The Y2K problem.

Set in stone.

By the United States government.

FIPS 4, as it was called, was limited in scope. It applied only to U.S. government computers, and only when they were communicating from agency to agency. Still, it was the first national computer date standard ever adopted, and it influenced others that followed. It would have affected any private business that wanted to communicate with government computers. It might have been a seed for change, had it mandated a four-digit year. . . . Link:

http://www.washingtonpost.com/wp-srv/WPlate/1999-07/18/

...Unsnip

-- semper paratus (always@ready.now), November 14, 1999.
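
The failure mode invited by the two-digit year in the FIPS 4 format quoted above is easy to demonstrate. Below is a minimal sketch in C, not taken from the article and with made-up names, of a program that stores only the tens-and-units digits of the year and then computes an elapsed span from two such dates:

#include <stdio.h>

/* A FIPS-4-style six-digit date: two-digit year, month, day. */
struct fips4_date {
    int yy;   /* 00-99: tens and units of the year only */
    int mm;   /* 01-12 */
    int dd;   /* 01-31 */
};

/* Naive span calculation: assumes both dates fall in the same century. */
int years_elapsed(struct fips4_date start, struct fips4_date end)
{
    return end.yy - start.yy;
}

int main(void)
{
    struct fips4_date hired  = { 65, 7, 1 };   /* 1 July 1965 */
    struct fips4_date before = { 99, 12, 31 }; /* 31 Dec 1999 */
    struct fips4_date after  = {  0, 1, 1 };   /* 1 Jan 2000  */

    printf("service through 1999: %d years\n", years_elapsed(hired, before)); /* prints 34  */
    printf("service through 2000: %d years\n", years_elapsed(hired, after));  /* prints -65 */
    return 0;
}

The day after the rollover, the same subtraction that gave 34 gives -65, which is exactly the kind of massive error the next poster's professor was warning about.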


I cannot give a date, but when I first took comp sci as a subject in 1971, our professor made a point of insisting that we NEVER use dates in calculations. It was alright to use any time base (such as seconds, hours, days, years) and then to calculate a date from that, but NEVER were we permitted to work the other way and calculate a period of time from two dates. Although the potential errors were not called Y2K back then, we were aware that massive errors could occur.

Incidentally, the languages we learnt back then were Fortran and PL/1; Cobol was a business-only language and had no place in real computing.

Malcolm

-- Malcolm Taylor (taylorm@es.co.nz), November 15, 1999.
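
One way to read the professor's rule above: do all arithmetic in an absolute time base (a full year number, or a count of days or seconds since some epoch) and treat the two-digit form purely as a display format, never as an input to calculation. A minimal sketch along those lines, with hypothetical names:

#include <stdio.h>

/* Arithmetic uses the full year; the two-digit form is for display only. */
static int years_between(int year_a, int year_b)
{
    return year_b - year_a;            /* safe across the century boundary */
}

static void print_short_year(int year)
{
    printf("%02d", year % 100);        /* "65", "99", "00", ... */
}

int main(void)
{
    int start = 1965, end = 2000;

    printf("elapsed: %d years (", years_between(start, end));   /* prints 35 */
    print_short_year(start);
    printf(" -> ");
    print_short_year(end);
    printf(")\n");
    return 0;
}

The span is computed from the full years and comes out as 35; the truncated "65 -> 00" appears only in the printed report, where it can do no harm.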


Dan,

There are no exact answers to your questions.

Many different people became aware, at different times and places, of various aspects of the collection of calendar-related issues to which we refer as "Y2k". No one "got" the whole issue at one time. Awareness spread and grew as people communicated their ideas. A few particular communications of these ideas, such as a famous 1985 ComputerWorld advertisement that warned of problems in IBM systems at the 1999->2000 rollover, can be exactly defined because they're on public record. But mostly, as in my case, programmers gradually realized over several years that there were multiple aspects of problems with computer handling of two-digit year abbreviations that could have serious consequences if not remediated.

Personally: in 1979 I warned my boss that some of our company's software would not correctly handle the transition from year "99" to year "00". I realized, and discussed briefly with co-workers, that this was only a specific example of a general problem that would plague date-handling software universally. But neither I nor many other people foresaw how ubiquitous computers would be in our lives by 1999.

-- No Spam Please (nos_pam_please@hotmail.com), November 15, 1999.



Dan,

There are no exact answers to your questions. Many different people became aware of various aspects of the collection of calendar-related computer problems that we call "Y2k" at different times and places in various ways. No one "got it" all at one time. As people communicated their ideas about this, awareness grew and spread.

A few particular communications, such as a famous 1985 ComputerWorld advertisement warning that IBM systems would fail to handle the 1999->2000 transition properly, can be exactly defined because they are on public record. But mostly it was a matter of individuals' stumbling upon certain aspects of the issue and realizing that it was more than an isolated glitch.

Personally: When I started programming in the 1960s it so happened that one of my first programs involved date calculations over many decades (for astronomy) and I realized that it needed to store years as whole numbers, not just two-digit abbreviations. But I didn't make the connection that two-digit year abbreviations would cause trouble in many types of software all over the place until 1979. In that year I discovered, and warned my boss, that some of our company software would fail to correctly handle the transition from year "99" to year "00". (His response was basically that he expected to retire before then.) My coworkers and I briefly discussed the issue of potential two-digit year number problems in date-handling software everywhere, but none of us foresaw how ubiquitous computers would be in everyday life by 1999.

-- No Spam Please (nos_pam_please@hotmail.com), November 15, 1999.


Whoops. Sorry for the double posting. There was a glitch on my end when I submitted the first posting. When I checked whether it had gone through, it did not appear in the thread. I not only tried reloading the page, but I even went to the trouble of disconnecting, then signing on again with a different browser! When the thread still did not show my first posting, I was sure it had been lost. Imagine my dismay when I saw both postings after entering the second one! Ack! :-(

-- No Spam Please (nos_pam_please@hotmail.com), November 15, 1999.
