Bill Schoen and the 1984 article on Y2K
greenspun.com : LUSENET : TimeBomb 2000 (Y2000) : One Thread
I've read before that an article in 1984 was one of the very first about Y2K, even before de Jager's 1993 article "Doomsday 2000". The article from 1984 was in Computerworld and was about a programmer named Bill Schoen.
I didn't know there was a link to this article anywhere. But I just found one a few minutes ago...
The problem you may not know you have
By Paul Gillin
Computerworld, Feb. 13, 1984
There is a bug in every Cobol program in your library. You probably don't know about it, and you almost certainly haven't had to worry about it yet. But when it shows up, it will hit your entire shop, reducing data entry procedures to a mush of error messages.
NOVI, Mich. -- When systems analyst William Schoen has presented DP managers with that pitch, he has understandably piqued their interest. But he has had trouble getting them to take his case seriously when they find out what the bug is.
The problem is the year 2000. The turn of the calendar presents a procedural issue that is acknowledged occasionally in trade conference jokes and DP shop banter but has attracted little serious attention in the industry.
The root of the issue is the industrywide standard of using two-digit date fields in Cobol programs. Error-checking procedures typically rely upon dates proceeding sequentially, with one year's date being greater than that of previous years.
However, programs will be thrown for a loop after the turn of the century when they have to cope with the two-digit date field "00" being greater than "99." Schoen ticked off the problems that will arise if the issue is not dealt with: "Program logic is going to malfunction all over the place. Sequential data processes are going to abend. A great many on-line modules won't accept the new dates because they include sequence checks. Most sorts won't work, and a great many literals won't work."
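The failure mode Schoen describes can be sketched in a few lines. This is a minimal Python illustration of the kind of two-digit sequence check he is talking about, not the era's actual COBOL:

```python
# Minimal sketch of a two-digit-year sequence check: each record is
# accepted only if its year is not lower than the previous record's.
# Python stands in here for the COBOL logic described in the article.

def sequence_check(yy_dates):
    """Return the dates a naive two-digit sequence check rejects."""
    rejected = []
    prev = None
    for yy in yy_dates:
        if prev is not None and yy < prev:
            rejected.append(yy)  # "out of sequence" -> error or abend
        else:
            prev = yy
    return rejected

# Works fine through the 1990s...
print(sequence_check(["97", "98", "99"]))        # []
# ...but at the rollover, "00" compares lower than "99", so every
# post-1999 record is flagged as out of sequence.
print(sequence_check(["98", "99", "00", "01"]))  # ['00', '01']
```

Once the check latches onto "99" as the running maximum, every subsequent two-digit year fails, which is exactly the "mush of error messages" the article opens with.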
Perhaps understandably, the issue has received little serious attention from a DP community that is concerned with more immediate problems. But Schoen claims data processing should start paying attention now to the problem of making programs "year 2000-compatible" in order to avoid headaches when the time of reckoning draws near.
He reasons that most large firms maintain a library of thousands of Cobol programs, most of which have at least one date field. In addition, many of those programs have literals, sequence processes and error checks interspersed throughout, meaning they will require modification in several places.
"You can't assume every date field has the word 'date' in it," Schoen said. "You also can't assume every field that looks like a date field is a date field. You're going to have to look at every program and figure it out."
For that reason, he said, the "black box" approach of simply running every program through a modification routine is impractical. Some date fields are bound to be missed. The only real solution is to write all new programs to be year 2000-compatible.
Not surprisingly, Schoen has developed a method to do that. The Charmar Correction is a package consisting of two Cobol subroutines that can be inserted into new programs to resolve the problem. The $995 purchase price also includes an analysis of the problem and a methodology to deal with it.
PROGRAMS MADE TO WORK
"They make programs work right, and they include directions to make sorts simple," he said.
Schoen has done some calculations to show why it makes sense to tackle the problem today. He figures it costs $300 to modify a program, and there is a minimum of 50 programs per programmer in the average corporate library. Assuming that half of those programs will need to be modified, that comes to a cost of $7,500 per programmer if the conversion is left until the last minute.
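Schoen's back-of-the-envelope figure is easy to verify from the numbers given in the article:

```python
# Schoen's estimate, using only the figures stated in the article.
cost_per_program = 300        # dollars to modify one program
programs_per_programmer = 50  # minimum library size per programmer
fraction_needing_fix = 0.5    # half the programs are assumed affected

cost_per_programmer = (cost_per_program
                       * programs_per_programmer
                       * fraction_needing_fix)
print(cost_per_programmer)  # 7500.0
```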
Based on scans of existing libraries, Schoen has decided that if firms begin implementing the changes now, by the year 2000 less than 2% of their programs will require modification. If they wait just six years to begin the process, the figure balloons to 25% to 30% of their library.
"Why should companies continue to write software that is not year 2000-compatible when it's just as easy to do it the right way now?" he reasons. Not many DP managers have bought his argument so far. He has been escorted out of buildings by security officers more than once, Schoen said.
-- Kevin (firstname.lastname@example.org), March 04, 1999
"The voice of one crying in the wilderness."
-- Winston Churchill in the mid-1930s
Gen. William T. Sherman lamenting that the Civil War would produce rivers of blood, at a time when everyone else thought it would last no more than 90 days.
It sends a chill up my spine when a prophet is proven right. Schoen joins the list.
-- rick blaine (email@example.com), March 04, 1999.
I haven't saved the link, but some time ago I read that a group that included the original creator of COBOL petitioned the government to pass legislation somewhere around 1970-1972 requiring that the two-digit date field be changed then. Apparently the petition got lost in the shuffle of a president undergoing impeachment.
Meanwhile, one of yesterday's articles, Mike Berman's Tech Talk, indicates that there may be further problems with code thought to have been remediated.
-- Rachel Gibson (firstname.lastname@example.org), March 04, 1999.
You're probably talking about the article you can see at this link:
* * * * * * * * *
. . . Harry S. White Jr., a young data elements code specialist at the Defense Department, argued for using all four numbers. . . .
"The light went on for me in the early 1960s," he said. "I could see many applications by necessity needed a four-digit year. Personnel, medical records, retirement. When you do those with two digits, it will cause considerable problems. . . . was not too popular back then."
Two more numbers. Two extra keystrokes. If the Pentagon had agreed, one of history's strangest calamities could have been avoided.
Instead, at the Pentagon's urging, the first federal data processing standard for calendar years called for a two-digit year, leaving millions of computers without the ability to comprehend the turn of the century. . . .
The first article alerting computer programmers to the millennium bug - "Time and the Computer" in the February 1979 issue of Interface Age - was written by Dallas computer pioneer Bob Bemer. . . .
Edward Yardeni, chief economist with Deutsche Bank Securities in New York, also assails management.
"Where are the editors? Anyone can write software, but nobody edits it to conform to a grammar," he said. "We've put together a global network in the last 20 to 30 years without any adult supervision."
The Defense Department recognized the need for a computer grammar. It was the largest user of computers in the world, and many of the conventions it established became standard industry practice.
The Pentagon convened a Conference on Data Systems Languages in the late 1950s that produced COBOL, the Common Business Oriented Language.
Mr. Bemer, then with IBM, was one of the designers. His COBOL Picture Clause allowed programmers to use either four- or two-digit years for calendar dates.
Computers also needed a way to translate data into numbers as binary code. In 1960, Mr. Bemer developed the American Standard Code for Information Interchange.
ASCII and COBOL were two major standards that spread to computer users around the world, with the Pentagon leading the way. Less noticed was an effort to determine how programmers should enter data elements such as units of measure, time and dates.
A committee was formed to draft data standards for what is now called the American National Standards Institute.
Mr. White, Mr. Bemer, Washington management consultant Vico Henriques and 45 other computer specialists in government and industry tried to write a grammar the computer world would accept.
Mr. Bemer developed the committee's scope of work. Mr. Henriques was the secretary. Mr. White became chairman of a subcommittee on Representations of Data Elements.
At first, the millennium meltdown was not the big worry. But some committee members were thinking about two calendar anomalies. . . .
It took the committee 10 1/2 years to reach a consensus. But the federal government could not wait.
President Lyndon Johnson's Bureau of the Budget issued a directive in 1967 calling for data processing standards. Circular No. A-86 said the National Bureau of Standards, with input from affected agencies, would write the rules. Exemptions would have to get approval from the Budget Bureau.
At the time, the National Bureau of Standards did not have a strong data processing branch. The Defense Department had done the most work.
Mr. White remembers those days as a time when he was on the losing side of several Pentagon arguments.
"There were a number of meetings where we discussed the [calendar] standard" and whether four digits or two were the way to go, he said. "This was always a bone of contention."
Mr. White sent a résumé to the National Bureau of Standards and was hired as a senior systems analyst.
Even though he was now in a position to author a standard requiring the four-digit year, he was in no position to prevail over the Pentagon's wishes.
The federal standard for calendar dates was issued on Nov. 1, 1968, as Federal Information Processing Standards PUB 4. Mr. White won the debate over sequence. Federal programmers were to enter dates starting with the year, followed by the month and then the day. But the standard called for a two-digit year.
"The first two positions represent the units and tens identification of the Year. For example, the Year 1914 is represented as 14, and the Year 1915 is represented as 15," FIPS PUB 4 specified.
The standard took effect Jan. 1, 1970.
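The FIPS PUB 4 layout, and the ambiguity it left behind, can be illustrated directly. The "century window" shown here is a remediation heuristic commonly used decades later; the pivot value of 70 is an arbitrary choice for this sketch, and the 1968 standard itself defined no such rule:

```python
# Illustration of the FIPS PUB 4 date layout (YYMMDD, two-digit year)
# and the kind of "century window" later used to repair such dates.
# The pivot of 70 below is an assumption for this sketch only.

def parse_fips4(date6):
    """Split a YYMMDD string into (yy, mm, dd) integers."""
    return int(date6[0:2]), int(date6[2:4]), int(date6[4:6])

def windowed_year(yy, pivot=70):
    """Map a two-digit year to a full year: pivot..99 -> 19xx, else 20xx."""
    return 1900 + yy if yy >= pivot else 2000 + yy

yy, mm, dd = parse_fips4("140728")  # per FIPS PUB 4, "14" means 1914 --
print(windowed_year(yy))            # but a windowed reading yields 2014
```

The point of the example is that nothing in the six stored characters distinguishes 1914 from 2014; any resolution has to be imposed from outside the data.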
Mr. White now feels it was a calamitous mistake.
"I wish we could change things," Mr. White said. "But that was the political scene at the time. You couldn't go forth with a federal standard if you did not have approval or agreement from the Department of Defense."
Many industries - banks, securities, insurance - also objected to the four-digit standard, Mr. Henriques said. But if it had become the federal government's standard, any company doing business with Washington would have been compelled to follow suit.
"If you tell GM it's got to be four digits in all your dealings with the government, GM will shrug its shoulders and go along," Mr. Henriques said. . . .
Gary Fisher is a computer scientist at the National Institute of Standards and Technology (the current name for the old National Bureau of Standards) and heads efforts to coordinate ways to fix the Year 2000 problem.
He said two keystrokes may not seem like a lot today, but they made a difference in the 1960s.
"Punch cards only had 80 columns, only had room for 80 characters," he said. "Anywhere you could save room was appreciated. Two digits off a year made a big difference."
Mr. White did not give up. The federal standard recognized the need for numbers extending into the next century and allowed the use of a four-digit year.
He said he hoped to prevail in a rematch with the Pentagon.
Yet in the peculiar world of American standards, winning means reaching a consensus. The American National Standards Institute brings together as many major users in government and industry as it can in developing standards. The end product is more guidance than rule. The standards are voluntary.
Mr. White's committee, with himself and Bob Bemer in the lead, proposed a four-digit year.
The Department of Defense balked.
"The Defense Department was protesting out of fear that, if it became the government standard, it would be enforced," Mr. Henriques said. "Everybody was trying to crank things down so they could get more bang for their buck."
It seemed to Mr. White and Mr. Bemer that the computer world was racing ahead of its nominal masters in government and industry.
Mr. Bemer recruited 86 technical societies to appeal for a federal "Year of the Computer" proclamation.
"The computer industry was exploding like crazy," Mr. Bemer said. "We said, 'We don't know what we're doing. Let's pause and take a look.' I was going to use this as a platform to sell the four-digit year."
In 1970, President Nixon refused. Mr. Bemer said he never learned why. . . .
In July 1971, the standards subcommittee published ANSI X3.30-1971, a compromise standard on calendar years. Four digits were preferred, but programmers could drop two if they wished.
A similar standard was adopted in Geneva by the International Organization for Standardization. But the two-digit year was already the norm. . . .
Mr. White left the National Bureau of Standards in 1981 and put together a software company for churches. "We did birthdays as part of the system, so we used four-digit years," he said. "All those programs are Y2K compliant."
He became an administrator at West Virginia University, retiring in Morgantown in 1997. Today, at 64, he is West Virginia president of the Bible-distributing Gideons International.
He said he is still amazed at how loosely the information technology industry treats standards.
"We saw the problem coming, certainly in the 1980s, and we knew about it in the 1970s," he said. "You look at prescription drugs, and the hard rules there are for standards, and you realize something with teeth in it should have been done with this."
The National Institute of Standards and Technology revised the federal standard in 1988, warning of the millennium bug and saying a four-digit year "is highly recommended."
Last year, ANSI revised its standard, calling flat-out for a four-digit year. As usual, compliance is voluntary.
-- Kevin (email@example.com), March 04, 1999.
Thanks for the above but, no, I've spent very little time at GN's site. Have spent many hours elsewhere, and it was long ago and in a very different kind of site that I found the article. Sorry...wish I could remember.
-- Rachel Gibson (firstname.lastname@example.org), March 04, 1999.