Here's one (lengthy) account of why testing takes so long.
greenspun.com : LUSENET : TimeBomb 2000 (Y2000) : One Thread
For the past 6 months, I have been working as a system tester on a Y2K project for a large international company. This company is, in fact, the largest employer in the world. General Motors? No. The Pentagon? No. It is Manpower, the temporary staffing agency with thousands of offices worldwide and millions of employees. Their North American Headquarters happens to be in my hometown of Glendale, Wisconsin. Visit their website at www.manpower.com if you want more information about them. Let me share with you my experiences to illustrate why testing is so important and so time-consuming...
I was brought onto the Y2K team as a consultant through Manpower Technical in March of '98. The team at the time was 4 people, including myself. Manpower uses a variety of technologies, but most of their data is sent to the mainframes here in Wisconsin. As an aside, they are working on releasing Powerbase Release 3.0, a client-server program designed to replace the Mainframe. That project, begun in '95, has been delayed, revised, scaled down, etc. The hope was to replace the Mainframe before 2000. That will not happen. The program is huge and riddled with bugs. Not little bugs, mind you, though there are many of those. Big cockroaches that cause system crashes, major bottlenecks, etc. I digress...
It took me 2 months just to get up to speed on their operations. Needless to say, it is quite complex. I could only learn the Mainframe side, and only a small part of it. They use hundreds of programs and have roughly 1 million lines of COBOL code. My responsibility initially was to create a full Y2K "testbed" of data using a variety of date scenarios and testing the full functionality of the system (touching every date-related program). This was a large task as well. Manpower had, prior to my arrival, contracted with Compuware to identify all of their date-related code, and implemented a windowing technique to fix it. 67 programs were modified, and the code was immediately put into the production environment without testing. This is common practice amongst mainframers. I digress again and I apologize. This is a long post already and I am trying to not be technical. Bear with me. I'll get to the point...
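For readers who haven't run into it, the "windowing" technique mentioned above can be sketched in a few lines. (Python here for readability; the actual remediation was in COBOL, and the pivot value below is my illustrative assumption, not Manpower's actual choice.)

```python
PIVOT = 50  # assumed pivot: two-digit years below 50 are read as 20xx


def expand_year(yy: int) -> int:
    """Interpret a two-digit year through a fixed 100-year window.

    The appeal of windowing is that stored data files don't change at
    all -- only the handful of programs that interpret the years do.
    """
    if not 0 <= yy <= 99:
        raise ValueError("expected a two-digit year")
    return 2000 + yy if yy < PIVOT else 1900 + yy
```

Once both sides of a comparison are expanded this way, '99' vs. '00' sorts correctly again, which is the whole point of the fix.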
As we became ready to test, there was disagreement about methodology. How much field data should we use? I argued that we would need to submit at least 12 sets of data to simulate monthly data. I felt this was the only way to accurately test monthly, quarterly, and annual reports. That was overruled as "too much work". Instead, one day's data was chosen, and it was aged repeatedly. The plan was to test 4 dates only (again, to save time and effort): 12/20/99 (baseline), 12/31/99, 1/1/00, and 2/29/00. Far from comprehensive, but it would suffice. Let me digress one more time...
Manpower's original stated goal concerning Y2K was to have all "mission-critical" programs tested and compliant by 12/31/98, with non-mission-critical completed by 2nd quarter of '99. In July of this year, upper management, seeking a competitive advantage, changed all the rules. Now ALL applications would be tested and compliant by 12/31/98!! No leeway, no fudge factor, just DO it!! Several other consultants were brought in to work on projects which were originally scheduled for 1st quarter of '99. No project plans were in place. No test environments or databases had been created. All of these things are necessary and time-consuming for thorough, well-documented testing.
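To make the "aging" approach described above concrete: one day's production data is shifted forward so the same records exercise each rollover date. A minimal Python sketch (the helper name is mine, not from the project):

```python
from datetime import date

# The four planned test dates; 2/29/00 matters because 2000 is a leap
# year (divisible by 400), a case some remediated code gets wrong.
BASELINE = date(1999, 12, 20)
TEST_DATES = [BASELINE, date(1999, 12, 31), date(2000, 1, 1), date(2000, 2, 29)]


def age_date(d: date, target: date, baseline: date = BASELINE) -> date:
    """Shift a record's date by the same offset that moves the
    baseline to the target test date."""
    return d + (target - baseline)
```

Every date field in the day's snapshot gets shifted by the same offset, so relative spacing between records is preserved while the whole data set "moves" to the scenario under test.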
Anyhow, once testing began, there were numerous problems. When the system date was changed on the Mainframe (this is, by the way, not as easy as changing the date on a PC--it takes approximately an hour to do), immediately all passwords had expired. No one had even thought of that! That wasted a day. It was the first of many wasted days. As the testing effort continued, there were numerous abends (program terminations) for a variety of reasons. Because only a subset of programs was being tested, the JCL (Job Control Language) needed to be modified. There are errors inherent in this process. It was determined that our department would require approximately 300 tapes to store data files. There was argument about whose budget that would come from. Meanwhile, the operations team lost their most experienced operator (20+ years of experience), as he joined our Y2K testing effort. Lots of bad blood and lack of cooperation from operations ensued. The developers also viewed this whole project as a nuisance. They were certain the code worked (as most programmers are), and they had more pressing duties every day with production fixes or problems. The lead developer (30+ years experience) missed several days when his father died. No one knew what he did, so everything ground to a halt. When we resumed, he discovered that our original baseline date needed to be changed because field offices use week-ending dates that are 2-3 weeks in the future on occasion, which meant that our baseline included dates into 2000, which we wanted to avoid. Etc., etc., etc. Bottom line? After 6 months, we are STILL in the EXACT SAME SPOT as when we started!!! OK, we've learned from our mistakes, but NOT ONE TEST has completed!!!
Keep in mind that 50% of all companies who expect to fix their code in time plan to do NO testing. Also keep in mind that many are leaving only a few months for testing.
Is Manpower unique? No. In fact, their Y2K project, complex as it may be, is peanuts compared to many other Fortune 1000 companies.
I apologize for the length of this post. I hope it has been informative.
-- Steve Hartsman (email@example.com), September 03, 1998
Views 'from the trenches' are indeed important. Many of us are trying to make one of the most significant decisions of our lives with very little reliable information to base those decisions on. Every data point helps...
-- Lee P. Lapin (firstname.lastname@example.org), September 03, 1998.
Wrong summary - six months of system-level tests have been completed! Congratulations. Now you're beginning to get to a point of testing the program itself.
Hint: go to 3 remote sites, test their input there. Have at least one site be in a different time zone.
-- Robert A. Cook, P.E. (email@example.com), September 03, 1998.
I wish I didn't KNOW that your situation is commonplace. There is a Fortune 500 company in the Philly area that is APPARENTLY very organized and proceeding well. Unfortunately, I know the quality engineer on the project, and he feels the test plans for the project's 20 million lines of code are woefully inadequate. What does he do? Rubberstamps them due to management pressure. Oh well, he IS a consultant after all...
-- Ray Givler (firstname.lastname@example.org), September 03, 1998.
Earlier, I mentioned that Manpower's Y2K project is peanuts compared to many other companies. Let me expand upon that statement...
I mentioned that only 67 of their programs were altered. Why so few? Because their fiscal year coincides with the calendar year, the current year is assumed in most cases. When dates are used, it is usually month and day only. Most companies (I don't have any idea what the percentages are) have fiscal years that do NOT coincide with the calendar year (they start in July or October, e.g.). This creates the problem of performing many more date calculations involving the year, and also leads to the "Joanne effect" in '99, which is referenced elsewhere on this bulletin board.
Again, bear in mind that this only simplifies their internal efforts. Compliance, in the full sense, means that they must have compliant suppliers and vendors. Although their goal of internal compliance by 12/31/98 is possible, full compliance will not be. For example, many of their banking transactions are handled via direct deposit. Invoices are sent via EDI. Both the banks and the EDI partners have said they won't be ready for testing (much less internally compliant) until the first quarter of '99. So, as much as claiming compliance might give them a "competitive edge", any claim of compliance will of course be called into question. This is, of course, true for every company in every industry.
-- Steve Hartsman (email@example.com), September 05, 1998.