"Machines Need Designers", or "Why I love Brazil"

greenspun.com : LUSENET : TimeBomb 2000 (Y2000) : One Thread

Dick Mills' latest "Lessons Learned" column. This first Lessons Learned article is the most esoteric and profound of the series. That's why I put it first in line.

Since man first started using tools, our path has been to make more tools, more machines, and to automate more things. Lately, the rate of change has accelerated to a breathtaking pace, mostly thanks to information technology (IT). IT has also allowed us to communicate much better and, through that communication, to forge interdependencies that make the world more efficient and our machines more autonomous.

Despite our marvelous machines, it is fair to say that the world has, until now, been people-dominated, not machine-dominated. I say until now because I interpret Y2K as the first early warning sign that we are approaching the critical point where that dominance begins to yield. We have mishandled some aspects of our information technology, and the result was Y2K. If we don't reform, Y2K won't be the last time we foolishly shoot ourselves in the foot on a grand scale. We have been dancing on the rim of a canyon without even knowing there was a canyon in the vicinity.

In recent decades, we realized that world population has reached the point where we could no longer pretend that the planet was infinitely big. Pollutants dumped down the drain don't disappear in an infinitely big ocean. We had to begin considering interrelationships and interdependencies that had been considered negligible. I suggest that we are coming to the same point with respect to technology and that Y2K is the clearest possible warning signal we're likely to get.

"So what?" you say. "What specifically can we do differently?" As modern society becomes gradually more like a well-oiled machine, we are going to have to start treating it more like a machine than a social system. Social systems evolve via politics whereas we design machines. I use the word machine loosely to include hardware, software and automation.

The essential difference between design and politics is that politics is called the art of compromise, whereas design hates compromise. When two or more people in a design committee disagree, one is most likely right (or most right) and the others are wrong. The job of the committee is not to make compromises or to be fair, but to find out what's right and what's wrong and to pass judgment. In design, people don't matter, nature matters. The natural world is the machine's operating environment.

Folklore says that the best design comes from the cohesive vision of a single person, and that design by committee is the worst kind. In reality, many engineering companies have learned to produce very good designs using large teams while avoiding the design-by-committee pitfall; Boeing's design of the 747 is an example that comes to mind. Nevertheless, design can't be democratic. It must evolve from an authoritative structure.

I interpret the Y2K crisis as a design problem. I am not referring to the design of date handling in programs, but rather to the design of interdependencies and risks and quality assurance in our modern automated world. The problem isn't an error, but rather that our top-level systems aren't designed at all. Instead, countless engineers, programmers and managers make individual decisions and the whole is the result of collective chaotic efforts. Nobody has the authority to make the technological decisions for enterprises, industries, nations or planets.

Imagine, for example, a team of food scientists deciding that improvements in health care are stretching the design limits of our ability to care for the elderly, and that medical research must therefore temporarily cease. Preposterous. We don't even have mechanisms that allow one technical discipline to look over the shoulders of its brethren.

I further suggest that Y2K is only the first of many global-scale technological disasters we may cause if we don't reform our methods. Another pending disaster is the impermanence of digitally stored information, because we keep revising our hardware and software. True, librarians and archivists are working hard to find a solution, but they haven't found one yet, and they don't have the design authority to impose it on everyone when they do find one. Meanwhile, more and more records become irretrievable every day as the technology that wrote them becomes obsolete. The 1980s and 1990s may turn out to be the worst-recorded decades since the invention of writing. Readers, I'm sure, could suggest other pending disasters.

Engineering design methods require near dictatorial powers. They are certainly not our preferred methods to run this world. Our world is still people-oriented. Free enterprise, self-interest, family, democracy, fear and politics are the mechanisms we use to organize and grow our world.

My conclusion, after considering the implications of Y2K, is that adapting to a machine-dominated world implies designing the future rather than evolving it. It is certainly doable but the implications are frightening. Design implies central authority and loss of freedom. Design is antidemocratic. We could call it the politicization of technology, or the dehumanization of society. Either way, that leads me to the core of this lesson:

When technology and automation grow to the point where the planet becomes a limiting environment, it takes on the characteristics of a large complex machine, and it needs to be handled as a machine or else failures like Y2K will recur. The most direct remedy is to force a merger of engineering and politics. The present inhabitants of this planet are likely to find that unacceptable. Therefore, the advance of technology will have to bend to people, not vice-versa.

I find that personally upsetting, because I'm libertarian in my political views, almost to the point of being anarchist. I abhor central control. I'm also a technophile, and I've spent my life trying to use technology and automation to make the world a better place. It never occurred to me before Y2K that these philosophies conflict. In the future I'll have to be less quick on the trigger to brand anyone less enthusiastic about automation than I am as a Luddite. I may become one of them myself.

So, what are our choices? I see four paths.

First, we can choose the "pedal-to-the-metal" advance of technology and make whatever changes that implies to our social systems.

Second, we can try to turn our backs on technology. I don't think we would succeed. One can never turn the clock back.

Third, we can ease up on the gas in one or more ways, without becoming technophobes. Demanding quality and caution, and making IT more routine may be sufficient. Upcoming articles in the Lessons Learned series will focus on a number of such points. If adopted, they would require us to slow down a little.

Fourth, we can ignore the warnings and continue with no planning and no coordination. In other words, bury our heads in the sand. This is the cynic's choice for a future. It results in a modern yet dysfunctional world.

I was going to suggest a science fiction allegory to illustrate a future vision corresponding to each of the four paths. I decided against it because doing so might weaken my case by overstating it, but there's one I can't resist. I am a long-time fan of Terry Gilliam's Brazil. It was voted best film of 1985. When I first heard about Y2K, Brazil was the first image that came to mind. I'm convinced that it portrays (surrealistically) our future if we hold the present course (path four). If you haven't seen it, go out and rent it.

====================================================================

To this day, I consider Brazil one of my all-time favorite movies. I'm not sure who voted it "best film of 1985", however; it didn't win any Academy Awards to my knowledge. Until now, though, I hadn't considered the Y2K connection. Thanks for the great column, and the "Brazilian Connection". Rent it. You'll either love it or hate it; there seems to be little middle ground (much like Y2K, I guess).

-- Steve (hartsman@ticon.net), July 08, 1999



Very cogent statement. Consider our rapidly evolving biotechnology sector and its eventual melding with the machine/digital age. Y2K is just the first of a series of tech earthquakes on the horizon. It may not even be very important in the larger scheme.

-- RD. ->H (drherr@erols.com), July 08, 1999.

I agree totally with Dick Mills' statement. All of us, from government leaders and CEOs on down, have tolerated bug-filled computer systems since their inception 50 years ago. We take it for granted that a new version of a product like Office 2000 will have many bugs and that all of the bugs will never be removed. We would never allow such defects in the airplanes that we fly on.

The demand in software has always been for the newest and snazziest. It has never been for the highest quality.

-- Mr. Adequate (mr@adequate.com), July 08, 1999.


Have you read "The Coming Anarchy" from the Atlantic Monthly several years ago? It fits right in with your post above.


-- Old Git (anon@spamproblems.com), July 08, 1999.

Actually, I found this piece very disappointing. It's sad that someone who is "libertarian in my political views, almost to the point of being anarchist" is promoting this central control/political control approach.

First, it's pointless, since there's no way to control all the tech industries in the U.S., let alone worldwide, in any significant fashion.

Second, it's pointless because the Y2K problem did not come about because of individual programmer sloppiness. It was a conscious design decision on many older systems, done to save space. The decision to continue to upgrade rather than replace the old software was also an explicit management action, not the result of sloppy programming.
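The space-saving design decision described above can be made concrete with a small sketch (hypothetical code, not from any actual legacy system): arithmetic on two-digit years works fine within one century but breaks at the rollover, and the common "windowing" remediation recovers four-digit years by choosing a pivot.

```python
def age_naive(birth_yy: int, current_yy: int) -> int:
    """Two-digit year arithmetic: correct only within a single century."""
    return current_yy - birth_yy

def expand_yy(yy: int, pivot: int = 50) -> int:
    """Windowing fix: map 00-49 to the 2000s and 50-99 to the 1900s.
    The pivot of 50 is an illustrative choice; real systems varied."""
    return 2000 + yy if yy < pivot else 1900 + yy

# Someone born in 1960, checked in 1999: the two-digit math is right.
assert age_naive(60, 99) == 39
# The same check in 2000 (year stored as "00"): the age goes negative.
assert age_naive(60, 0) == -60
# With windowing, four-digit years restore the correct answer.
assert expand_yy(0) - expand_yy(60) == 40
```

Note that windowing is itself a compromise: it merely moves the ambiguity to the pivot year, which is why it was a stopgap rather than a cure.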

Third, the feds had a chance in the 70's to make 4-digit years the standard, which would have gradually moved the industry away from this practice, and they did not do it -- due to cost objections from DoD. So what makes him think that a politically controlled system would make the correct decision more often than the current decentralized one?

Fourth, Y2K is nearly unique. All systems have bugs, but Y2K will hit hard because it is widespread and simultaneous. The only comparable thing I can think of would be a complete crash of the Internet due to a virus.

He doesn't mention it, but there will probably also be calls for licensing programmers. This also won't help -- the Y2K bug was not a failure of technique, but of design. There are certainly bad programmers out there, but we all produce buggy code that has to be tested. And we all make assumptions that can prove to be invalid. Licensing programmers to prevent Y2K is like licensing authors -- you might prevent spelling errors from making it into print, but not bad books.

If Y2K is more than a bump in the road, look for more of this kind of talk, as people look for someone to blame. Personally, I don't know why they are blaming programmers anyway. In my experience, programmers are much better at what they do than managers are at management! And when you see the mismanagement of some of these large government projects, like FAA or IRS bids, I think that everyone involved should be fired.

He is right that for critical operations we might have to raise our standards a bit, but that is a matter for individual companies or government organizations. Tying the software industry up in regulation will just ensure that one of the few healthy industries left in the U.S. also moves overseas.

-- Michael Goodfellow (mgoodfel@best.com), July 08, 1999.
