PRODIGY, AOL, COMPUSERVE, ETC. Y2K THREATENED? COMPLIANT?


Sorry if this has already been answered elsewhere, but I didn't find it. If power and phones survived, does anyone know whether the Internet service providers would be up and running? Any word on how Y2K affects them and how they are progressing, if it does? Also, will non-compliant customer computers endanger their service, or the websites that the non-compliant computers visit?

-- Ann Fisher (zyax55b@prodigy.com), December 26, 1998

Answers

Not a simple question to answer. I do not know the Y2K compliance status of AOL, Compuserve, et al. However, we are all in the same boat. That is, it is not enough for a company to be internally compliant (the definition I'm choosing to use here simply means that enough of their internal systems are functional to provide continuity of service -- all other factors being ignored). Their 'mission critical' vendors of products and services must also be compliant (same definition).

Companies like AOL (like most companies) are 100% reliant upon their vendors. Imagine a scenario where electricity and telecommunications stay on but banking has serious disruptions. They process thousands of credit card transactions daily. If this system fails, how long can AOL continue to operate? I would imagine they could get by for at least a few days. Maybe alternate sources of banking services will be available. Maybe not. Maybe they could even survive a complete operational outage for a few days. Customers would probably not leave them in droves if they could reasonably be expected to be back online within a week or two. On the other hand, if AOL were to experience a 3-month outage while Joe's ISP down the street was humming along fine, well, I think you know what this would mean.

Imagine another scenario where electricity stays on everywhere EXCEPT where their major operations headquarters is located (Virginia, say). If this were an isolated outage of short duration, you're looking at a minor inconvenience. If the outage were prolonged, then other arrangements would need to be made. Moving a major operations center is a huge job even during 'normal' times. It might be impossible during Y2K. If a prolonged outage occurs in an isolated region (say the DC and Virginia area), EVERYONE will be attempting to move their operations (if only temporarily) to a more viable area.

Another scenario can be imagined where electricity, telecom and banking all continue to function, but where other problems have occurred, resulting in civil unrest that makes it dangerous for employees to come to work. No employees, no service.

There are just too many unknowns here to make prognostications about a specific company. Certainly the 'iron triangle' (electricity, telecom, banking) is absolutely essential. Lose any of the three for a prolonged period and all bets are off. But there are numerous other problems which could occur short of that, either regionally or locally, which, while not catastrophic to the general population, could indeed be fatal to businesses in that location.

If one telecom provider in my area fails and another can provide service, and if the failed one cannot seize its window of opportunity to recover, the business whose service didn't fail will reap a harvest of new customers. Of course, if thousands of customers in a single location suddenly want to switch providers, it could still take months to get your phone service restored. I think the same thing holds true for the large ISPs as well.

The future of any one single company which is itself reasonably compliant (internally) will be determined by the compliance of its vendors, the regional problems which actually occur, the size of the company, the number of outside vendors it uses, the availability of alternate sources for products and services, the market it serves, the reaction of its customer base, its own contingency planning, the reaction of the general population in its theatre of operation, the reaction of the government, how well it is organized, how well it can execute contingency plans within its window of opportunity, etc., etc., etc. It is an equation with 10,000 variables and 1 million unknowns. This is the ugly, intertwined nature of the beast.

All other things being equal (which they are not), I think the larger, more technologically dependent companies will see the most trouble here. It's an economy-of-scale thing. If my business consists of 3 people and 6 mission-critical suppliers, and I can move one county to the north and continue operating, I'll have a much better chance of doing so quickly than will a company with 100,000 employees and 5,000 suppliers. The problem here is that many small and medium-sized businesses have adopted (either explicitly or implicitly) a 'fix on failure' approach to Y2K. Some larger businesses have been addressing the issue for longer (though not nearly long enough). Still, it's much, much easier to turn a small bass boat quickly than an aircraft carrier. Ask IBM.

-- Arnie Rimmer (Arnie_Rimmer@usa.net), December 26, 1998.


# # # 19981226

Dear Ann Fisher:

My last Y2K project ( 3 months' duration ) involved ~$300,000,000 in revenue dependent upon CompuServe connectivity. My client declined ( IOW: brushed off [ Troll Maria ] ) stern warnings about CompuServe not responding adequately regarding their corporate Y2K readiness, and about the electric ( Detroit Edison - DTE ) and telephone utilities' SEC 10-Q's.

I was ( ethically ) compelled to include these documents and my opinions ( captured in my notes from meetings and telephone conversations ) for the Y2K-archives of this Fortune 500 company. One can only do so much.

What more does one have to say ( document ) in order to demonstrate corporate "don't want to hear it" shortsightedness? It is anathema to spoil the "smiley face," "rosy outlook," DWGI view of the fortunes of Fortune 500 stock values.

I can only *sigh* at management's ineptitude in perceiving ( through denial ) the real Y2K risks to their business and employees.

Regards, Bob Mangus # # #

-- Robert Mangus (rmangus@mail.netquest.com), December 26, 1998.


Think about it: some ISPs provide lousy service over dial-up as it is. What happens if the phones go down in certain areas, there's no postal service, power is intermittent, etc.? Any ISPs that stay up will be swamped by users, IMHO. Ergo lock-up; things grind to a halt. Remember AOL 2 years ago??? Try phoning abroad today, Xmas day??? No chance.

-- Andy (2000EOD@prodigy.net), December 26, 1998.

Folks, you might also consider the fact that if communications become at all dicey, the fedgov has the capacity to demand priority access to (i.e., grab control of) any and all communications resources it needs... even if power and communications stay up in some areas, I wouldn't expect the general public to continue to have unlimited access to the net... especially when one considers how far behind both DOD and State are on their remediation projects...

Arlin

-- Arlin H. Adams (ahadams@ix.netcom.com), December 26, 1998.


(Warning -- Long Post)

This is a long and excellent article that appeared recently in a Silicon Valley newspaper, explaining how the Internet works. The news-media links are gone now. -- Diane

http://www.mercurycenter.com/premium/front/docs/datariver129.htm

Published Sunday, November 29, 1998, in the San Jose Mercury News

Journey to a Bay Area center of the Internet reveals cooperation and rivalry

THE man from the phone company has all the answers, the official line on everything from dial tones to digital subscriber lines. But on this day he comes up empty.

``I think I know where it is. I'll have to make some inquiries.'' A pause. ``You want to go there?'' Yes. ``I don't know. Are we allowed to show you what's behind the curtain?''

Behind the curtain. It's a little joke.

Where the writer wants to go, there is no yellow brick road, no Emerald City. There is, however, a wizard. Somebody, somewhere, is pulling the wires at the center of the Internet.

For all but a handful of the 70 million people who use it, the Net is a conceptual space -- that ``other'' place that lies beyond the screen and somewhere down the wire. Because it is oblique, dispersed and almost unfathomably complex, it's known only through metaphor -- data railroads and information superhighways, big pipes and plumbing. Even the wizards -- the engineers and technicians who make the network run -- fall back on metaphor. Where does the data really go? When they draw it on a white board, after a certain point the data just disappears ``into the cloud.''

But the Internet is not an ethereal construct. It has a tangible body that exists anywhere electronic devices talk to each other over public networks in a digital language called TCP/IP. It is as close as your home computer, your modem and perhaps even your cell phone. And as distant as a satellite spinning 22,000 miles above the Earth.
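As a tiny, concrete illustration of that digital language (a minimal sketch, not from the article; the host name and port are placeholders, and the request shown happens to be HTTP, just one of many protocols carried over TCP/IP), one machine can strike up a TCP/IP conversation with another in a few lines of Python:

import socket

# One machine speaking TCP/IP to another: open a connection, send a few
# bytes in an agreed-upon protocol (plain HTTP here), and read the reply.
HOST = "example.com"   # placeholder remote machine
PORT = 80              # conventional TCP port for a web server

with socket.create_connection((HOST, PORT), timeout=10) as conn:
    conn.sendall(b"HEAD / HTTP/1.0\r\nHost: example.com\r\n\r\n")
    reply = conn.recv(1024)                        # first kilobyte of the answer
    print(reply.decode("latin-1", errors="replace"))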

As for that hidden middle space, it's not really all that hidden. To see the physical core of the Internet, stand at the corner of University Avenue and Bryant Street in downtown Palo Alto. It's right there: between the drugstore and the bakery. There is no sign to identify the three-story granite-and-sandstone building as ``The Middle of the Internet.'' There's no sign at all for 529 Bryant St., just a set of 8-inch-tall brass letters that spell out ``Digital.''

A Bay Area access point

This is Compaq Computer's Palo Alto Internet Exchange (PAIX), one of approximately 75 network access points around the globe where the great data rivers of the Net converge. Three of the 12 major U.S. network access points, also known as public Internet exchanges, are in the Bay Area.

In addition to Compaq's Palo Alto facility (Compaq acquired Digital Equipment in June), Silicon Valley's other hubs are the Pacific Bell Network Access Point (with equipment spread over six cities) and MCI WorldCom's Metropolitan Area Ethernet installation in downtown San Jose, known as MAE West.

Any similarities between the Palo Alto Internet Exchange and an ordinary office building end at the first-floor lobby, where visitors and clients who have business at the exchange sign in, receive ID badges and wait for their escorts. Everyone, without exception, is accompanied by a Compaq employee at all times. Laura Hendriksen, general manager of the exchange, doesn't usually do escort duty. But on this day, she's there to usher a writer through one, two, three, four layers of security doors and, finally, into the heart of the Net, one level below the street.

Like many people in the networking business, Hendriksen is not especially given to poeticism. Hers is a world of pipes and peers, feeds and speeds. Yet as she stands before the last locked door that leads to the equipment cages at the center of the exchange, she talks of the historic resonance the 70-year-old building holds for her.

``Most of the people who work here know that this was a telephone company central office,'' she says. ``It was the center of things back then and that's the way it is again. It's as if we've brought the building back to its historical roots. We think that's cool. And the customers do, too.''

Cool is precisely what the exchange's designers had in mind when it was built in 1996. Other Internet exchanges were dank, depressing vaults, with row upon row of equipment racks jammed into cages built of cyclone fencing and two-by-fours. PAIX would have equipment cages, too -- but they would be aesthetically correct. The lights snaking along the open ceiling would be museum-quality pin spots. No longer would the Internet's heart resemble the boiler room of a steamship. No, this would be the first-class deck, right down to the highly buffed blond wood of the doors and the sleek fit and finish of the cages.

Inside a vast machine

But no expanse of fine wood or filigree could make a visitor forget that he or she is inside a vast machine. For those unaccustomed to the highly regulated (temperature-controlled, dust-filtered, video-monitored) world of the exchange, the most striking aspect is the sound, the low hum of a thousand tiny equipment fans.

To understand the dynamics of what goes on in the cages, it's best to fall back again on metaphor. The Palo Alto Internet Exchange, like all other network access points, functions as a hub airport for data. Internet service providers from across the country and around the Pacific Rim are here. These firms (which sell Internet connections to consumers and, sometimes, to other Internet carriers) pay between $2,500 and $80,000 a month for space on the tarmac and baggage-handling services. Compaq's job is to facilitate the transfer of cargo from one carrier to the next. The exchange requires that each client have a speedy ramp (at the very least, a 10-megabit-per-second Ethernet port) to the shared central switch that connects all carriers. Companies can also strike side deals to route data directly between their cages, bypassing the central switch.
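To make the airport metaphor a little more literal (an illustrative sketch only, not from the article; the tenant names and port speeds are invented, while the shared GIGAswitch is the central switch the article describes), the exchange can be modeled as tenants that all reach each other through the central switch unless a pair has paid for a private cross-connect:

# Illustrative model of traffic paths inside an exchange like PAIX.
# Tenant names and port speeds are invented for the example.
SHARED_SWITCH = "central GIGAswitch"

port_speed_mbps = {"BigBackbone": 155, "RegionalNet": 45, "JoesISP": 10}
cross_connects = {frozenset({"BigBackbone", "RegionalNet"})}  # private side deals

def path(src, dst):
    """Hops a packet takes between two tenants' cages."""
    if frozenset({src, dst}) in cross_connects:
        return [src, dst]                    # direct cable, switch bypassed
    return [src, SHARED_SWITCH, dst]         # default: through the shared switch

print(path("BigBackbone", "RegionalNet"))    # ['BigBackbone', 'RegionalNet']
print(path("JoesISP", "BigBackbone"))        # ['JoesISP', 'central GIGAswitch', 'BigBackbone']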

The largest of the 64 tenants at the data airport are the so-called ``Tier 1'' carriers such as UUNet and AGIS -- national and international ISPs that lease high-speed lines from long-distance phone companies to form the backbones of the Internet. Next are smaller regional and local carriers. Although these smaller players have space at the airport and exchange privileges, they must often pay their larger brethren to carry their data. Unlike most other major network access points, the Palo Alto exchange also rents space to a third type of client -- content providers.

Servers parked

A manufacturer or shipper that needs ready access to many airlines may choose to locate a warehouse on an airport frontage road. In the same fashion, nine large Internet content firms park servers at the Compaq exchange so they're directly accessible to many networks. Household names that pump data through Palo Alto include PointCast and Compaq's own search engine, Alta Vista.

The tenants share a single room about the size of a basketball court, which has been partitioned into cages, with each cage holding between three and 25 coffin-sized equipment racks. Although some racks are almost empty, all 218 are rented. There is a waiting list for space. Early next year Compaq will add 185 more racks and expand the exchange onto the first floor of the building.

An intricate latticework of precisely tied cables connects the routers and servers in the racks to the data pipes that run along the open ceiling. The really big pipes -- the 13 ultra-high-capacity fiber-optic lines the phone companies lease to the largest Internet service providers -- can carry a combined total of 26.52 gigabits per second (the equivalent of about a half-million home modems all going at once).
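That half-million figure checks out as rough arithmetic (a back-of-the-envelope sketch, not from the article, assuming a 56 kbps dial-up modem):

# Rough check of the modem comparison, assuming a 56 kbps dial-up line.
total_bps = 26.52e9        # combined capacity of the 13 fiber lines, bits/sec
modem_bps = 56e3           # one home modem, bits/sec
print(round(total_bps / modem_bps))   # ~473,571 -- roughly half a million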

Critical importance

In a small room 10 feet removed from the main cages, Hendriksen points to three bright orange plastic tubes coming out of the basement wall. Two of the conduits, about the size of vacuum cleaner tubes, continue on to a rack of equipment that breaks the data lines down into smaller lines. The last orange tube, which contains outdated copper wires, stops one foot inside the basement wall, where it has been crudely severed with a hacksaw. Like a weed-choked wagon trail that runs alongside the interstate, it is of historic interest only and leads nowhere.

Just as a systems failure or delay at one hub airport can disrupt traffic across an airline's entire operation, what happens in Palo Alto is of critical importance to all clients. Unless companies buy rack space at many exchanges, one problem with a single piece of equipment here can effectively take an entire ISP -- and all its clients -- down. For the network technicians who install and maintain the equipment, the cage city is, as they say, a ``24/7 culture'' -- 24 hours a day, seven days a week, someone is watching. When your entire company's fate hangs by a couple of OC-3 fiber-optic conduits, you don't break for dinner -- or sleep, or anything -- until the problem is solved.

Next to the elevator, just outside the main cage room, is a spartan alcove with three plump, upholstered chairs and a pair of oversize monitors displaying system stats and security information. This is the lounge where the network plumbers from out of town camp out, sometimes for days on end.

``We've had people jump on planes with no thought to where they'll stay once they get here,'' says Hendriksen. She points to a cage one aisle removed from the main drag. ``See that? Those four (technicians) are from NetRail in Atlanta. They're expanding their equipment. They've been at it since 9:30 last night.'' In the lounge, the only evidence of human habitation is three empty Pepsi cans, a ``Foam-N-Color Barbie'' doll and a John Le Carre novel.

Free exchange vital

Without cooperation and the free exchange of data between networks, the Internet would simply cease to be. But the face of cooperation today is very different from when the Net was an alliance of university networks. At the exchange points, many carriers sign treaties that enable them to trade data freely with all other signatories. At the same time, the exchange is a competitive marketplace where carriers cut side deals to exchange traffic one-on-one. Internet service is a dog-eat-dog business, and nowhere is that more evident than here, where companies routinely place their mission-critical equipment within plain view of their most bitter rivals.

``I don't think of this as a particularly tense place,'' says Hendriksen. ``But, then again, we take a lot of steps to make sure that people behave themselves.'' So far there has been no instance of an overzealous tech ``accidentally'' fouling the wires of a neighbor. ``The escorts should take care of any malicious tendencies that anyone might have,'' she says. ``Anyone who goes into a cage is escorted. We've had people say `We tested your security. He went into the common-area cage and he put his thumb on my router unchallenged.' That's the level of silliness we're talking about.''

Think of an airport so competitive that the airlines go to elaborate lengths to disguise the markings on their planes and keep their flight schedules secret. At the Palo Alto exchange, many equipment racks are anonymous. Internet Protocol numbers (the numeric tags that identify equipment to the rest of the Internet) are blacked out.

``We have had some people who've gone a bit over the line, taking a little too much interest in stuff that's a couple racks away from their own,'' says Hendriksen. ``The escorts handle that. It's kind of a game -- because most of the ISPs can tell who the others are just by looking at how they rack their equipment and what's in the racks.''

Few human touches

There are few human touches to soften the mood. On one ISP's racks, technicians have posted smiling, life-size cardboard cutouts of staffers. Another firm installed a Christmas tree last year.

``In the beginning, there was some thought of allowing stuffed animals in the cages to give it a zoo-like feel,'' says Hendriksen. ``But that didn't work out. There was some safety requirement that the animals had to be a certain type of cloth and only brightly colored.''

In the basement, there is one machine that sits apart from the city of cages, in its own room, with its own security layer.

This is the GIGAswitch, the great sink to which all the data rivers great and small must flow. With its cables and ports, the shared central switch looks not entirely unlike a telephone switchboard out of a bygone era, an artifact from the time this building was young.

This is it, the very crossroads of the wired world. It is not metaphor. It is real -- the molecules of the digital world are millions of pulses of white light moving through the switch every second. Hendriksen smiles indulgently when the visitor kneels down and places a hand on its face.

``Is this it? The very middle?''

``You could say that.''

Until April 1995, the backbone of the Internet was in federal hands, under the auspices of the National Science Foundation. In that era, with four government-sanctioned Internet exchange points and two private ones, understanding the topology of the network was as simple as drawing a string map. When competing commercial carriers took over the backbones, maps and routes became much more complex.

Today, much of the traffic that used to flow through the public exchange points is routed through private interconnections. These ``private peering'' arrangements between ISPs can take place anywhere the transaction is mutually convenient.

Because of the secrecy involved in peering agreements, it's impossible to know how much is truly being shunted away from the exchanges. The consensus estimate is that two-thirds of all Internet traffic today does not flow through the common central switches of public exchanges such as PAIX, MCI WorldCom's San Jose facility and the PacBell NAP.

The string map has become a loosely woven fabric. Where there were once six grand junctions, there are now hundreds of smaller ones that never appear on any map. If the trend toward private one-to-one exchanges continues at the current pace, soon there will be no middle. When the center disappears, piece by piece, line by line, into a thousand unmarked equipment closets, the Net shall truly be hidden -- un-mappable and, ultimately, unknowable.

-- Diane J. Squire (sacredspaces@yahoo.com), December 26, 1998.



I am most grateful for the time and expertise you all have shared with me in answering my question. I am continually amazed that I have found such a group of people, and that they let ME in. Thank you all very much.

-- Ann Fisher (zyax55b@prodigy.com), December 26, 1998.
