It's always something or something else


For those of you who feared that there was nothing else to worry about.

Self-replicating robots

Not to mention biotech. It's always something.

Best wishes,,,, ,

-- Z1X4Y7 (Z1X4Y7@aol.com), March 13, 2000

Answers

.....and here I was worried about the cloning of the wooly mammoth.

nancy

-- NH (new@mindspring.com), March 13, 2000.


Monday, March 13, 2000, 08:33 p.m. Pacific

Co-founder of Sun Microsystems sees doom in technology

by Joel Garreau The Washington Post

A respected creator of the Information Age has written an extraordinary critique of accelerating technological change in which he suggests that new technologies could cause "something like extinction" of humankind within the next two generations.

The alarming prediction, intended to be provocative, is striking because it comes not from a critic of technology but rather from a man who invented much of it: Bill Joy, chief scientist and co-founder of Sun Microsystems, the leading Web technology manufacturer.

Joy was an original co-chairman of a presidential commission on the future of information technology. His warning, he said in a telephone interview, is meant to be reminiscent of Albert Einstein's famous 1939 letter to President Franklin Delano Roosevelt alerting him to the possibility of an atomic bomb.

In a 24-page article in Wired magazine, to appear on the Web tomorrow, Joy says he finds himself essentially agreeing, to his horror, with a core argument of the Unabomber, Ted Kaczynski - that advanced technology poses a threat to the human species.

"I have always believed that making software more reliable, given its many uses, will make the world a safer and better place," Joy wrote. "If I were to come to believe the opposite, then I would be morally obligated to stop this work. I can now imagine that such a day may come."

Joy enjoys a level-headed reputation in the industry.

"Nobody is more phlegmatic than Bill," said Stewart Brand, an Internet pioneer. "He is the adult in the room."

Joy is disturbed by a suite of advances. He views as credible the prediction that by 2030, computers will be a million times more powerful than they are today. He respects the possibility that robots may exceed humans in intelligence, while being able to replicate themselves.

He points to nanotechnology - the emerging science that seeks to create any desired object on an atom-by-atom basis - and agrees that it has the potential to allow inexpensive production of smart machines so small they could fit inside a blood vessel. Genetic technology, meanwhile, inexorably is generating the power to create new forms of life that could reproduce.

Joy is deeply worried that these technologies collectively create the ability to unleash self-replicating, mutating, mechanical or biological plagues. These would be "a replication attack in the physical world" comparable to the replication attack in the virtual world that recently caused the shutdowns of major commercial Web sites.

"If you can let something loose that can make more copies of itself," Joy said in the interview, "it is very difficult to recall. It is as easy as eradicating all the mosquitoes: They are everywhere and make more of themselves. If attacked, they mutate and become immune. . . . That creates the possibility of empowering individuals for extreme evil. If we don't do anything, the risk is very high of one crazy person doing something very bad."

The huge profits from any single advance that may seem beneficial also concern him. "It is always hard to see the bigger impact while you are in the vortex of a change," Joy wrote. "We have long been driven by the overarching desire to know - that is the nature of science's quest, not stopping to notice that the progress to newer and more powerful technologies can take on a life of its own."

Finally, he argues, this threat to humanity is much greater than that of nuclear bombs because such weapons are hard to build. But, he says, these new technologies are not hard to come by. Therefore, he reasons, the problem will not be "rogue states, but rogue individuals."

Joy knows that, to some people, this may all sound like science fiction. "After Y2K didn't happen," he said, "some people will feel free to dismiss this, saying everything will work out."

Joy is less clear on how such a scenario could be prevented. Asked how he personally would stop this progression, he stumbled. "Sun has always struggled with being an ethical innovator," he said. "We are tool builders. I'm trailing off here."

-- doomer (@ .), March 13, 2000.


It's always something.... Roseanne Roseannadanna

nanotech

Come on baby...let the good times roll!

-- Bennanoing (xx@yy.ed), March 13, 2000.


There are always a number of speculative technologies in the talking or preliminary investigative stages. Nanotechnology is being actively researched, but it is a long way from reality. A year or so ago, some scientists at IBM succeeded in arranging individual atoms to form a crude facsimile of the IBM logo. That has been the high-water mark so far. At least, publicly.

There are several orders of magnitude of difference in complexity between a process that can spell out a logo in a lab (at something like US$100 million per letter) and one that can construct a nano-machine cost-effectively. It may happen. It may not.

The capital investment required will be staggering either way. Most likely, only the US government could afford to plow tens of trillions of dollars into such a costly venture. It is an open question whether we could attempt it, fail, and still survive.

OTOH, genetic engineering is here: the techniques are proven and already at work in mass production. My belief is that the danger of unleashing the self-seeking short-sightedness of the profit motive onto the foundation of all life on Earth is FAR greater than the decades-remote threat from nanotech.

As the engineers in my department are fond of observing: what could possibly go wrong?

-- Brian McLaughlin (brianm@ims.com), March 13, 2000.


Brian:

I am a molecular biologist. Still, I only use recombinant technology as a research tool. Products that we have released [as recently as this year] were developed using traditional technology. Why? Most of the phenotypes of interest are multigenic, and the technology to work with them is still in the future. The commercial products that we have seen so far [e.g., Roundup Ready soybeans] were constructed using a single-gene transfer [the construct, of course, is more complicated than that].

In IT, it is possible to envision non-metal-based hardware consisting of self-replicating biomolecules that can redesign circuits as needed. I suspect this was the subject being discussed. I would agree that this is far in the future.

As to biotech, the public discussion is so embedded [the embedded problem again] in politics that a reasonable discussion is difficult. From where I sit, commercialization of the kind of things being publicly discussed is further in the future than you think it is.

Best wishes,,,,

-- Z1X4Y7 (Z1X4Y7@aol.com), March 13, 2000.



The first time I saw this thing about self-building robots it scared me, and I still hate the thought. This could be a very bad thing for us all.

-- ET (bneville@zebra.net), March 13, 2000.

He views as credible the prediction that by 2030, computers will be a million times more powerful than they are today.

Yeah, but the same old idiots will be programming or using them. They'll reach a stage when sheer computing power will not be an issue (i.e., memory and MIPS will be irrelevant to most users).
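
For what it's worth, the "million times by 2030" figure is really just Moore's-law arithmetic: assume (and it is only an assumption) that computing power keeps doubling every 18 months or so, and the three decades from 2000 to 2030 give you about twenty doublings. A rough back-of-the-envelope sketch in Python, nothing more:

    # Back-of-the-envelope only: assume computing power doubles every 18 months
    # (a common statement of Moore's law) from 2000 through 2030.
    years = 2030 - 2000
    doublings = years / 1.5        # one doubling per 1.5 years -> 20 doublings
    factor = 2 ** doublings        # 2**20
    print(f"{doublings:.0f} doublings -> about {factor:,.0f}x the computing power")
    # prints: 20 doublings -> about 1,048,576x the computing power

That only covers the "how big", not whether anyone will put it to good use.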

He respects the possibility that robots may exceed humans in intelligence, while being able to replicate themselves.

Not difficult given the current level of IQ on this forum.

Can't see how they would replicate themselves (they would need to be highly mobile, adaptive and dextrous, qualities not even remotely found in robotic machinery).

Not by 2030, anyway; too soon.

-- Sir Richard (richard.dale@unum.co.uk), March 14, 2000.


They'll

I meant There will

-- Sir Richard (richard.dale@onion.com), March 14, 2000.


>> Not difficult given the current level of IQ on this forum. <<

Sir Richard, there is an old American saying you may have heard: put up or shut up.

To express myself a bit less crudely: consider the difficulty we other participants in this forum might have in determining your true level of intelligence if we confined ourselves to the evidence presented in this forum. Furthermore, consider your predicament if you were required to demonstrate the utmost pinnacle of your intelligence each time you posted here, or risk a similar judgement. Then, having considered this carefully, I suggest you revise your judgement.

-- Brian McLaughlin (brianm@ims.com), March 14, 2000.


The title of this thread bears a very close resemblance to the title of Gilda Radner's book, written when she was diagnosed with incurable ovarian cancer.

-- Anita (notgiving@anymore.thingee), March 14, 2000.

