Monday, July 10, 2006

Here are some comments on the Singularity (the future or end of humanity), the Great Silence (what, no noisy aliens?) and SETI, and technology growth curves. A bit of background material if you haven't seen all this before:

The $40 trillion article:

Back before the article:

(Vinge's "A Fire Upon the Deep" proves he is a worthy successor to the Asimov style - a fun summer read if you are looking for a little beach material and love techno SciFi. By the way, if one believes in the Singularity, it certainly makes the 2100+ future portrayed by Star Trek-type SciFi look silly.)

Back to the Future: The May 2006 Singularity Summit:

Great Silent Thoughts on why SETI is failing:

So why can't we spot those aliens with radio telescopes? Once a species achieves the Singularity and transcends biological forms, it becomes child's play to produce damping technology so they are not broadcasting a 24-7 "we are here - come get us" to anybody who might be superior and nasty about it, from the dark side of the Singularity (of course, post-Singularity there may be no child's play, since no children, and no 24-7, since time may not be relevant any more). So we are not going to find aliens with radio telescopes.

As we approach the Singularity, it will be clear that some nationalities, races, groups etc. will reach and benefit from the Singularity first. The ones who are excluded may attempt to knock SkyNet, or whatever the machines-can-learn-and-think-way-faster-than-biologically-based-humans technology base is called, off the map. For the lead groups this becomes the key - getting over the Singularity hurdle and into a stable, safe spot before getting shot in the back by those falling behind. As well, just as the Soviet Union took extraordinary espionage efforts to obtain atomic bomb knowledge in the late 1940s, those falling behind will attempt to get Singularity knowledge any way they can. So counter-terrorism and counter-espionage become critical, and far more important than traditional military and territorial efforts.

One should not assume that North America will reach the Singularity first. As one of many example scenarios, suppose Japan hits it first with a combination of top research, funding, skilled design, and total commitment. Then what does the United States do - accept it and give up superpower status to Japan, or decide it poses a threat? Sadly, human nature has been to use various versions of war to resolve power shifts.

So how fast will we reach the Singularity? In 1967, as a child, I went to Expo 67, and one of the things I remember from it was the video phone display - a phone with a TV screen and camera that would let you see and talk to the caller live, and let the caller see you (this produced lots of jokes about needing makeup and proper attire to answer a mere phone call). For years afterward I waited for this technology to be mass implemented, but nothing happened, and the only real progress was touchtone phones (whoopee) - it appeared we were not that fast at implementation. However, I now use a cell phone smaller than a Star Trek communicator, and it provides the screen, the camera, and lots more in a wireless go-anywhere package. This is one of Kurzweil's points - we overestimate what can be achieved in the short term (implementation is a bitch) but underestimate the long term (linear thinking).
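Kurzweil's linear-versus-exponential point can be made concrete with a toy calculation (the numbers here are entirely hypothetical, chosen only for illustration): a linear thinker who saw one year's gain and projected it forward matches exponential growth at first, then falls hopelessly behind.

```python
# Toy sketch of linear vs. exponential extrapolation (hypothetical units).
# A "linear thinker" assumes the first year's gain repeats forever;
# Moore's-law-style growth instead doubles on a fixed cadence.

def linear_projection(start, annual_increase, years):
    """What a linear thinker expects: a fixed gain each year."""
    return start + annual_increase * years

def exponential_projection(start, doubling_years, years):
    """What doubling-cadence growth actually delivers."""
    return start * 2 ** (years / doubling_years)

start = 1.0            # capability index in year 0 (arbitrary units)
annual_increase = 0.5  # the first year's observed gain, assumed to repeat

for years in (2, 10, 40):
    lin = linear_projection(start, annual_increase, years)
    exp = exponential_projection(start, doubling_years=2, years=years)
    print(f"year {years:2d}: linear {lin:10.1f}   exponential {exp:12.1f}")
```

At year 2 the two projections agree exactly; by year 10 the exponential curve is already several times ahead; by year 40 the linear estimate (21) is off by five orders of magnitude (over a million) - which is the sense in which linear thinking underestimates the long term.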

One theory I hold to is that pressure produces. For example, in World War II there were huge advances in technology over the course of a few years - the atomic bomb, radar, and jets - driven by a life-and-death struggle. When we get close to the Singularity, the pressure to produce or perish will be enormous. One could be deciding the fate of one's children: whether they will live in a second-class backwater nation or at the leading edge of advancement.

So at first I think business and life will be as usual for a considerable length of time, say 20 or 30 years. Computer technology will continue to advance at a great pace, medical knowledge will grow, and new technologies will be introduced.

Then at some point, the key enabling technology will arrive full throttle - say, for example, the nanobots that Kurzweil discusses. This would then make the Singularity quite feasible in the short term, and now the pressure is on. We could then expect rapid advances, a quick realization of the Singularity, and attempts by some to stop others from reaching it. These attempts may involve the same enabling technology - for example, unleashing a horde of nanobots to obliterate a certain group, or to stupefy them (assuming reality TV fails in this venture).

So in the medium term, I suggest not making business and life decisions based on the possibility of the Singularity. However, when you see the enabling technology come about, it will be time to dump all the traditional investments and plans, and perhaps even head for the bunkers for a couple of years. Hopefully we don't all destroy ourselves trying to reach the Singularity, with the Singularity meaning a Single survivor (rewriting Philip K. Dick's Second Variety (filmed as Screamers) to have nanobots instead of robots, in an apocalyptic mindscape instead of a desolate landscape).

Of course nobody knows what "life" will be post-Singularity, though we see some fiction (Canada's award-winning Mindscan, among others) tackling it, in part to raise the issues for thoughtful consideration before the final mad rush to the Singularity. Regardless of the fiction, though, it is hard at this point to imagine life in the server banks.