"The Singularity is the last trench of the religious impulse in the technocratic community. The Singularity has been denigrated as 'The Rapture For Nerds,' and not without cause. It's pretty much indivisible from the religious faith in describing the desire to be saved by something that isn't there (or even the desire to be destroyed by something that isn't there) and throws off no evidence of its ever intending to exist. It's a new faith for people who think they're otherwise much too evolved to believe in the Flying Spaghetti Monster or any other idiot back-brain cult you care to suggest.

"Vernor Vinge, the originator of the term, is a scientist and novelist, and occupies an almost unique space. After all, the only other sf writer I can think of who invented a religion that is also a science-fiction fantasy is L. Ron Hubbard."

Wow, sounds like Warren has some special knowledge of his own. I certainly hope that, aside from this vacuous and inflammatory post, he'll begin to share some of his expert views on AI theory and the potential for machine minds.
With the bold claim that there is "no evidence" to support the suggestion that SAI is engineerable, I'll have to assume that he's engaged the principal thinkers on the matter and offered sufficient critique to dismiss their findings outright -- thinkers like Eliezer Yudkowsky, Ben Goertzel, Hugo de Garis, and the many others devoted to the problem.
Otherwise, why should we take Ellis seriously? Or are we expected to take his position on mere faith alone?
The day is coming, my friends, when Singularity denial will seem as outrageous and irresponsible as the denial of anthropogenic global warming. And I think the comparison is fair; environmentalists are often chastised for their "religious-like" convictions and concern. It's easy to mock the Chicken Littles of the world.
And like the foot-dragging on climate change, there are consequences to inaction. The bogus and unfair memetic linkage between millenarian beliefs and the Singularity is a dangerous one, and the sooner this association is severed the better.
As I see it, there are four strategies to help us normalize the Singularity debate:
(1) We need to better promote and engage respected thinkers and public intellectuals who are sympathetic to the issue -- key figures like Ray Kurzweil, Robin Hanson, Nick Bostrom, Marvin Minsky, etc.

And in the meantime, don't buy into Ellis's empty anti-Singularity rhetoric, which is all it really is.
(2) A new generation of public figures is required -- renowned individuals who are willing to a) put their reputations at stake and b) use their popularity/credibility to raise awareness and help with foresight.
(3) Continue to frame the issue as a scientific endeavor and pitch the various scenarios as hypotheses; we need to keep the language within the scientific vernacular.
(4) Let the critics have it and show them no quarter, particularly when their denial is mere contradiction driven by sheer incredulity. We need to force them to better articulate their positions while defending our own with as much evidence as can be mustered.
I'm definitely disappointed to hear these comments from Ellis. His Transmetropolitan series is one of the best examples of post-cyberpunk fiction, and it's disheartening to see such a visionary author look at the issue so narrowly. I'm deeply offended by the comparison to Scientology.
Thanks for drawing my attention to this, George.
"The day is coming, my friends, when Singularity denial will seem as outrageous and irresponsible as the denial of anthropogenic global warming."
Or, the Singularity will be seen as yet another failed guess at where the unpredictable currents of technology will take us. If that's the case, the Singularitists are, as Ellis suggests, grasping for faith.
And, I'm sorry, but your rhetoric in this post doesn't seem to be far divorced from the loons who say the End Times are nigh.
Oh come on, even Charlie Stross calls it the 'Nerd Rapture.'
Well, you know. Warren Ellis is a fantastic author, and usually a very clear-eyed thinker. He is also very grumpy and cantankerous. Sometimes the latter gets in the way of the former, and I think may be what happened here (dangerously treading on apologist territory here, but I do have a soft spot for W.E.). Taking into consideration his works of fiction (particularly Transmet, Mark), I'd love to hear him expand on his very short post.
That being said, wonderful rejoinder, George, and spot-on. The post, particularly your four-point proposal, seems like the opening salvo in a longer, more fruitful conversation.
On the one hand, I don't find arguments for accelerating development of advanced AI software or brain emulation particularly problematic. Sure, it ISN'T here yet, and thus should be considered with some skepticism and level-headedness.
On the other hand, packaging these development projects into an eschatological "singularity" and blue-skying it to death is just asking for less-disciplined people to treat it in just such a religious, faithful, uncritical fashion.
Yes, the church of the singularity is out there. Its adherents are just as kooky and irrational as they are for any other dogmatic faith, and there's nothing that cooler heads can do to calm them down, no matter how much they'd like to project a more rational, scientific facade for their... "movement".
So, as a convinced skeptic, you have a choice: you can take critiques of religiosity as direct challenges to what you believe, or you can look behind you and realize that, yes, critics like Ellis are talking about a real phenomenon that you're not actually participating in, even though you have reached similar conclusions.
I tend to avoid the hyperbole of the singularity by concluding that, while these techs sound plausible to develop, the politics won't form around them in any way that could allow us to construe them as messianic or "saving" us from anything - especially ourselves.
That said, I don't think "No evidence" is a justifiable statement.
It's not that I'm offended. I like to laugh at robot cultists, too. I'm just not convinced they're completely wrong.
I'm curious -- how is this rebuttal any more meaty than Warren Ellis' original post? You're basically saying, "Phhhbllt! Wrong wrong wrong!" without providing any information.
Honestly, the tone reminds me of coming into one of those arguments between Creationists and people with brains.
@The Prof - if no specific argument is advanced, there's nothing to rebut. Thus there's little George can do but point out the ways in which Ellis' post is all asseveration and no argument, then leave it at that. The next step is to outline everything regarding the Singularity all over again. And yes, in that way it's very like responding to the non-sequiturs bandied about by the Creationists.
Regarding SAI: I'm not really positive that SAI is as possible as people seem to assume it to be. I'm sure that we will eventually make ourselves vastly more intelligent in a number of ways than we are today, but there may well be practical limits (from math as much as physics) on the cognitive interaction space. That is to say, the "super" part of intelligence might not be quite so unimaginably different from current extrapolations. We don't really have any criteria (as far as I know) to choose either way as of yet. Some limitations in our brains -- being poor at arithmetic, for example -- seem to spring from the missing ability not conferring much of an evolutionary advantage, and should be fairly easy to add, while research into memory and learning suggests that perfectly eidetic memory may never be workable because some degree of forgetting is crucial to cognition.
I mention this because even people who should (I think) know better seem to assume that the "us" of the future will be so qualitatively different as to be unrecognizable, but it's perfectly coherent* to think that in many critical ways we'll always be the same, no matter what mental accouterments we've added to our selves. This can lend itself, I think, to some dubious handwaving that the uncharitable could easily misread.
*And consonant with a variety of interpretations of "Singularity"
I disagree with your #4. We don't need "no quarter" against "vacuous critics". What we need are better critics. Your very approach buttresses Ellis' criticism. It indicates a desire to win more than a desire to have the best models of reality and to best minimize existential risk. Shame on you for that, Dvorsky.
@Nato: "Yes, the church of the singularity is out there"
- Where? Who are these mysterious deluded Singularitarians who, as opposed to the rational voices of Bostrom, Yudkowsky, Hanson, Anissimov, etc., make wild predictions based on no evidence? Can you post a link to one of these people?
I think that the "deluded church of the singularity" may well be an internet myth. Maybe there were people like this around in the '90s, but I cannot find them today.
Hanson and Bostrom are world-class rational thinkers but I do not ascribe great rationality to the other two names you mentioned.
In any event, I realized that 'rationality' is not what I was looking for.
The attraction to 'Singularitarians' of rhapsodizing about the potential of super-intelligence comes from the fact that they fondly imagine themselves to be partial potential super-intelligences, whereas, in reality, it's only a few ordinary humans *pretending* to be super-intelligences and making complete arses of themselves in the process.
Believe me, a real super-intelligence won't see much difference between say Nick Bostrom and Joe Bloggs.
We're not super-intelligences, so wandering around rhapsodizing about what super-intelligent existence may be like is actually quite useless to us. Believe me, a real super-intelligence is unlikely to pay any attention to us or our ideas whatsoever.
I estimate there's a grand global market for the 'hyper-rationality' of 'Singularitarians' of perhaps 500 people.
On the other hand, conscious emotional experiences such as playfulness, the sense of wonder, and amusement are sought by hundreds of millions.
My advice: the hyper-rationality offered by 'Singularitarians' will do you no good whatsoever. Go out and seek good conscious experiences instead! That's what I realized I was really looking for.
The problem with the Singularity is spotting when you are in it. To be honest I suspect that we already are.
If you were a two-dimensional being living on the surface of a sphere, your world would be indistinguishable from flat. You would only be able to infer the existence of a third dimension because travelling in one direction for long enough would bring you back to your point of origin.
The trouble is that no such telltale sign exists for living on the singularity curve. From any given point along it, it won't look that steep. In fact, it is quite possible to argue that we've been in the technological singularity since the invention of the abacus four and a half thousand years ago; technology has been improving at a largely exponential rate ever since.
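The "it won't look that steep from any point" claim has a simple mathematical basis: an exponential curve is self-similar, so the growth an observer sees relative to their own baseline is the same everywhere along it. A toy sketch (the 20-year doubling period is an arbitrary illustrative assumption, not a figure from the comment):

```python
# Toy illustration: under pure exponential growth, the relative gain over
# any fixed interval is identical no matter where on the curve you stand.
DOUBLING_PERIOD = 20.0  # years -- an assumed, arbitrary rate

def tech_level(t):
    """Relative capability at year t, normalized to 1.0 at t = 0."""
    return 2.0 ** (t / DOUBLING_PERIOD)

# Measure the next decade's growth from two very different eras:
for start in (0, 4000):
    ratio = tech_level(start + 10) / tech_level(start)
    print(f"year {start}: next decade multiplies capability by {ratio:.3f}x")
```

Both eras report the same multiplier (2^(10/20), about 1.414x): to an observer riding the curve, no point on it looks locally special, which is why "are we in it already?" is a hard question to answer from the inside.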
And here we are with mobile phones that are smaller and work better than anything Captain Kirk could have imagined. Here we are with bandwidth access increasing and bandwidth price decreasing and an unending world of new information at our fingertips. Here we are with not only the human genome mapped but the ability to map individuals in decreasing times and prices.
A lot of people define the singularity around it being a state where we are no longer able to predict what will happen in the immediate future because of the rate of technological improvement. Are you sure that is not where we already are?
A better set of criteria for defining the singularity is required, along with some tests for its presence. You might find it surprising how many of them fit the bill now.
Jamie: Three Major Singularity Schools