March 26, 2009

Singularity? Schmingularity? Are we becoming gods?

David Brin is guest blogging this week.

...greeted with hand-rubbing glee by fellows like Ray Kurzweil and the "extropians" who foresee transformation into higher, smarter, and more durable kinds of beings.

Needless to say, many people have ambivalent feelings about the Singularity. As I describe in the essay, "Singularities and Nightmares: Extremes of Optimism and Pessimism About the Human Future," some fear the machines will stomp on their makers. Or else crush our pride by being kind to us, the way we might pat a dog on the head.

Others feel that humanity may get to come along, accompanying our creations through the wild ride toward godhead, as I illustrate in one of the few post-singularity science fiction stories, "Stones of Significance."

(At the same site see other short stories, plus the provocative "Do we really want immortality?")

Meanwhile, others urge that we reject the coming changes, or else claim that there will be no choice to make -- that this Singularity thing will turn out to be a failed dream, like all the other promises of transcendence sung about by previous generations of mystical romantics.

Indeed, one thing about all this fascinates me -- that personality generally overrides culture and logic and reason. More and more, we are learning this. Somebody who would have been a grouch 500 years ago is likely to be one today. The kind of person who would have been a raving transcendentalist in Roman days, foretelling a God-wrought ending time - either in flames or paradise - would today be among those who prophesy either world destruction or redemption... by means of science. The envisioned means change, but the visions of doom or glory do not.

Oh, what is a pragmatic optimist to do? We are beset by exaggerators! When what we need is moderate, step-by-step action... adamant, radical, even militant moderation! Progressively pursuing all the good things without allowing our zealotry to blind us to the quicksand and minefields along the way. Simplistic dogmas are dumb, whether they are political or techno-transcendentalist. It is pragmatists who will be best suited to negotiate with the rising AI entities. And it will be those who emphasize decency, not dogma, who teach the new gods to be pleasant. To be people.

And that's a VERY brief commentary on perhaps the greatest issue of our time. Wish I had more time. But I'll be commenting further from time to time, at CONTRARY BRIN.

====

Oh, for some cool recent science fiction about the near future, see my stories "Shoresteading" and "The Smartest Mob."

=====

NEXT... George says: "A number of years ago, David Brin contacted me to bring me up to speed on his efforts to raise awareness about the active SETI approach, also known as METI (messages to extraterrestrial intelligences). Brin argues that human civilization is not ready to call attention to itself -- at least not yet -- and that we should engage in a broader discussion before doing so.

"Brin writes,
'Let there be no mistake. METI is a very different thing than passively sifting for signals from outer space. Carl Sagan, one of the greatest SETI supporters and a deep believer in the notion of altruistic alien civilizations, called such a move deeply unwise and immature....

'Sagan — along with early SETI pioneer Philip Morrison — recommended that the newest children in a strange and uncertain cosmos should listen quietly for a long time, patiently learning about the universe and comparing notes, before shouting into an unknown jungle that we do not understand.'
"Brin invited me to join a closed discussion group where this issue is examined and debated. The purpose of the exercise is to not just think more deeply about this issue, but to also raise awareness and possibly prevent a catastrophe (alien invasion perhaps?). Essentially, Brin argues that METI needs to be strongly considered before any group or individual takes it upon themselves to shout out to the heavens. He is particularly concerned how some groups, including SETI, are dismissive of his concerns. His fear is that someone will unilaterally decide to start transmitting messages into the depths of space.

"I was unsure at first about whether or not I should join this group. As a contact pessimist, I'm fairly certain that fears about the METI approach are unwarranted -- not because ETIs are likely to be friendly, but because no one is listening. And even if they are listening, there's nothing we can do about it; any advanced ETI that's on a search-and-destroy mission would likely have the 'search' aspect figured out. I'm not sure how any civilization could hide in the Galaxy. Consequently, METI is somewhat of a non-issue in my opinion.

"That being said, however, I did reach the conclusion that there is a non-zero chance that we could run into trouble should we change our approach from listening to messaging. For example, resident berserkers could be waiting, for whatever reason, for this sort of change in our radio signals. Perhaps they are waiting for a sign that we've passed a certain developmental threshold.

"I think this argument is extremely weak and improbable, but it's not impossible; it should not be ruled out as a potential existential risk.

"Which leads me to the precautionary principle. Since no one is listening, we lose nothing by refraining from sending messages out into the cosmos. Again, if a friendly ETI wanted to do a meet-and-greet, they should have no trouble finding us. But because there is a slim chance that we may alert a local berserker (or something unknown), we should probably refrain from the METI approach for the time being."


Thoughts? Don't leap to conclusions! Read up: "Shouting at the Cosmos."

2 comments:

  1. > prophesy either world destruction
    > or redemption... by means of science.

    Ever since the advent of Mutually Assured Destruction, can a sane, non-raving person, grouch or not, disagree that "world destruction by means of science" is a non-negligible possibility?

  2. no -- pragmatists, or any others, will not hold a candle to “the rising AI entities;” very wishful thinking at best; as for the new gods, we will be almost, if not completely, unable to fathom them; we are setting into motion something that will at first overcome us, and then replace us; what will actually be left of us is anybody’s guess, but there will be nobody’s will to determine what will happen

    re: meti; sounds prudent to me; we have so much yet to learn, and we cannot discount the possibility of surprise, or the unexpected, especially when it comes to existential risk

    we are NOT becoming gods; we are letting our already over-inflated egos get ahead of us

