May 16, 2012

Doom how?

When invoking the Great Filter as an explanation for the Great Silence, we have yet to determine the exact nature of the filter. It's conceivable, though unlikely, that it resides in our past (fingers are crossed that this is the case). If so, the rise of prokaryotes and eukaryotes is probably what we're looking for.

But if the filter resides in our future, the question needs to be asked: what exactly prevents civilizations from embarking on interstellar colonization?

One of the stronger, though more disturbing, suggestions is that all civilizations destroy themselves before they can send out a wave of self-replicating colonization probes. For the sake of this particular argument, let's assume that doom is in fact the Great Filter. If so, what could it be, and when would it happen?

It's probably not environmental devastation, as that's a weak force for something that's supposed to be existentially catastrophic, nor does it seem universal as far as extraterrestrial civilizations are concerned. It's more reasonable to suggest, therefore, that something in our technological arsenal will destroy us. It's clearly not nuclear weapons, as we've figured out a way to live alongside them; there's even talk of disarmament. So it has to be something we come up with in the future. And whatever that technology is, it has to be completely uncontainable and catastrophic.

Only two things come to mind: molecular nanotechnology and machine superintelligence.

Given that doom has to come before the launch of self-replicating probes, we would have to experience it prior to the invention of diamondoid data storage and nanocomputing, along with the requisite robotics and AI capacities; these are the ingredients of von Neumann probes. It also means doom before, or at the point of, the advent of strong artificial general intelligence (because an SAI could develop probe-enabling technologies). This suggests that either (1) the onset of machine superintelligence is somehow causing the filter, or (2) the precursors to probe-enabling nanotechnology are fatally catastrophic in all instances (e.g. weaponized molecular nanotechnology).

If this is the case, I would expect doom no earlier than 25 years from now, and no later than 50 to 75 years from now.

There's also the possibility, of course, of a wildcard technology (either through convergence or something we haven't considered yet).

Again, I'm not suggesting that doom is certain — there are other, non-doom explanations for the Fermi Paradox. I'm just venturing down this particular line of inquiry.


Robin Hanson said...

Super AI can't be a great filter if that AI then goes on to take over the universe.

Anonymous said...

Why is it unlikely that the filter resides in our past? There certainly seems to be little to prevent life from emerging given the right conditions (we see plenty of life), but what is the evolutionary pressure for civilisation-capable intelligence? Compared to something like beetles, the great apes (save for us) are all but extinct, showing that a precursor branch was more a dead end than anything. An organism that is physically well-adapted has no need for intelligence.

George said...

@Robin: You're suggesting that the same SAI that wipes us out could subsequently continue to act according to its goal structure, potentially leading to universal reach. Makes sense. Even if only a small fraction of all Singularities result in an SAI that needs to planet-hop, that's still significant.

The only way around it is the seemingly bizarre suggestion that *all* SAIs, for whatever reason, do not embark on interstellar missions. If so, how could that possibly be?

George said...

@aepxc: I don't disagree. For the sake of brevity I didn't want to get into that discussion.

John said...

Could the filter be the technology for uploading intelligence? A society, once uploaded, may cease forward progression. It could be the removal of biological drivers, or having access to a simulated universe without the limits of the physical one, that causes a civilization to stop exploring the real one.

David Evans said...

An SAI may not have the human drive to spread copies of itself. If it did, there seems no reason why it should place them on Earth-like worlds with their problems of weather, corrosion and biological attack. There may be a surprise waiting for us on some airless planet.

John said...

An SAI would probably select a world or location that lends itself to ideal computing conditions: an environment suited to heat dissipation, so as not to burden its processors, with access to solar energy.

ZarPaulus said...

@John: Or maybe uploaded intelligences eventually come to realize that they're just copies and erase themselves.

Maybe there's a bunch of Berserker probes using advanced stealth technology.

Or maybe the filter is just nuclear and/or biological warfare.

Prakash said...

What about the "retreat into virtual reality" explanation? Civilizations could be slowly using up their sun alone, generating interesting experiences, and none of this would be visible to us from far away.

This is even more possible if boredom as a concept didn't develop in the evolution of that species. They could literally be repeating the same experience again and again, not even looking for novel experiences.

George said...

Yes, I'm aware of other non-doom scenarios. This piece dealt exclusively with the possibility of doom, however.

Nato said...

I've often wondered if the cheapest solution involves self-replicating probes (seeded to a certain density, around only "dark" stars) that accelerate slowly but steadily toward anything emitting signs of advanced life. The probes take hundreds of years to accelerate and cover the distance to the body needing sterilization, which explains why we're still here. Presumably, then, a cloud of hailstones at some significant percentage of the speed of light will show up and make the crust molten again pretty soon. It would be a low enough energy event that it wouldn't really be visible to other budding civilizations, and it would be totally impossible to avert, as far as I can tell.

Ben Busy from the Bean said...

The Great Silence stems from the fact that advanced ETs aren't using the radio spectrum to communicate. Perfect neutrino transmission and reception protocols and you will discover a galaxy of communicating civilizations. Our radio waves only extend about 200 light-years, which covers a minuscule fraction of the Milky Way galaxy's area.
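As a rough sanity check on the coverage fraction, here is a short sketch in Python. The 100,000 light-year disk diameter and 200 light-year radio-bubble radius are commonly cited ballpark figures, assumed here for illustration rather than taken from the comment itself:

```python
import math

# Assumed figures (commonly cited ballpark values):
GALAXY_DIAMETER_LY = 100_000  # approximate Milky Way disk diameter
RADIO_RANGE_LY = 200          # approximate reach of our earliest broadcasts

# Treat both regions as flat circles and compare disk areas.
galaxy_area = math.pi * (GALAXY_DIAMETER_LY / 2) ** 2
bubble_area = math.pi * RADIO_RANGE_LY ** 2

fraction = bubble_area / galaxy_area
print(f"Radio bubble covers 1/{galaxy_area / bubble_area:,.0f} of the disk's area")
# → Radio bubble covers 1/62,500 of the disk's area
```

Under these assumptions the bubble covers about 1/62,500 of the disk by area; the fraction scales with the square of the assumed distances, so it is even smaller than a naive diameter comparison would suggest.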

Ben Busy from the Bean said...

The Great Silence exists because aliens won't use radio waves to communicate; neutrinos are more likely. I think we need to maintain our awareness of this fact. The Great Silence ... in so far as RADIO WAVE TRANSMISSIONS ARE CONCERNED.

Interstellar Bill said...

When intelligent beings scale up from their primitive social beginnings, their chief peril is an ideological one:
an insane collectivism that is so horribly manifested here, politically as Socialism
and culturally as Leftism.

Had the U.S. stayed on a Constitutional course since the 1870's, we'd be at least 10 times richer than we are, and space would have been gloriously conquered decades ago.

But instead the siren call of Leftist fallacies was taken up from the 1880's onward, first by 'Progressivism' then by 'Liberalism', and the Constitution became toilet paper.

I bet 99 out of 100 techno-civilizations ended up voting themselves into the permanent poverty we're headed for.

How many times have you heard the Statists bitch about 'wasting all that money on space when we have so much poverty down here' (poverty which they caused!)?

Anonymous said...

Ben Busy,

My thoughts too. We have barely 100 years of communication with radio waves. The next 100 years could easily bring us some new technology already known and used by other advanced civilizations.

MyName said...

It seems pretty obvious. Newly arrived sentient species eat all the sugar in the petri dish before they move to an exoplanet. It takes a lot of energy to go to another star and we are busy consuming so much so fast that there won't be enough around to go anywhere interesting. It will all go to the comfort of the mediocre and the luxury of the elites.