Previously in series: The Ethics of Simulated Beings, Descartes's Malicious Demon, The Simulation Argument, Kurzweil's Nano Neural Nets.
As shocking as the Simulation Argument is, it's (arguably) no more disruptive than previous existential paradigm shifts. While undoubtedly disturbing to the people alive at the time, previous civilizations came to grips with the knowledge that they live neither on a flat Earth nor at the center of the Universe.
Like the simulation argument, these previous scientific epiphanies assaulted humanity's sense of itself and its cosmic importance within the Universe. But just as it no longer troubles us to know that we don't live at the center of the Universe, it shouldn't bother us to know that we don't reside in the deepest reality. While it's tempting to diminish the "realness" or the validity of a virtual world, so long as certain attributes of existence exist, there's no good reason to value one realm over another.
This being said, there are a number of unanswered questions about the type of simulation we could be living in—answers to which could have a profound impact on our self-conception.
We do not yet have the means to determine whether or not we live in a simulation, let alone the means to determine its potential type and nature. But this hasn't prevented serious speculation; we can still describe and categorize the possible simulation types and varieties of virtual life:
Hard and soft simulations
The possibility exists, for example, for what philosopher Barry Dainton describes as hard and soft simulations. Hard simulations result from directly tampering with the neural hardware ordinarily responsible for producing experience. People running in a soft simulation, by contrast, have no corporeal source: they are exclusively streams of consciousness generated by computers running the appropriate software, with no external hardware support.
The inhabitants of The Matrix had bodies that existed outside of the simulation, thus qualifying it as a hard simulation. Sensory experience could be directly machine-controlled through the stimulation of the appropriate areas of the sensory cortex and the movements of the simulated body would be under the control of the source mind, but there would be no need for the source body to actually move. As Morpheus noted, "What is real? How do you define real? If you're talking about what you can feel, what you can smell, what you can taste and see...then real is simply...electrical signals interpreted by your brain."
Complete and partial simulations
There's also the possibility for complete and partial simulations. In a complete simulation, every element of the experience is generated by artificial means (e.g. the complete suppression of all psychological characteristics, including memory, in favor of novel ones).
But in a partial simulation only some parts or aspects of the experience are generated artificially (e.g. the person retains their individual psychology).
Active and passive simulants
Dainton also describes active and passive simulants. Actives are completely immersed in virtual environments, but they are in all other respects free agents—or, as Dainton concedes, as free as any agent can be. Their actions are not dictated by the program, but instead flow from their own psychologies, even if these are machine-implemented.
Passive subjects, however, have a completely preprogrammed course of experiences. "The subjects may have the impression that they are autonomous individuals making free choices," writes Dainton, "but they are deluded." All their conscious decisions are determined by the program. They have apparent psychologies, and are conscious, feeling agents, he notes, but their real psychologies are entirely suppressed or nonexistent.
Original and replacement psychologies
Other varieties of simulated life include subjects who have either retained their original psychologies or been given entirely new ones. In an original psychology simulation, a simulant has an existence outside the simulation and retains their original psychology (again, The Matrix provides a good example). But in a replacement psychology scenario, the simulant has an external existence, but none of the original psychology is retained; only consciousness is transferred.
Communal and individual simulations
Simulation experiences could also be communal or individual.
Communal simulations have a virtual environment that is shared by a number of different subjects, each with individual and autonomous psychological systems.
In an individual simulation, however, there is only one real subject with an autonomous psychology; the other "inhabitants" of the simulation are merely automatons, parts of the machine-generated virtual environment. Communal and individual simulations could also be combined, where 'real' psychologies are intermixed with automatons. This scenario is (somewhat) explored in the 1999 film, The Thirteenth Floor.
Which leads to the next level of complexity: the idea that these simulation types could be mixed and matched. Indeed, if powerful simulation technology were commonplace, it is by no means inconceivable that these simulations, particularly those of the hard variety, would be generated in significant numbers.
One thinker who has worked through the various combinations is Tony Fleet. While there are as many as 32 different combinations, he argues that only 9 of them are viable and/or logically consistent. For example, a partial simulation requires an external entity, so it is only possible in the hard simulation case; a partial soft simulation is impossible.
Fleet speculates that the only viable combinations involve the communal/active, individual/active, and individual/passive simulation types (be sure to check out his tables). That said, he does not believe that we've covered all simulation types. For example, there is no distinction between physical, virtual, and mixed simulations. More work clearly needs to be done to create a complete simulation taxonomy along with all logically consistent combinations.
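To see where the figure of 32 comes from, note that the five distinctions above (hard/soft, complete/partial, active/passive, original/replacement psychology, communal/individual) are each binary, giving 2^5 = 32 combinations. Here is a minimal sketch that enumerates them and filters by consistency rules; the two rules below are illustrative assumptions in the spirit of the "partial requires hard" example, not a reproduction of Fleet's actual tables:

```python
from itertools import product

# Five binary dimensions drawn from the taxonomy discussed above.
DIMENSIONS = {
    "mode": ("hard", "soft"),
    "scope": ("complete", "partial"),
    "agency": ("active", "passive"),
    "psychology": ("original", "replacement"),
    "population": ("communal", "individual"),
}

def is_viable(combo):
    """Sample consistency rules (illustrative, not Fleet's real table).

    A partial simulation needs an external mind to supply the retained
    aspects of experience, so it requires a hard simulation; likewise,
    retaining an original psychology presupposes an external source.
    """
    if combo["scope"] == "partial" and combo["mode"] == "soft":
        return False
    if combo["psychology"] == "original" and combo["mode"] == "soft":
        return False
    return True

names = list(DIMENSIONS)
combos = [dict(zip(names, values)) for values in product(*DIMENSIONS.values())]
viable = [c for c in combos if is_viable(c)]

print(len(combos))  # 32 combinations in total
print(len(viable))  # combinations surviving the sample rules
```

With these two sample rules, 20 of the 32 combinations survive; Fleet's stricter criteria cut the list down to 9. The point of the sketch is only that a small set of logical constraints prunes the space quickly.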
This opens the door to some remarkable possibilities. How might these simulations and virtual reality experiences be utilized by our descendants, or even our future selves?
It's conceivable that people might take virtual reality 'trips' to the past quite frequently. They would also likely be used on an occasional basis during history lessons for those with a particular interest in experiencing what it was like to live during certain periods of the past (Bostrom's Ancestor Simulations come to mind).
But such trips might also be taken for entertainment purposes. A future activity in a posthuman world might very well involve regular immersive and interactive journeys into simulated realities. And in order to increase the authenticity of such adventures, it's quite possible that posthumans may choose to temporarily suppress their psychologies and memories. Of course, they would recall the entire experience after re-awakening in their genuine reality as their authentic selves.
Which means that you might actually be an autonomous simulant with a replacement psychology living in a hard simulation.
And if that's the case, now what? How are you supposed to live?
A topic I'll return to in my next post.
"Which means that you might actually be an autonomous simulant with a replacement psychology living in a hard simulation."
However one question about the hard / soft option.
I have thought of the top layer as a hive mind (think nice Borg) that does have some sort of physical presence in some sort of physical universe or multiverse. Obviously it's not a human brain, but something that has the functional capacity of cognition, built on something more robust and long-lasting and with more capacity. This hive mind can run separate simulations (one or many) and create many temporary mini-consciousnesses (us) within this world. These can experience the simulation with no memory of their past hive mind existence and then return to the hive mind with a collection of life experiences to put into the collective.
Does this count as a hard or soft option?
I think what you're describing, which sounds like a collective intelligence that's running a subset of simulations, might deserve a category of its own. But because you're describing a kind of ethereal consciousness that is uploading and downloading back and forth, it may be best described as a soft simulation.
Zytheran, how does that differ from the oversoul theory of the early 19th-century transcendentalists?
I don't think it does, in any meaningful way.
It seems to me that if you believe you are living in a simulation, you have three choices. Choice number 1: live your life like a normal person. Choice number 2: Keep talking about how you're living in a simulation. Soon enough you'll say that to the wrong person and you'll end up on some powerful meds. Choice number 3: take your life; you'll find out instantly whether you were living in a simulation.
My question is, if we really believe we're in a simulation, how does that change our goals and our actions?
I don't think it does. If we're in a simulation, the people running it aren't interested in communicating or helping us out. We're stuck striving to be happy and advancing humanity past the hedonistic treadmill state using our own devices.
Go Democrats: More like you have a *chance* to find out whether you're in a simulation or not. If the simulation also includes reincarnation or an afterlife, or you die for real upon dying in the simulation and there is no afterlife in reality, or you are a "soft simulation" which has no body in the real world and you just get deleted, then you won't learn anything.
[Also, why is it no longer giving me the option to post under the name I was using before? I don't have a google/blogger account and don't want to give out my LJ identity just anywhere.]
I've disabled anonymous commenting.
"And in order to increase the authenticity of such adventures, it's quite possible that posthumans may choose to temporarily suppress their psychologies and memories. Of course, they would recall the entire experiencing after re-awakening in their genuine reality as their authentic selves."
"These can experience the simulation with no memory of the past hive mind existence and then return to the hive mind with collection of life experiences to put into the collective."
I find this aspect of it very interesting because it requires you to consider how the brain stores experiences. I wonder what kind of results you would get from experiencing something with a part of yourself suppressed and then re-integrating that into your un-suppressed self later. It might seem as though you had the experiences of another person in your head, because the way you remember something depends on your own psychology at the time you experienced it. I'm not saying it wouldn't work at all, but it would pose some interesting challenges.
Regarding the same lines Odin Xenobuilder highlighted above - wouldn't you practically be creating a new person if you wanted to experience historical/fictional simulations without knowledge that they're not real?
Because I think to achieve that, you'd need to do a lot more cognitive editing than just suppressing parts of your psychology/memories; you'd need at least some additional fake memories/cognitive mechanisms.
And imagine that you wanted to experience what it was like to be in the trenches of WWI during combat. Unless your cognitive pre-programming included some strong buffers/editors, you might find the experience extremely unpleasant, especially if you had no idea it wasn't happening for real, that the people around you weren't actually dying, and that you'd still be alive afterward.
Wouldn't it be more like creating a new person who would be made to suffer needlessly than like playing a video game nowadays? I think a lot of the fun in playing such games is the knowledge in the back of your mind that it's actually not a real experience.
And I do realize that the line between editing your own personality and creating a new person is quite blurry.
I think Ancestor Simulations fly directly in the face of negative utilitarian thinking: you are deliberately creating consciousnesses and then inflicting staggering amounts of pain on them. It shouldn't matter that those consciousnesses aren't physically instantiated--the suffering of a simulated 13th century Frenchman shouldn't be any less important than the suffering of a real chimpanzee. It follows that EITHER the Simulation Argument holds OR the elimination of suffering is a universally shared transhuman belief, but not both.
Many times I've covered this topic in articles and videos, and while a cat only has nine lives, this argument can be simulated an infinite number of times. Seriously, if you have seen Simulation Hypothesis 1 and Simulation Hypothesis 2, there's very little here that will be new to you. I've resurrected this body of work on the basis of a new Sims game launching this month.
For those of you who are new to the simulation hypothesis argument, I'll provide a brief history.
Nick Bostrom, the Director of Oxford's Future of Humanity Institute, wrote a paper in 2003 called "Are You Living in a Computer Simulation?"
This paper posited that at least one of the three following propositions must be true:
1) Almost no civilization will reach a technological level capable of producing simulated realities.
For this argument to be correct it would mean that reaching a technological singularity is somehow restricted by the physical laws of the universe.
On the other hand, this argument could be true if the universe were infinite in size. This would also imply that our current observable universe is just a "local" membrane or bubble. Why is the size of the universe a deterrent? Well, if the universe is truly infinite, then it would take a computer with infinite computational ability to simulate the entire universe.
Yet, as stated, if the universe is sliced up into membranes, simulating the entire universe may not be needed to provide a realistic experience to the simulants.
2) Almost no civilization reaching the aforementioned technological status will produce a simulated reality, for any of a number of reasons, such as diversion of computational processing power to other tasks, ethical considerations about holding entities captive in simulated realities, etc.
This argument would imply that development of the technology is not undertaken due to ethical implications. I have yet to see this in real life. Historically, there has never been a technological road left unexplored due to ethical implications. This argument also assumes that all sentient lifeforms with the necessary technology to run a simulation of the universe are ethical and choose not to run them.
On the other hand, it can be seen that giving the gift of life, whether it's virtual or not, is still a gift. The lifeforms that possess the technology, most likely artilects, could see creating sentient life in simulation as a natural part of existence, especially if they themselves are simulated.
3) Almost all entities with our general set of experiences are living in a simulation.
This one is very difficult to argue, yet at the same time it is what most religions have told us about the true nature of the universe. In Hinduism, our physical existence is called maya, or illusion. Is the idea that we are simulations a concept that has been passed down from the ancients?
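The trilemma rests on a simple expected-fraction calculation from Bostrom's paper. Writing $f_P$ for the fraction of civilizations that reach a posthuman stage, $\bar{N}$ for the average number of ancestor simulations such a civilization runs, and $\bar{H}$ for the average number of pre-posthuman individuals per civilization, the fraction of all human-type experiences that are simulated is:

```latex
f_{\mathrm{sim}} = \frac{f_P \,\bar{N}\, \bar{H}}{f_P \,\bar{N}\, \bar{H} + \bar{H}}
                 = \frac{f_P \,\bar{N}}{f_P \,\bar{N} + 1}
```

If $f_P \bar{N}$ is large, then $f_{\mathrm{sim}} \approx 1$; so unless almost no civilization reaches posthumanity (proposition 1) or almost none run such simulations (proposition 2), almost all observers like us are simulated (proposition 3).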
Simulation Hypothesis video 1
Simulation Hypothesis video 2
@heresiarch: I agree, many of the scenarios I describe are ethically questionable. If true, those running the simulation(s) are creating a disproportionate amount of suffering; this could potentially be seen as a strike against the Simulation Argument and/or the idea of a posthuman future devoid of suffering.
As other commenters pointed out, the simulation scenarios are indistinguishable from godlike beings that create/dream/manipulate reality. So except for the vocabulary, the concept is not new, nor does it call for novel human responses.
If it ever becomes technologically feasible to become totally immersed in a simulation and retain control of the process, people will take to it the way they have taken to movies and video games. If it's done forcibly, it will be the next stage in refining torture and/or brainwashing.
Great post series exploring the intersection of technology and philosophy. There are many films dealing with this theme, or variations on it, my favorite being "The Thirteenth Floor".
Since this is a purely speculative issue, it's a lot like the old European monks debating how many angels can fit on the head of a pin. However, unlike that subject, the simulated world theory has immense value in instigating thought on how we should live in general.
From a Buddhist perspective, my answer is that regardless if we are "real" or A.I. agents in a simulation, our experiences, suffering, joy, desire, etc. are still real to us. The Buddhist practice path of understanding ourselves and experiencing the present moment as it is does not change.