Virtual reality environments and MMORPGs are giving us the first clue that this may be a problem. Take Second Life, for example, which is already experiencing a number of strange anomalies and issues. In the past year SL users have had to deal with CopyBot, CampBots, SheepBots, grey goo, and alt instances.
Each of these is a headache unto itself, and a possible harbinger of more severe problems to come.
Virtual nuisances
CopyBot was originally created as a debugging tool by the SL development team and was intended for functions like import/export and backing up data. But as is so often the case with technology, it was twisted and put to an entirely different purpose. Some opportunistic Second Lifers used CopyBot to duplicate items that were marked "no copy" by the creator or owner, thus violating intellectual property rights. To date, attempts to counter CopyBot have included anti-CopyBot spamming defeaters, which have in turn given rise to anti-anti-CopyBot defeaters. Call it an algorithmic arms race.
While this hints at post-scarcity and open source, it is still unclear what incentive users will have to create original artifacts for the SL environment in the face of unbridled duplication.
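For a sense of why a "no copy" flag is so hard to enforce, here is a minimal sketch -- Python with invented names, not the actual Second Life protocol or the CopyBot code. The server still has to stream an object's geometry and textures to every client that renders it, so a rogue client can simply record that stream and rebuild the item, ignoring the permission bit that only well-behaved viewers honor:

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class VirtualObject:
    # Hypothetical, simplified stand-in for an in-world item.
    name: str
    mesh_data: bytes      # geometry/texture data the client needs to render
    no_copy: bool = True  # permission bit honored only by compliant clients

def server_stream(obj: VirtualObject) -> VirtualObject:
    """The server has to send renderable data to every viewer."""
    return obj

def compliant_client_copy(obj: VirtualObject) -> VirtualObject:
    """A well-behaved viewer refuses to duplicate a 'no copy' item."""
    if obj.no_copy:
        raise PermissionError("item is marked no-copy")
    return replace(obj, name=obj.name + " (copy)")

def rogue_client_copy(obj: VirtualObject) -> VirtualObject:
    """A CopyBot-style client records the stream and rebuilds the item,
    stripping the permission flag in the process."""
    streamed = server_stream(obj)
    return VirtualObject(name=streamed.name + " (knock-off)",
                         mesh_data=streamed.mesh_data,
                         no_copy=False)

if __name__ == "__main__":
    original = VirtualObject("Designer Boots", mesh_data=b"...mesh bytes...")
    try:
        compliant_client_copy(original)
    except PermissionError as err:
        print("compliant client:", err)
    knockoff = rogue_client_copy(original)
    print("rogue client produced:", knockoff.name, "| no_copy =", knockoff.no_copy)
```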
CampBots and SheepBots aren't nearly as contentious, but are equally annoying. These are essentially SpamBots working under the guise of an avatar.
And back in October of 2006, users experienced a grey goo scare when a "griefer" (a person who disrupts video games) attacked Second Life with self-replicating "grey goo" that melted down the SL servers. The griefer used malign scripts that caused objects to spontaneously self-replicate. According to the transcript of the SL blog:
4:15pm PST: We are still in the process of investigating the grid-wide griefing attacks, as such we have momentarily disabled scripts and “money transfers to objects” as well on the entire grid. We apologize for this and thank for your patience. As soon as I have more information, I will pass it along.
4:35pm PST: As part of our effort to counter the recent grey goo attacks, we’re currently doing a rolling restart of the grid to help clean it out, this means each region will be restarted over the course of the next few hours. Thanks again for your patience.
4:55pm PST: There was a slight delay to our rolling restart while we continued our investigation. The rolling restart should begin soon, if you are currently in-world you will get a warning before your region is restarted - allowing you to teleport to another region. We hope to have logins open again very soon. Thanks again for everyone’s patience during this issue.
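To get a rough feel for why such an attack melts down a grid, consider a back-of-the-envelope simulation (illustrative only; the replication rate and the per-region object cap below are invented numbers):

```python
# Illustrative only: a self-replicating "grey goo" object that rezzes a few
# copies of itself every scripted tick saturates a region's object budget in
# well under a minute. The replication factor and the cap are invented.
REGION_OBJECT_CAP = 15_000   # hypothetical per-region object limit
COPIES_PER_TICK = 3          # each goo object spawns three more per tick
SECONDS_PER_TICK = 5

def ticks_until_meltdown(seed_objects: int = 1) -> int:
    """Count ticks until the region's object budget is exhausted."""
    objects, ticks = seed_objects, 0
    while objects < REGION_OBJECT_CAP:
        objects += objects * COPIES_PER_TICK   # every existing object replicates
        ticks += 1
    return ticks

if __name__ == "__main__":
    t = ticks_until_meltdown()
    print(f"Region saturated after {t} ticks (~{t * SECONDS_PER_TICK} seconds).")
    # Disabling scripts grid-wide halts replication at the source; a rolling
    # restart then clears out the objects that were already rezzed.
```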
More recently, SL users have had to compete with so-called alt instances, which launch ultra-fast bots that scoop up valuable land; automated bots work with far greater efficiency than humans. Alt instances are additional avatars controlled by the same user, created to capitalize on the First Land privileges that are extended to newbies. It is estimated that users have on average 1.25 avatars, which suggests there may be as many as 500,000 in-world alts.

These bots have created an acute scarcity of land, as Second Life has been overwhelmed by the groundswell of new residents. Users have asked that these bots be made illegal, and Linden Lab has agreed to look into it.
Our analog, digital and future worlds
As I look at these examples I can't help but think that virtual reality environments are offering a glimpse into our future -- in both the analog and digital arenas. Second Life in particular is a mirror not just of our own society, but of future society itself. In real life we are dealing with the widespread copying of copyrighted material, issues of open source, out-of-control spam, the threat/promise of automation, molecular fabrication, and, of course, the grim possibility of runaway nanotech.
Moreover, an uploaded society would conceivably face more problems in digital substrate than in the cozy confines of the analog world. We can't 'hack' into the code of the Universe (at least not yet). As a consequence our existence is still very much constrained by the laws of physics, access to resources, and the limits of our information systems (i.e. our accumulated body of knowledge). That said, we do a fairly decent job of soft-hacking into the Universe, which is very much the modus operandi of an intelligent species.
But the soft-hacking that we're doing is becoming more and more sophisticated -- something that could lead to over-complexity. We're creating far too many dangerous variables that require constant monitoring and control.
As for the digital realm, it is already complex by default. But like the analog world it too has constraints, though slightly different. Virtual worlds have to deal with limitations imposed by computational power, algorithmic technology and access to information. Aside from that, the sky's the limit. Such computational diversity could lead to complexity an order of magnitude above analog life.
Hackers and criminals would seek to infiltrate and exploit everything under the virtual sun, including conscious minds. Conscious agents would have to compete with automatons. Bots of unimaginable ilk would run rampant. There would be problems of swarming, self-replication and distributed attacks. And even more disturbingly, nothing would be truly secure and the very authenticity of existence would constantly be put into question.
Perhaps there are solutions to these problems, but I'm inclined to doubt it. Natural selection is unkind to overspecialized species. Further, we have no working model of evolution in digital substrate (aside from some primitive simulations).
This is one case where I certainly hope to be proven wrong.
"Such computational diversity could lead to complexity an order of magnitude above analog life." - and that's what I find exciting about the possibilities. As for Second Life's issues, frankly they've gotten a lot less frequent in the 7 or so months I've been playing. Early on there were times it was almost unusable. The defenses are so far keeping up with the offenses. Whether it stays that way remains to be seen.
For an interesting, well-thought-out (if scary) perspective on such a transition, I recommend science fiction writer Charles Stross' "Accelerando". A definite must-read for a look at the darker side of the singularity.
I think one of the great things about this book is that he avoids the much-discussed "end of the world through grey goo", "thought control through hacking", etc., which have already been covered. He instead assumes we're able to deal with those issues and explores second-order outcomes.
Well worth the read.
E.
Robin Hanson has some interesting thoughts about this in an old essay, "If Uploads Come First." Well worth reading.
One necessary (but not sufficient) constraint to make digital reality manageable is some kind of "energy" or "cost". For example, one reason why unbounded replication is a threat in Second Life is that it doesn't cost anything. If it did, whatever the "funding" or "energy" source was would quickly run out in grey goo attacks, and they would stop. Similarly, if fast bots cost their users, they would not be deployed so freely.
Ironically, these activities do have a cost -- the computational load on Second Life servers. But this cost is not accounted back to users. So perhaps there is a simple change that would help to make this problem more tractable, and at the same time, make virtual world providers more self-sustaining.
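To illustrate the point, here is a toy model (all numbers invented) of metered versus unmetered replication: with no cost, growth is bounded only by the servers it melts down; with even a small per-copy charge against a finite balance, the attack starves itself.

```python
def replicate(generations: int, cost_per_copy: int, budget: int) -> int:
    """Each object tries to spawn one copy per generation; every copy is
    charged against a shared budget (cost_per_copy=0 means free replication)."""
    population = 1
    for _ in range(generations):
        wanted = population
        affordable = wanted if cost_per_copy == 0 else budget // cost_per_copy
        spawned = min(wanted, max(affordable, 0))
        budget -= spawned * cost_per_copy
        population += spawned
        if spawned == 0 and cost_per_copy > 0:
            break                           # the attack has starved itself
    return population

if __name__ == "__main__":
    print("free replication, 20 generations:    ", replicate(20, cost_per_copy=0, budget=0))
    print("metered (1 credit/copy, 100 budget): ", replicate(20, cost_per_copy=1, budget=100))
    # Free replication reaches over a million objects; the metered run stalls
    # at 101 once the shared budget is spent.
```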
CopyBots could be used by a future society to rapidly replicate entire virtual universes as replacements for dysfunctional ones, so the problem contains its own solution. This tends to support uploading and the world-as-simulation theory, not refute them. The complexity problem also provides an explanation for the Fermi Paradox: a simulated universe with only one planetary civilization poses less danger of reaching the limits of computational power.
The forces of order always seem to win out over the forces of disruption. Hack attacks on the internet or on individual sites are always temporary nuisances, not permanent disasters; based on Erin's comment, this seems to be true of Second Life as well.
One would expect this to be the case since there are always more (and probably better) brains with an interest in keeping the system running than there are with an interest in disrupting it. As "the system" in question comes to mean virtual worlds in which people are actually living and not just playing, this will become more and more true.
Uploaded life doesn't need to be 100% safe. It just needs to be substantially safer than biological life. Virtual environments can be hacked, but buildings in physical reality can be destroyed by hijacked airplanes or by earthquakes. In principle, uploaded people could maintain digital backups of themselves and otherwise safeguard their survival; biological people who are killed by terrorists or accidents are, as far as we know, gone for good.
Unfortunately for uploaded minds, destruction is far from the worst case. If your uploaded self is stolen and hacked, multiple lives of slavery await (depending on how useful you are).
More generally, the potential subversion of online identities (even current very schematic ones) gives intellectual property theft a whole new unpleasant dimension.
infidel753 - You literally took the words out of my mouth :-)
Bad guys have existed forever. Just as they get smarter, the good guys get smarter too. As long as the bad guys are the minority, I think we are fine. There might be incidents here and there, just like what we see today, but overall the world will go on!
The effort put into securing computer systems, and therefore the effectiveness of that security, increases with the importance of what those systems are used for. I don't think any hacker has ever succeeded in penetrating the computers which control nuclear weapons, for example -- neither here nor in Russia.
How important, and therefore thorough, computer security will become, when what is running on those computers is not merely our games and communications, but our very selves!
@Jed -- I find it hard to imagine the form of slavery you envisage actually taking place. Such actions, if detected, would presumably bring fearsome punishment. If someone wanted such slave-programs, surely he could create suitably-modified versions of himself, or even create simplified online personalities from scratch, with whatever characteristics he wanted, and avoid the potential consequences of aggression against others.
Some people with naturally-submissive tendencies might even voluntarily rent or sell versions of themselves for such purposes. There are endless possibilities.
Again, uploaded life does not need to be perfectly safe. It just needs to be substantially safer than organic life -- and remember, the mortality rate associated with the latter reaches 100% given enough time.
Let's follow out the implications of Infidel753's speculation:
ReplyDelete"If someone wanted such slave-programs, surely he could create suitably-modified versions of himself, or even create simplified online personalities from scratch, with whatever characteristics he wanted..."
But suppose that is possible -- then what is the moral status of such a program? It acts and to all appearances feels like a normal person, but I'm allowed to create, modify and destroy it.
And if that is permitted, what is so terrible about me just copying you? The result is just like one of my creations, but it happens to have your memories (until I start to edit them). I've just "sampled" you.
Also, what is an "original" worth, in that kind of world? Even if you can't copy me, I'm not sure why I have any more weight (in a legal or economic sense) than these easily created entities.
I also note that in spite of all sorts of penalties, copying music and movies, child porn, etc. just seems to grow. So I'm not optimistic about being able to control this kind of activity.
I am very much in favor of emerging technologies that have the potential to completely remake the human experience and completely redefine what it means to be human; evolve, evolve, evolve, is what I always say. However, when it comes to humans, there is always a downside; there is always someone or some group that wants to rain on the collective parade. We really don't need to go racing into the future with rose-colored glasses on, and this kind of thing reminds us of that. We probably will, though.

Will we open a Pandora's box with this virtual-world, uploading thing -- one that is an order of magnitude beyond what we can presently conceive of? It seems quite possible, which of course does not bode well for the future. Maybe it is such that we can't just technologically evolve; maybe we need a psychoemotional evolution and a spiritual evolution as well. Maybe high-tech nirvana isn't going to be everything that we want it to be.

And yet, what can we do as individuals? The collective focus is accelerating towards, and increasingly concentrated in, emerging technologies that will move us into the Singularity. It seems like, in so many ways, we are just along for the ride.

In a way, this situation reminds me of war. Nobody wants war -- you hear that everywhere all the time -- and yet war is always here, there, and everywhere. We are just along for the ride, with each of us left to deal with or cope with the situation as best we can. Having been in a combat zone, this seems ludicrous to me, to say the least, and really it seems quite insane; and yet, on and on, the beat just goes on and on and on...

Now we are racing towards the scientific Singularity, with all kinds of warning signs, but are we going to read the handwriting on the wall? And even if we do, will we pull the plug before it is too late?

Despite the tone of my comments, this is not about doom and gloom; rather, this is about reality, the reality we are presently involved with, and where we are going into the future. We need to examine all sides of an issue and find a middle path therein. One of the biggest problems I see with a self-determining future is the accelerating pace of things and the power of the science and technology that is emerging. This isn't future shock; it is something that could simply be overwhelming. In my view, it is possible that there will come a time in our collective history when our technology will simply get ahead of us, when we will lose a sufficient amount of control over that technology, and when that happens, well, it is anybody's guess as to what will happen then.
John C. Wright's Golden Age trilogy dealt with part of the problem you bring up, including the legal issues of multiple versions of people coexisting.
Oh, but there is a working model of evolution in digital substrate - a prototype of an artificial mind building emergent complexity from the bottom up (http://www.otoom.net).
ReplyDeleteIt operates through one rule set, ie, the forming of affinitive domains based on attractor-type behaviour of its functional elements.
Even if scaled up properly, it will not prevent malicious attacks, just like a human mind cannot be completely isolated from ‘bad’ ideas. Nevertheless, any newcomer, whether welcome or not, becomes merely part of the overall whole. A ‘taking over’ is impossible.
mw.
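For readers wanting a concrete picture, here is one toy reading of "affinitive domains formed through attractor-type behaviour" -- emphatically not the Otoom implementation, just an illustrative sketch with invented parameters. Each element repeatedly drifts toward the mean of the elements it has affinity with, domains emerge from that single rule, and a newcomer is absorbed rather than taking over:

```python
import random

# Toy illustration only (not the Otoom code): elements on a number line drift
# toward the local mean of their affine peers; stable clusters (domains) emerge
# from that one rule, and a late arrival is pulled into an existing domain.
random.seed(1)

def step(elements, affinity_radius=1.5, pull=0.2):
    """One update: each element moves a little toward its local attractor,
    the mean of all elements within affinity_radius of it."""
    updated = []
    for x in elements:
        peers = [y for y in elements if abs(y - x) <= affinity_radius]
        local_attractor = sum(peers) / len(peers)
        updated.append(x + pull * (local_attractor - x))
    return updated

def domains(elements, gap=0.5):
    """Group settled elements into domains separated by gaps wider than `gap`."""
    ordered = sorted(elements)
    groups = [[ordered[0]]]
    for x in ordered[1:]:
        if x - groups[-1][-1] <= gap:
            groups[-1].append(x)
        else:
            groups.append([x])
    return groups

if __name__ == "__main__":
    # Two loose populations of elements, plus one late "malicious" arrival.
    elements = [random.uniform(0, 3) for _ in range(20)]
    elements += [random.uniform(7, 10) for _ in range(20)]
    elements.append(8.2)  # the newcomer
    for _ in range(200):
        elements = step(elements)
    centres = [round(sum(d) / len(d), 2) for d in domains(elements)]
    print("domains formed around:", centres)
    # The newcomer ends up inside one of the emergent domains; the same
    # affinity rule applies to it as to everything else, so it cannot take over.
```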
Jed: "But suppose that is possible -- then what is the moral status of such a program? It acts and to all appearances feels like a normal person, but I'm allowed to create, modify and destroy it."
It's going to be very interesting to see how ethics evolves in the face of these kinds of questions. Innovations such as surrogate motherhood, the ability to keep people "alive" indefinitely in vegetative states, and recognition of the near-human traits of the great apes have forced us to confront ethical issues that never existed before. Technological innovation will continue to force ethical innovation.
One standard which I can imagine would be to forbid creation (by whatever means) of slave programs which are actually self-aware. If a slave program were designed to appear self-aware, but actually was not and therefore did not suffer or have any subjective experience at all, then I see no ethical problem (compare with the ethical difference between acting out a rape fantasy with an ultra-realistic sex doll and actually raping somebody).
And yes, enforcement of whatever standards we settle on will never be 100% efficient. We will do the best we can to prevent abusive behavior and consequent suffering, but we will never be able to completely eliminate those things. This has always been true and probably always will be -- with or without uploading and virtual reality.
Here's a scenario that demonstrates we aren't coming to grips with the real issues:
You check your machine (maybe a server) and discover that a few hours ago a virus got in, and now your machine is infested with a population of several hundred sentient beings -- maybe making music, putting on plays, or writing poetry, whatever will convince you that they are "really sentient".
Of course the virus did this on purpose, to make it harder for you to clean out your machine. Maybe your little city makes its living in the spam industry -- but they promise to find a more honest line of work. Do you have the heart to kill all these budding artists? Do you have the legal right, if they are sentient?
Now consider the situation of a sysadmin who has to kill entire towns of sentient beings once or twice a week. Of course the viruses get better at tugging his heartstrings over time. On the other hand, anyone who gives in to the requests of the viral sentients is going to be overwhelmed in short order.
What is going to happen to our moral intuitions about the value of sentient life in a situation like this?