May 30, 2006

IEET HETHR Photostream

Here's a photostream of the shots I took at the IEET's recent Human Enhancement Technologies and Human Rights conference. You can check out my Flickr account here.

Some samples:





Back from Stanford

I got back from Stanford late yesterday. I attended the IEET's HETHR conference, which was a spectacular success. By the end of the 3-day event my mind was complete mush -- a bona fide case of data input overload. And a 25-hour day on Friday didn't help either.

I'll be blogging, podcasting and photostreaming the event, so look for that in the coming days.

May 19, 2006

Death and the brain

Recent end-of-life hullabaloos such as the Terri Schiavo case have revealed a public that's largely divided and somewhat confused as to what death is and when it should actually be declared. This issue is set to get increased attention as a) more people vie for increased control over their right to die, b) our medical sensibilities migrate increasingly toward a neurological understanding of what it means to be 'alive' in a meaningful sense, and c) people come to realize that the potential for cryonics and other advanced neural rescue operations will give rise to an information theoretic interpretation of when death should truly be declared.

It is customary to declare death when the heart stops beating. This only makes sense; in the past it was the practical thing to do given limited medical know-how. Moreover, before individuals could be sustained in comas or on life support, there was no need for an alternative conception of when it was appropriate to declare death.

Modern technologies have complicated this practice by introducing devices that keep people biologically alive even after their brains have essentially stopped working. Individuals 'living' in these persistent vegetative states are a particularly macabre sight: corpses that are hooked up to machines so that the blood can keep flowing.

Of course, most people don't quite see it that way. Many cling to the notion that life can't possibly come to an end until the heart stops, or that some kind of ensoulment or dignity is maintained until all biological functioning ceases. In many cases, despite irreversible damage to the brain (the word 'persistent' is used for good reason), people hold out for some sort of divine intervention or miraculous medical intercession.

These perspectives fail to take into account personhood-based conceptions of what it truly means to be alive. Unless one is capable of ongoing subjective engagement with the world, one is not a person. Conscious existence is a necessary condition for agency, and because the brain generates consciousness, it’s fair and reasonable to suggest that we are our brains. When a brain is damaged or malfunctioning beyond repair, then the person who owns that brain should really be considered deceased.

If, on the other hand, a cognitively impaired condition is not permanent (e.g. someone in a temporary coma) and the integrity of higher brain functioning has been maintained, a person should not be declared dead. There is still cause for hope because the faculties that can give rise to a conscious person are still intact.

This is what is referred to as an information theoretic interpretation of life and death. Functionalist and materialist conceptions of brain activity suggest that conscious individuals are the sum of their brain patterns. These patterns are expressed by tangible and measurable parts of the brain. Consequently, the ‘information’ that’s encoded in the brain and in constant flux is the person. As long as the information in the brain remains intact (i.e. the software), and provided the brain is functioning properly so that information can be computed and expressed (i.e. the hardware), a person should be considered alive – or in some cases, alive in the sense that consciousness and active personhood can be resumed at a later time.

Information theoretic death has deep implications for those interested in cryonics or the potential for mind uploading. Given the possibility for cryonic reanimation or a transferred existence into a different computational substrate, an alternative legal and medical definition of death would seem to be in order.

The argument goes as follows: If biological functioning stops for an individual, but the brain’s information can be perfectly restored and recovered at a later date, a person should not be considered permanently dead. Death, therefore, should only be declared when the information state of the brain is so disorganized that recovery is impossible.

Currently, when considering the status of an individual preserved in cryonic storage, it is not known beyond a shadow of a doubt that reanimation is impossible. In fact, with each passing year the hypothesis favouring such a possibility grows stronger. Practically speaking then, cryonauts are impermanently dead as they have a non-zero potential for resuming life in the future.

Consequently, I’m in favour of an immediate change to our sensibilities as they pertain to the declaration of permanent death -- but I realize that most of society isn’t quite ready for this just yet. It will take some time before people start to acknowledge the importance of preserving the brain’s informational essence beyond biological death.

The day will come, however, when information theoretic death will become the established and legal standard for declaring a person’s permanent death. At such a time it will be information theorists rather than medical doctors who will be the ones declaring when life has truly ended.


Official opening of uvvy island in Second Life: Transhumanist day

[via Giulio Prisco]

Official opening of uvvy island in Second Life: Wednesday June 7, 2006, 2 pm EST (8 pm in most European countries; this should be convenient for visitors from both the US and Europe).

The opening event will be a "Transhumanist Day", with presentations:

"Transhumanism: The Most Dangerous Idea?", by James Hughes (James Sleeper in SL).

"Transhumanist Technologies on the Horizon and Beyond", by Giulio Prisco (Giulio Perhaps in SL).

"Virtually Real Virtuality", by Philippe Van Nedervelde (Philippe Golding in SL).

See:
http://uvvy.com/index.php/Official_opening_of_uvvy_island_in_SL

May 15, 2006

Skype now free for phone calls in US and Canada

If the giant telecoms weren't shaking in their boots about Skype prior to today, they certainly are now:

Skype launches free call promotion
Skype, the Web telephone company, said on Monday it would allow consumers in the United States and Canada to make free phone calls, a promotional move that marks a new blow to conventional voice calling services.

The offer, which extends through the end of 2006, covers calls from computers or a new category of Internet-connected phones running Skype software making calls to traditional landline or mobile phones within the United States and Canada.

Previously, users of Skype, a unit of online auctioneer eBay Inc., were required to pay for calls from their PCs to traditional telephones in both countries. Calls from North America to phones in other countries will incur charges.

Skype already offers free calling to users worldwide who call from computer to computer.

May 14, 2006

Bloggers at the Singularity Summit

Thankfully, for those of us who were unable to attend the Singularity Summit at Stanford this past Saturday, several bloggers chronicled the event:

Mike Treder from CRN
Renee Blodgett from Down the Avenue
Dan Farber from ZDNET

May 13, 2006

Abstract for my Stanford talk

I've completed the first draft of the talk I'm going to give at the IEET's Human Enhancement Technologies and Human Rights conference at Stanford in a couple of weeks. The topic for my panel is "From Human Rights to the Rights of Persons," and I'll be speaking on Saturday May 27 at 4:30 alongside Jeff Medina and Martine Rothblatt.

The title of my presentation is, "All Together Now: Developmental and ethical considerations for biologically uplifting non-human animals." Here's the abstract in its current form:
As the potential for enhancement technologies migrates from the theoretical to the practical, a difficult and important decision will be imposed upon human civilization, namely whether or not we are morally obligated to biologically enhance non-human animals and bring them along with us into advanced cybernetic and postbiological existence. There will be no middle road; we will either have to leave animals in their current evolved state or bring as many sentient creatures as possible along with us into an advanced post-Darwinian mode of being. A strong case can be made that life and civilizations on Earth have already been following this general tendency and that animal uplift will be a logical and inexorable developmental stage along this continuum of progress. But tendency does not imply right; more properly, given the potential expansion of legal personhood status to other sentient species, it will follow that what is good and desirable for Homo sapiens will also be good and desirable for other sapient species. If it can be shown that enhancement and postbiological existence are good and desirable for humans, and conversely that ongoing existence in a Darwinian state of nature is inherently undesirable, then we can assume that we have both the moral imperative and assumed consent to uplift non-human animals.

May 12, 2006

Audiocast of Mark Walker's talk on superlongevity

The latest Sentient Developments audiocast is now available: a recording of Mark Walker's talk on superlongevity, held at the University of Toronto on April 20, 2006.




Lots of transhumanism in the news

There's been a bunch of transhumanist and futurist related items in the news lately:

Smarter than thou?: Stanford conference ponders a brave new world with machines more powerful than their creators (SF Gate)

The ideas interview: Nick Bostrom: John Sutherland meets a transhumanist who wrestles with the ethics of technologically enhanced human beings (Guardian)

Your Upgrade Is Ready: Evolution has done its best, but there's a limit to how many plug-and-play neural implants, supercharged blood cells, strong-as-steel bone replacements and mind-controlled PCs you can expect from randomly colliding natural forces. Wanna be Superman? Better call the engineers. (Popular Mechanics)

The new incredibles: Enhanced humans: They're here and walking among us: people with technologically enhanced senses, superhuman bodies and artificially sharpened minds. The first humans to reach a happy, healthy 150th birthday may already have been born. And that's just the start of it. Are you ready for your upgrade, asks Graham Lawton. (New Scientist)

Nanotech Policy Faces No Small Hurdles: Air purifiers, cosmetics, sports equipment, computers, clothing, bedding, household appliances, medical devices. Nearly every item of daily life could be made — and made better, say supporters — with nanotechnology. (Fox News)

Trends hint at a golden era of nanotechnology: Innovations like robotic blood cells portend a “golden era” of nanotechnology, writes Ray Kurzweil. (Science & Theology News)

May 11, 2006

Miraculous memory for mere mortals

If you're looking to significantly augment your memory skills, but don't have the patience to wait for a cybernetic memory implant, mnemonic techniques may be the answer.

With special memorization techniques, perfect memory (also known as photographic or eidetic memory) is not required to recall amazingly long strings of information -- a good thing, since eidetic memory is a freakishly rare (and, from an evolutionary standpoint, possibly maladaptive) cognitive condition. Moreover, memorization grandmasters themselves admit that powerful mnemonics are the keys to their success.

And we're not just talking about using Roy G. Biv to memorize the colour spectrum. With the right techniques, combined with some time and practice, the human brain au naturel is capable of some astounding feats of memory.

S. V. Shereshevski, for example -- a man said to have no special cognitive abilities -- reportedly memorized the first 31,811 digits of the mathematical constant pi using mnemonics. Others claim that regular schmucks like you and me can -- with about 250 hours of practice -- use mnemonics to go from memorizing 7 digits to more than 80.

How is this possible?

Memory is a funky thing. Aside from thinking awfully hard and closing our eyes real tight, what are we really doing when we try to pull a piece of data out from our memory banks? Much of our capacity for memory is latent; it's a fairly subconscious and passive activity.

Part of it is in accessing virtual lists that we've created in our minds. Data is loosely categorized and stored in "chunks" for somewhat on-demand, but often imperfect, retrieval. Over time, trivial information tends to lose its linkage to the conscious realm. It's still there in our brains; we've just forgotten that those memories are there and we don't make attempts to recall them. By contrast, more vivid and important memories are rarely forgotten. There's probably a very good evolutionary reason for this.

Another aspect to memory is how the visual cortex is involved. Memories have increased durability and are more readily fixed in one's mind when they are given a visual association. Consequently, memory techniques are often designed to take advantage of the interplay between the visual cortex and memory retrieval. For example, you will have a better chance at remembering a long number if you associate a picture with that number than if you simply try to remember the numerical string on its own.

In conjunction with this, teachers and memory experts suggest using what is called active recall. This is the learning practice in which memories are stimulated during the learning process itself. Using this technique, students are encouraged to overtly express the information learned (e.g. answering a question) rather than just passively absorbing the information (e.g. reading).

For more intense memorization tasks -- like the ones confronting the competitors at the World Memory Championships -- mnemonic peg systems are utilized. A peg list is a list of words that are pre-memorized and easily associated with a number or object. To rapidly memorize a list of arbitrary objects, each object is associated with the appropriate peg. The neat thing about this system is that a peg list only has to be memorized once and can be re-used any time a list of items needs to be memorized.

Here's a rhyming example of a peg list (from Wikipedia):

* 1 - gun --> Visualize the first item being fired from a gun
* 2 - shoe --> Visualize an association between the second item and shoes
* 3 - tree --> Visualize the third item growing from a tree
* 4 - door --> Visualize the fourth item associated with a door
* 5 - hive --> Visualize the fifth item associated with a hive or with bees
* 6 - bricks --> Visualize the sixth item associated with bricks
* 7 - heaven --> Visualize the seventh item associated with heaven
* 8 - plate --> Visualize the eighth item on a plate as if it is food
* 9 - line --> Visualize yourself fishing with the ninth item on your line
* 10 - hen --> Visualize the tenth item associated with a chicken

Once you've memorized each peg word with its associated number, you can use the list to remember the following grocery list of 10 items:

* Milk --> Picture a stream of milk being fired from a gun
* Eggs --> Picture an egg wearing shoes
* Butter --> Picture sticks of butter growing from a tree
* Bread --> Picture a door made from bread
* Catsup --> Picture bees flying from a catsup bottle
* Beer --> Picture a brick house with beer cans where the bricks should be
* Toilet paper --> Picture a roll of TP with angel wings and a halo
* Soap --> Picture a bar of soap on a plate -- yum
* Razor blades --> Picture yourself reeling in a razor blade as if it's a fish
* Batteries --> Picture a mechanical hen that runs on batteries
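For the programmers out there, here's a minimal Python sketch of the peg idea -- my own illustration, not code from any actual memory-training tool (the function name peg_prompts and the peg words, lifted from the rhyming list above, are just assumptions for the example). It pairs each item on an arbitrary list with its numbered peg and prints a visualization cue; the real work of forming vivid mental images still happens in your head.

# A minimal sketch of the rhyming peg system described above.
# The peg list is memorized once; any new list of items is then
# "hung" on the pegs by forming a vivid mental image for each pair.

PEGS = ["gun", "shoe", "tree", "door", "hive",
        "bricks", "heaven", "plate", "line", "hen"]

def peg_prompts(items):
    """Pair each item with its numbered peg and return visualization cues."""
    if len(items) > len(PEGS):
        raise ValueError("This simple peg list only covers 10 items.")
    return [f"{i + 1}. Picture '{item}' together with '{peg}'"
            for i, (item, peg) in enumerate(zip(items, PEGS))]

if __name__ == "__main__":
    groceries = ["milk", "eggs", "butter", "bread", "catsup",
                 "beer", "toilet paper", "soap", "razor blades", "batteries"]
    for cue in peg_prompts(groceries):
        print(cue)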

The memory grandmasters take peg lists like these to the next level, pre-memorizing custom peg lists that contain hundreds and even thousands of associations. These larger lists enable better visual associations and even allow for cognitive data compression techniques (it's like WinZip for the brain). Using these techniques, experts can memorize a shuffled deck of 52 cards in just under a minute. Other tasks at memory competitions include memorizing poems, names and faces, binary digits, dates, random words, and so on.
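To get a rough feel for that 'WinZip for the brain' claim, here's a back-of-the-envelope Python sketch -- my own simplification, not any competitor's actual system -- of why larger pre-memorized peg vocabularies act like compression: if each mental image stands for two or three cards instead of one, the number of images you must hold for a full deck drops sharply, at the cost of learning far more pegs up front.

# Rough arithmetic behind peg-list "compression" for card memorization.
# Assumes every combination of cards gets its own pre-memorized image
# (a simplification -- it ignores the fact that cards don't repeat in a deck).

import math

DECK = 52

for cards_per_image in (1, 2, 3):
    pegs_to_learn = DECK ** cards_per_image              # memorized once, up front
    images_per_deck = math.ceil(DECK / cards_per_image)  # recalled for each shuffled deck
    print(f"{cards_per_image} card(s) per image: "
          f"{pegs_to_learn} pegs to learn up front, "
          f"{images_per_deck} images per deck")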

As remarkable as these techniques are, however, the simple truth of the matter is that human memory on its own is rather unremarkable without these specialized mnemonics. It takes considerable dedication and patience to attain the level of skill used by these competitors.

And while today we use a number of external prosthetic memory devices to help us, like Palm Pilots and computers, they are clunky, tedious and time consuming. Cybernetic prosthetic devices, on the other hand, will be a welcome and elegant advance for human cognition. Unlike the psychological condition of eidetic memory -- an ability that can't be turned off and consequently causes great torment for the people who have it -- a cybernetic device could be activated only when a memory needs to be recalled.

It's hard to predict how memory implants will change human thinking and even consciousness itself, but it's fair to say that it likely won't be subtle.

For more mnemonic techniques, go here.


May 9, 2006

Art Caplan and a 'less fearful future'

"The genetic genie is out of the bottle. There is not much anyone can do to put it back nor, once we understand its potential for good, ought we to do so. This genie will, however, do the bidding of those who control it. To enjoy the benefits genetics offers, it will be up to you and me and our children to build a politics, media, marketplace and educational system strong enough to show the genie who is the boss. We must all - scientist and nonscientist alike - play god when it comes to genetics." -- Arthur Caplan, Science Anxiety: Toward a less fearful future, published in the Philly Inquirer

May 8, 2006

Ethical eats

During the past several years I've paid increasing attention to my food choices in an effort to eat more healthily. Guided mostly by the sage advice of Dr. Andrew Weil, I've become very careful about the foods I put into my body. But lately I've been just as concerned with the ethics of my food choices as I have been with the health issues.

For example, while I am a vegetarian, I do like to eat eggs (preferably those that are Omega-3 enriched), but I've always been uncomfortable with my choice knowing that eggs tend to come from factory farms. Recently, in a 'duh, open-hand-smack-to-the-forehead' moment, I realized that I have the option of buying eggs laid by free-range chickens. Clearly, my ethical eating habits have considerable room for improvement, but I'm getting there, and the options enabling me to eat more ethically are increasing all the time.

When I think about my vegetarianism and my food purchasing habits I focus on three distinct motivators: health, animal ethics, and the environment. Lately, however, the latter two considerations have been giving me great cause for concern, obliging me to re-evaluate my eating habits even further.

Consequently, it was with great interest that I caught Dale Carrico's recent blog entry about a new book written by Peter Singer and Jim Mason. In the book, titled The Way We Eat: Why Our Food Choices Matter, Singer and Mason discuss such things as corporate deception, widespread waste, and desensitization to inhumane practices in the context of ethical food choices. The book's main thrust is that "America's food industry seeks to keep Americans in the dark about the ethical components of their food choices." Needless to say, this is hardly an exclusively American phenomenon.

What the authors are suggesting is that most people are either oblivious, ambivalent, or in complete denial about the negative effects of their eating habits – and the food industry is largely responsible for both creating and perpetuating these feelings of disconnect.

Yet Singer and Mason are not just decrying the state of the food industry; they are also trying to come up with solutions. As noted in a recent Salon article about the book, some of these solutions are in tune with my own hopes for the future, including the growing of meat from brainless animals. But they also have some more radical ideas, including the neuroengineering of animals to alter their instinctual tendencies.

Like Carrico, I am unsure about this last idea. The suggestion that the psychologies of farm animals be adjusted to reduce their subjective sense of suffering is off-putting, mostly because it would do nothing to improve our relationship with animals, nor would it result in more humane farming practices. While I can understand why Singer and Mason would push for such an 'improvement' (it's a real and hard fix, after all), and while I run the risk of posing a slippery slope argument, I think such a strategy could open a Pandora's box of potential problems that could extend outward to other non-human animals and even humans themselves. Moreover, such a strategy would do nothing to alleviate the negative environmental impacts of factory farming.

There is one individual, however, who is pushing for long-term sustainable farming and an increase in ethical food options. He is an evangelical Virginia farmer named Joel Salatin who believes that a "revolution against industrial agriculture is just down the road." Similar to the hope that Big Energy will give way to smaller, localized and sustainable alternatives, Salatin envisions the end of mega-farms and supermarkets.

He argues on behalf of localized farming and refuses to ship anything out from his farm. He believes in 'relationship marketing,' and contends that eye-to-eye contact is necessary when a transaction is conducted between a farmer and a consumer. As quoted in Michael Pollan's new book, The Omnivore's Dilemma: A Natural History of Four Meals, he says, "Don’t you find it odd that people will put more work into choosing their mechanic or house contractor than they will into choosing the person who grows their food?" What Salatin believes is happening is the rise of alternative agriculture, with the mainstream splitting into smaller and smaller groups of like-minded people.

Pollan himself believes that there is more to food than just price and convenience; rather, "our relationship to food constitutes our most profound engagement with the natural world," he says. He sees the trip to the supermarket as a crime against the environment and oneself. As a recent Utne article about Pollan noted, when people start to become personally invested in what they eat they enter "a kind of landscape and kind of community."

When I think about these various perspectives and reflect on my own choices, I necessarily have to weigh the options. Being an informed consumer who genuinely cares about animals and the environment shapes a big part of my decision making. But equally, I have to watch my budget and not inconvenience myself beyond what I deem to be acceptable.

As the omnivorous Dalai Lama himself admits, most of us cannot live perfectly ethical lives, but we should strive to do the best we can and mete out as little harm as possible.


May 1, 2006

Review of Mamoru Oshii's Avalon (2001)

[Warning: this review contains lots of spoilers, but please, don't let that stop you.]

Back in 2001, Japanese anime director Mamoru Oshii went to Poland to direct the live action movie Avalon. Oshii is primarily known for his work in anime, most notably the acclaimed '90s science fiction classic Ghost in the Shell (1995) and its sequel, Ghost in the Shell 2: Innocence (2004).

Now, before I get too deep into this review, I just want to make it clear that from a production perspective Avalon is a borderline B movie. The film was created on a limited budget and lacks the polished look and crisp dialogue of most mainstream films, particularly those that come out of Hollywood. The look and feel of Avalon is similar to that of a made-for-TV movie, although many of the special effects are of superior quality. Also, the movie was a Japanese-Polish collaboration, which, given its Orwellian setting, lends it a unique authenticity.

That being said, like Ghost in the Shell, Avalon offers considerable food for thought.

Avalon is set in the near future, a world in which fully immersive virtual reality gaming is all the rage. Many players are able to earn a living through gaming, particularly in an ultra-violent war game called 'Avalon'. The game, however, is highly addictive and dangerous, causing some less skilled players to suffer severe and permanent brain damage. Those who meet this fate are referred to as the 'unreturned.' Because of these risks, the game has been outlawed by the state.

The main character, Ash, lives a boring, lonely and mundane life in the real world. Her life in the simulation, by contrast, is filled with action, danger, and fulfillment. Where her real life is repetitive, pointless and safe, her virtual life is dynamic, challenging and frightening. Avalon is a world in which people come to value their virtual lives over their real ones. The game-master himself is looked upon in near reverential terms and is unmistakably made to resemble a priest.

In the game itself, Ash is an elite player driven to complete level after level. Her motivation for doing so is not entirely clear, even to her, save for the hope that something greater awaits her in the 'next level.' Eventually, after the introduction of a rival player and the discovery of certain clues, Ash succeeds at finding an elusive hidden level.

As Ash transitions to the mysterious level, the sepia tones that dominated the film's appearance are washed away to reveal a full colour panorama. The sensation is much like in The Wizard of Oz when Dorothy opens the door in her black and white world and enters a new world filled with dazzling colour. Where Dorothy enters a world of fantasy and whimsy, Ash finds herself undergoing an existential paradigm shift.

Ash's previous reality, one that was draped in drab monotone, is immediately understood to represent a partially realized existence. The overwhelming sepia and lack of colour come to symbolize limitations to human and social capacity – limits of perception, cognition and sensation. Alternately, the new world, with all its visual richness and dynamism, opens up an entirely new set of possibilities for Ash.

Oshii is making a number of interesting analogies here, including a comparison of the past with the future. But more aptly, he is comparing life as it was in communist Europe to modern life in capitalist democracies. It is no accident that Oshii made this movie in Poland using native actors and Polish dialogue.

Prior to her entry into the final level, Ash's world was filled with Orwellian imagery. Bleak and eerie tones dominated the screen along with retro-futurist gadgetry reminiscent of such films as 1984, Brazil and Gattaca. Computer terminals were limited to 3D text displays. Books containing no text whatsoever filled shelves. People gathered at communal soup kitchens to eat gruel. Public transportation was dominated by shaky old-fashioned streetcars. The new world, by contrast -- Ash's mysterious new level -- was filled with billboards, streets bustling with people, modern cars and a rich cultural life (including a vivid performance by an actual orchestra).

The new level, however, also presented its challenges. Ash was instructed to hunt down a 'computer bug' -- a former player who escaped to the advanced level. The 'bug' turns out to be a former teammate of Ash's, Murphy, and when she finally confronts him he declares, “Reality is what we tell ourselves it is!” As instructed, she shoots the defector and transcends to yet another level -- another mode of existence -- but this time to the mythical Avalon itself.

In Oshii's Avalon, the comparison of a totalitarian system with that of a modern democratic one can be understood as a transition of simulations in the postmodernist sense. It is reminiscent of Baudrillard's cultural simulation as constructed by the media and the state itself.

Similarly, according to modal reality theory, any system that appears rational to an observer will be construed as real; what we as observers don't know, however, is what possibilities might lie beyond our own modal reality. In the Avalon war game, by advancing to the final level, Ash is given an opportunity to transcend her original modal reality and enter into one with entirely new possibilities.

The relevance to transhumanism is also apparent. Given the potential for human augmentation, and given the possibility of our own existential mode shift (i.e. the Singularity and virtual existences), humanity might find itself much like Dorothy and Ash, exiting a world of black and white and entering a world of undiscovered colour.

Ultimately, Avalon is a treatise on how we choose to perceive life and how perception is imposed on us by life itself. The movie raises questions about the authenticity of existence in any type of world and how environmental, social and biological constraints impact a person's sense of their subjective life.

And as always, Mamoru Oshii has tapped directly into the zeitgeist of our times.


Odds 'n sods

Whoa, Betterhumans got hacked last night! Aside from a few panicked moments, no harm was done. Here's a screenshot of what the site looked like for a while.

Rushkoff's latest equation: faith = illness. Reminds me of something I once wrote.

Michael Anissimov recently posted a pair of excellent articles on his blog: Our Low-Entropy Universe and You and The Boötes Void (you know, The Boötes Void -- that fun part of the universe that's 250 million light-years across and almost completely devoid of anything). If you think that's hard to grasp, try perceiving infinity.

So now that we know that universal constants aren't really constants, shouldn't we think about giving the term a new name?

A US senator is supporting Kurzweil and Joy's call for a 'Manhattan Project' to address bioterror and looming pandemics.

You can now use bioelectric pulses from an electric fish to help you relax.