November 28, 2011

Sentient Developments Podcast for the week of November 28, 2011

The Sentient Developments Podcast for the week of November 28, 2011 has been posted.

Topics discussed in this week's episode include the current Adderall shortage in the United States and its rampant off-label use (including a discussion of its nootropic qualities), the question of whether or not SETI is scientific (in response to a recent Rationally Speaking Podcast episode), how to avoid an asteroid impact (reflecting on Phil Plait's recent TED talk), observing the lost tribes of the Amazon (in consideration of Scott Wallace's new book), and remembering biologist Lynn Margulis.

Tracks used in this episode:
  • Opeth: "Heritage"
  • Cynic: "Amidst the Coals"
  • Cynic: "Hieroglyph"
  • Steven Wilson: "Grace for Drowning"
Podcast Feed
Subscribe via iTunes

November 27, 2011

The 'Flash Rob' meme spreads

Bill Wasik of Wired reports on a disturbing new trend: Flash robs. And yes, it's exactly what it sounds like.
Many different types of crowd disturbance have bubbled up during 2011, but perhaps the oddest category has been the “flash mob robbery,” or “flash rob.”

It’s a fad that started in Washington, D.C. back in April, when around 20 people filed into a high-end jeans store in Dupont Circle and quickly made off with $20,000 in stock. Since then, the practice has spread — Dallas, Las Vegas, Ottawa, and Upper Darby, Pa. have all reported incidents — though the targets have gotten a bit more downscale, with most of the thefts taking place in convenience stores.

The latest crowd theft took place Saturday night at a 7-Eleven in Silver Spring, Md., and it fit the familiar pattern. Kids pour into the store, calmly help themselves to merchandise, and then stream out again:

November 26, 2011

AlterNet: Corporations Are Patenting Human Genes and Tissues -- Here's Why That's Terrifying

A medical ethicist explains the dark implications of corporate medical patents and the nightmarish scenario of our medical-industrial complex.

Wired: The Rise and Fall of Bitcoin

Must-read article in Wired about the rise and fall of bitcoin:
When Nakamoto’s paper came out in 2008, trust in the ability of governments and banks to manage the economy and the money supply was at its nadir. The US government was throwing dollars at Wall Street and the Detroit car companies. The Federal Reserve was introducing “quantitative easing,” essentially printing money in order to stimulate the economy. The price of gold was rising. Bitcoin required no faith in the politicians or financiers who had wrecked the economy—just in Nakamoto’s elegant algorithms. Not only did bitcoin’s public ledger seem to protect against fraud, but the predetermined release of the digital currency kept the bitcoin money supply growing at a predictable rate, immune to printing-press-happy central bankers and Weimar Republic-style hyperinflation.

Nakamoto himself mined the first 50 bitcoins—which came to be called the genesis block—on January 3, 2009. For a year or so, his creation remained the province of a tiny group of early adopters. But slowly, word of bitcoin spread beyond the insular world of cryptography. It has won accolades from some of digital currency’s greatest minds. Wei Dai, inventor of b-money, calls it “very significant”; Nick Szabo, who created bit gold, hails bitcoin as “a great contribution to the world”; and Hal Finney, the eminent cryptographer behind RPOW, says it’s “potentially world-changing.” The Electronic Frontier Foundation, an advocate for digital privacy, eventually started accepting donations in the alternative currency.
Entire article.
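
The "predictable rate" is baked into the protocol: a 50 BTC block reward that halves every 210,000 blocks, which caps the eventual supply at just under 21 million coins. Here's a minimal Python sketch of that issuance schedule (the parameters are bitcoin's; the script itself is just an illustration):

    def total_supply():
        reward = 50.0          # initial block reward, in BTC
        interval = 210_000     # blocks between halvings (roughly four years)
        supply = 0.0
        while reward >= 1e-8:  # stop once the reward drops below one satoshi
            supply += reward * interval
            reward /= 2.0
        return supply

    # The real protocol counts integer satoshis and rounds down, so the
    # true cap is a shade under this figure.
    print(f"Asymptotic supply: ~{total_supply():,.0f} BTC")  # ~21,000,000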

CrossFit and The Clock

As most CrossFitters know, CrossFit wouldn't be CrossFit without The Clock. It's what gives each workout the sense of urgency that it deserves, an aspect that's largely missing from other fitness regimens.

Take bodybuilding culture for example. Guys will do a set of reps, walk around for a bit, admire themselves in the mirror and then proceed to do their next set of isolation movements.

There's clearly something missing from this approach; there's not much being done to address the crucial fitness domains of stamina and cardiovascular/respiratory endurance. CrossFit, on the other hand, addresses this particular facet by having The Clock.

Harder, faster

By introducing time domains, CrossFit compels athletes to work harder and faster in order to complete the workouts as quickly as possible. Not only does this add a competitive element to each workout, it also provides a way for each individual to measure their own success and improvement over time. And just as importantly, regular efforts to bring down personal times help to improve both stamina and cardiovascular endurance.

This can be somewhat of a shock to those new to CrossFit. The idea of doing sets of Olympic weightlifting movements while on the clock is one of the most intimidating aspects of CrossFit -- but it's also what sets it apart from other fitness methodologies. It's one of the key reasons why it works.

Time domains

There are many ways in which The Clock can be utilized in CrossFit. Most workouts are 'for time,' meaning that all the sets and rounds have to be completed as quickly as possible. A particularly effective and valuable time schema is the Tabata workout, in which participants work as hard as they can for twenty-second intervals, typically followed by ten seconds of rest. Another technique is to have athletes do as many rounds as possible within a specific time domain, some as short as a minute.
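
For the programmatically inclined, these time schemas are simple enough to script. A minimal Python sketch of a Tabata timer (the classic protocol runs eight rounds of twenty seconds on, ten seconds off; adjust the parameters to taste):

    import time

    def tabata(rounds=8, work=20, rest=10):
        """Classic Tabata: alternating all-out work and short rest."""
        for r in range(1, rounds + 1):
            print(f"Round {r}: WORK ({work}s)")
            time.sleep(work)  # go as hard as you can
            print(f"Round {r}: REST ({rest}s)")
            time.sleep(rest)
        print("Time!")

    tabata()  # eight rounds: four minutes, start to finish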

Formats like these can be extremely motivating, if only because failing to hit the time targets can result in a longer and more arduous workout. I can remember a WOD in which we were required to do six box jumps (24") followed by squat-clean-to-thrusters (95 lbs). The WOD was finished only when 65 squat-clean-to-thrusters were completed.

What made this WOD particularly deadly was that the box jumps started on the minute, every minute. Failure to get a good quantity of thrusters in meant that the WOD kept dragging on, and you risked finding yourself constantly stuck in front of your box. This was one workout in which the clock had a profound impact on the nature of the workout and the level of intensity that had to be brought to it.
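
A quick back-of-the-envelope sketch (Python, with hypothetical pacing numbers) shows why that format punishes a slow pace so badly:

    import math

    def emom_minutes(total_reps=65, reps_per_minute=7):
        """Each minute opens with 6 box jumps; thrusters fill whatever
        time remains. Rough estimate of minutes to hit the rep total."""
        return math.ceil(total_reps / reps_per_minute)

    for pace in (10, 7, 5):  # hypothetical thrusters per minute
        print(f"{pace}/min -> roughly {emom_minutes(65, pace)} minutes")
    # 10/min -> 7 min; 7/min -> 10 min; 5/min -> 13 min

Drop from ten thrusters a minute to five and the WOD nearly doubles in length, which is exactly the death spiral described above.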

Indeed, most CrossFitters have a love/hate relationship with The Clock. There are times, say during a twenty-minute workout, when you've been working your ass off for what feels like a decent span of time, only to look at the clock and realize that a mere five minutes have transpired. It's easy to get demoralized at times like that, but hey, that's CrossFit; time to get your inner game in order and push yourself through.

Track your progress

Another consequence of The Clock is that it's often hard to avoid comparing yourself to others. This isn't always a bad thing; it can certainly help in placing your own performance and level of fitness in context.

But one thing I've learned is that, while it's important to look at other people's time in relation to your own, it's more important that you compare yourself to yourself. Otherwise, you have no sense of progress. Rather than obsess over your time in relation to others, it's a better idea to focus on competing against your previous efforts.

So, all ready to set a new PR?

3...2...1...Go!

Guardian: Is the end of the world really nigh?

Science is moving ever closer to understanding how, and when, humanity may be extinguished.

November 25, 2011

Rationally Speaking asks: Is SETI scientific?

I really enjoyed this excellent episode from the Rationally Speaking Podcast about the search for extraterrestrial intelligence (SETI) and whether or not it should be considered a true scientific endeavor:
Is the search for extraterrestrial intelligence, or SETI, solid science, pseudoscience, or something else, as Massimo [Pigliucci] argues in his book "Nonsense on Stilts"? What are the theoretical foundations and empirical evidence that justify a multi-decade research program, and what are its chances of succeeding? Have we learned anything thanks to SETI? Also, if the universe is infinite, what problems does this pose for utilitarian ethics?

November 24, 2011

Biologist Lynn Margulis, 1935-2011

Biologist Lynn Margulis has passed away. Margulis was a key figure in the development of symbiogenesis theory and scientific gaianism. I was fortunate enough to meet Margulis and attend one of her talks back in 2006 which I reviewed right here on Sentient Developments. I'm republishing the review now to commemorate her exceptional life and important body of work.

Lynn Margulis's talk at UWO
Originally published on Sentient Developments, March 26, 2006.

This past Saturday I attended a talk by evolutionary biologist Lynn Margulis at the University of Western Ontario (my alma mater).

Margulis is known for her work developing symbiogenesis theory -- the idea that organisms come about primarily through the merger of individual and separate organisms.

The talk was attended by about 150 people, mostly profs and grad students. Margulis arrived a little late and looked a bit frazzled from her hectic schedule. She was a bit hoarse and under the weather, but was openly pleased to see a standing room only audience on a rainy Saturday afternoon.

Her presentation was done primarily through PowerPoint, and a number of her videos were accompanied by music; you could tell that some in the audience felt her presentation to be a tad on the "pop-science" side. It was certainly not technical enough for this particular audience, and meant more for undergrads (which was fine by me because I was able to follow most of it). Margulis was also guilty of incessant name dropping, a habit that grew quite tiresome after a while. Some people took early opportunities to leave -- individuals who were probably hoping for something more advanced and informative.

That being said, her presentation did result in some ooohs and aaaahs from the audience, including videos of photosynthetic worms and an exquisitely camouflaged octopus.

Margulis, who was significantly influenced by early 20th century Russian biologists like Konstantin Mereschkowsky, fleshed out her theory in her 2002 book, Acquiring Genomes: A Theory of the Origins of Species. In this book, Margulis argued that symbiogenesis is a primary force in evolution. According to her theory, acquisition and accumulation of random mutations are not sufficient to explain how inherited variations occur. Instead, new organelles, bodies, organs, and species arise from symbiogenesis.

Margulis is also a pioneer of Gaia theory. Along with James Lovelock, Margulis has helped to popularize the concept and give it its modern form. Symbiogenesis theory clearly works well within gaianism, as it stresses the need to look at interactions among populations of organisms over given periods of time. During her talk, Margulis stressed the fact that individuals don't evolve but populations do. Her only qualm with Darwinism is her belief that the diversity of life arose not through competition but through organisms networking with each other.

Recent work on the human genome project has certainly added credence to Margulis's claim. During the talk, Dr. Shiva Singh noted that upwards of 41% of the human genome is composed of viral DNA. Margulis also noted that the human body is not one singular organism. Rather, like the Earth's ecosystem, the human body is a community of life. We have bacteria in our gut and critters on our skin. Without them, we couldn't survive. She noted the case of one individual who lacked the ability to maintain such a balance; it cost extreme sums of money to keep him alive before he eventually died.

During her career Margulis has had to consistently defend her ideas against the established brands of evolutionary biology, particularly the likes of Richard Dawkins and other neo-Darwinists. Margulis has also had to work particularly hard as a woman in a field largely dominated by men. She noted how at one time a physicist snidely remarked that her theory of symbiogenesis was something to be expected from a female biologist who would naturally accept processes of co-operation rather than competition.

But during her talk, Margulis dismissed the efficacy of using such terms as co-operation and competition when describing the processes of evolution. "They belong in an economics class or on the basketball court," she said. The actual processes at work, she argued, are far too complex to reduce to such simple "cultural" phrases.

Disappointingly, Margulis's argument was quite weak when it came to explaining the actual mechanisms behind symbiogenesis and how such information gets encoded at the genetic level. Nor did she offer much in the way of explaining how these relationships arise among populations of organisms such that they become common traits of particular species.

But Margulis certainly got me thinking about the dangers of reductionism and over-specialization when studying the processes of evolution. There are a multitude of mechanisms at work at all levels in the linear and inter-species cycles of evolution and the rise of individual species.

Natural selection, competition, co-operation, parasitism, mutualism, population genetics, fitness peaks, punctuated equilibrium, symbiogenesis, Gaia -- it's all good.


Is the Adderall shortage on account of rampant off-label use?

So, apparently there's an Adderall drought going on in the United States. Adderall is a prescription med used by people suffering from attention deficit disorder (ADD), attention deficit hyperactivity disorder (ADHD), and narcolepsy. It's also being increasingly used as an off-label cognitive enhancer and for recreational purposes (which I'll get to in just a little bit).

According to Moe Tkacik, an ADD sufferer, a widespread shortage of the popular pill is "distracting a nation of Adderall users." Given the significant number of users, this isn't something to be taken lightly. Disruptions to prescription medication, particularly for conditions like ADD or narcolepsy, can be the cause of considerable stress — particularly if the drug, like Adderall, has addictive qualities.

Tkacik claims that the problem can be attributed to corporate greed and the industry's indifference to the drug's alleged addictive qualities. Tkacik writes,
The best of the addiction-based business models are "addiction-proof" addictive drugs, and the Adderall story is at its core the saga of a nearly century-long quest for this unattainable ideal. Amphetamine salt—Adderall’s active ingredient—has been the subject of heady dispute within the medical profession since the drug company Smith, Kline and French began peddling the stuff in 1935, but for decades just about the only thing the medical community generally agreed about was that it was not addictive. The SKF sales department did, however, have a term for the loyalty it engendered among consumers: “stick.”

The dawn of the Drug War in the early 1970s eventually brought an end to the widespread use of those first-generation amphetamines, but naturally they "stuck" around in some circles. And then in the nineties, when upper-middle class America was stricken with a modern epidemic of ADD (and its “hyperactive” variant ADHD) necessitating widespread amphetamine use while simultaneously the nation’s truck stops and trailer parks began falling prey to the scourge of illegal amphetamines—and yet no one ever seemed to link the two—it appeared as though an element of cognitive dissonance about the stuff had also “stuck.” For the same reason crystal meth never found much "stick" as an ADD drug—although it's out there, under the brand name Desoxyn—Adderall users for the most part never identified as "addicts" before the nightmare shortages of this year.

You can map the spread of this rude (albeit unbearably drowsy) awakening on the message boards at ADDForums.com, whose administrators have painstakingly aggregated all Amphetamine Famine-relevant posts into a single "sticky" thread, starting with its early rumblings across flyover country in March, when the first unlucky ADD sufferers in pockets of Texas, Georgia and a few other states began to chronicle tales of the panicked multi-state manhunts and exorbitant ransoms to which they'd been subjected following the inevitable 15-minute pharmacy trip that wasn't. The real panic set in around mid-August, when a supply shock attributed to "back-to-school" season ravaged the suburbs.
Now, I'm sure that much of what Tkacik is claiming is true. But I'm very surprised that the piece fails to mention a rather important part of this issue: rampant off-label use. It seems obvious to me that a significant portion of the demand (and resultant shortage) of Adderall has to do with all those people who are taking it not because they suffer from any condition, but because they're taking advantage of its nootropic qualities; it's increasingly being used off-label as a cognitive enhancer.

While Adderall is being used off-label across multiple demographics, it's safe to say that students are the most common users. They're the ones looking for a cognitive edge. They're also more tapped into early-adopting neuroenhancing culture. And the numbers back up my suspicion.

Back in 2005, a University of Michigan study discovered that in the previous year 4.1% of American undergrads had taken prescription stimulants for off-label use. Other studies found even higher rates, including a 2002 study at a small college that found more than 35% of students using prescription stimulants nonmedically in the previous year.

And these studies are six and nine years old respectively. My own impression is that about 15 to 20% of students are probably using cognitive enhancers today—if not more.

Looking at the effects of Adderall, one can understand why this is happening. It has some rather remarkable qualities that lend themselves to this type of off-label use. In 2009 the New Yorker published a piece, Brain Gain: The underground world of “neuroenhancing” drugs, in which Margaret Talbot investigated Adderall and other drugs. In the article she writes about "Alex," a student who claimed that he wouldn't have gotten through college without it:
Alex recalled one week during his junior year when he had four term papers due. Minutes after waking on Monday morning, around seven-thirty, he swallowed some “immediate release” Adderall. The drug, along with a steady stream of caffeine, helped him to concentrate during classes and meetings, but he noticed some odd effects; at a morning tutorial, he explained to me in an e-mail, “I alternated between speaking too quickly and thoroughly on some subjects and feeling awkwardly quiet during other points of the discussion.” Lunch was a blur: “It’s always hard to eat much when on Adderall.” That afternoon, he went to the library, where he spent “too much time researching a paper rather than actually writing it—a problem, I can assure you, that is common to all intellectually curious students on stimulants.” At eight, he attended a two-hour meeting “with a group focussed on student mental-health issues.” Alex then “took an extended-release Adderall” and worked productively on the paper all night. At eight the next morning, he attended a meeting of his organization; he felt like “a zombie,” but “was there to insure that the semester’s work didn’t go to waste.” After that, Alex explained, “I went back to my room to take advantage of my tired body.” He fell asleep until noon, waking “in time to polish my first paper and hand it in.”
What's particularly fascinating, and to me not very surprising, is that the use of Adderall in this setting is not strictly about gaining an edge over other students. Rather, as Alex notes, it's about using neuropharma to tap into one's full potential. It's enhancement for the sake of being at one's best:
In fact, he said, “it’s often people”—mainly guys—“who are looking in some way to compensate for activities that are detrimental to their performance.” He explained, “At Harvard, at least, most people are to some degree realistic about it. . . . I don’t think people who take Adderall are aiming to be the top person in the class. I think they’re aiming to be among the best. Or maybe not even among the best. At the most basic level, they aim to do better than they would have otherwise.” He went on, “Everyone is aware of the fact that if you were up at 3 A.M. writing this paper it isn’t going to be as good as it could have been. The fact that you were partying all weekend, or spent the last week being high, watching ‘Lost’—that’s going to take a toll.”
Now, that said, Adderall is, like so many other seemingly "perfect drugs," not without side-effects. Some impairment is experienced:
Alex remains enthusiastic about Adderall, but he also has a slightly jaundiced critique of it. “It only works as a cognitive enhancer insofar as you are dedicated to accomplishing the task at hand,” he said. “The number of times I’ve taken Adderall late at night and decided that, rather than starting my paper, hey, I’ll organize my entire music library! I’ve seen people obsessively cleaning their rooms on it.” Alex thought that generally the drug helped him to bear down on his work, but it also tended to produce writing with a characteristic flaw. “Often, I’ve looked back at papers I’ve written on Adderall, and they’re verbose. They’re belaboring a point, trying to create this airtight argument, when if you just got to your point in a more direct manner it would be stronger. But with Adderall I’d produce two pages on something that could be said in a couple of sentences.” Nevertheless, his Adderall-assisted papers usually earned him at least a B. They got the job done. As Alex put it, “Productivity is a good thing.”
In addition, as noted by Tkacik, it also has a certain "stickiness" to it. It's a borderline addictive drug.

The main take-away from Talbot's piece is that cognitive enhancers, including Adderall, do very little to improve one's intelligence. At best they allow for extreme focus, clarity, and motivation, which, after a very productive outburst, can give the impression of intelligence. We're still a ways off from having the kinds of drugs, whether on- or off-label, that can actually boost IQ to the degree (and for the duration) that many transhumanists desire. (That said, a case can be made that increased articulateness and focus are a kind of intelligence enhancement, and I'm not one to argue against that.)

And as far as the Adderall shortage is concerned, it's also worth noting that it is increasingly being used as a party drug. Users typically grind the pills into powder and snort it to ensure greater and more rapid impact. Given its ability to focus the user's attention and increase mental clarity, it's not surprising that the drug is being used in recreational settings such as clubbing. Now that Ecstasy is increasingly coming to be seen as a neuro-toxin, I suspect that some people are replacing it with Adderall. It's a viable substitute for achieving a similar kind of "high" and without the negative long-term side-effects (as far as we know).

Which is quite fascinating if you think about it. The current generation of drug users is moving away from psychedelia and more towards function. Today, impairment and a sense of otherworldliness are not as desirable as increased mental function and focus, particularly in social circumstances. People want to be at their best in these settings—something Adderall can help with.

At any rate, Adderall's off-label use is happening because users are getting the desired results. So much so, in fact, that I'm sure it's contributing to the drug's shortage in the United States these days. The time has come for the FDA and other institutions to stop sitting on their hands when it comes to this kind of use. It's clearly an issue that's not going away any time soon. Demand is demand, and so long as safety and efficacy can be assured, both regulators and developers need to take notice.

In fact, when off-label use is through the roof like this (i.e. when a certain critical mass is surpassed in terms of usage), regulators should compel drug developers to conduct further investigations into the so-called unintended effects of the drug. This will not only ensure the eventual safety and efficacy of these drugs, but it may also provide an opportunity for Pharma to market an entirely new set of drugs. In such cases, user demand is satiated, safety/efficacy assured, and Pharma gets to sell a drug that was developed for something entirely unintended. It's a win all around.

But first things first: the concept of neuroenhancement needs to be normalized in society. We're not there yet, but it's happening.

November 22, 2011

Drug deaths now outnumber traffic fatalities in U.S.

Propelled by an increase in prescription narcotic overdoses, drug deaths now outnumber traffic fatalities in the United States, a Times analysis of government data has found.

Drugs exceeded motor vehicle accidents as a cause of death in 2009, killing at least 37,485 people nationwide, according to preliminary data from the U.S. Centers for Disease Control and Prevention.
Source.

Facebook page created for the Sentient Developments podcast

I've created a Facebook page for the Sentient Developments podcast. Your next steps: Click, Like, stay up to date, pass it on.

November 21, 2011

Phil Plait: How to defend Earth from asteroids



What's six miles wide and can end civilization in an instant? An asteroid - and there are lots of them out there. With humor and great visuals, Phil Plait enthralls the TEDxBoulder audience with all the ways asteroids can kill, and what we must do to avoid them.

I love the approach Plait recommends; it's like jiu-jitsu meets astro-engineering.
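
The reason a gentle push works is simple arithmetic: to first order, a small along-track velocity change shifts an asteroid's arrival position by roughly Δv × t, and orbital mechanics tends to amplify the effect over multiple orbits. A rough Python sketch with illustrative numbers of my own choosing:

    SECONDS_PER_YEAR = 3.156e7

    def miss_distance_km(delta_v_mm_s, lead_time_years):
        """First-order estimate: displacement ~ delta-v * lead time.
        Real deflections compound orbit after orbit, so treat this
        as a conservative lower bound."""
        dv_m_s = delta_v_mm_s * 1e-3
        seconds = lead_time_years * SECONDS_PER_YEAR
        return dv_m_s * seconds / 1000.0

    # A 1 mm/s nudge applied ten years before impact:
    print(f"~{miss_distance_km(1, 10):,.0f} km")  # ~316 km

Given enough warning time, a push too small to feel shifts the rock by hundreds of kilometres, and the amplifying effects of orbital mechanics can stretch that into a clean miss.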

Sentient Developments Podcast for the week of November 21, 2011

The Sentient Developments Podcast for the week of November 21, 2011 has been posted.

Topics discussed in this week's episode include the benefits of creatine, Jared Diamond's 1987 article on how agriculture was the "worst mistake in the history of the human race", the current state of lab grown meats, computational pathology, a review of the documentary "How to Live Forever", and a word (or two) on the pernicious de-radicalization of the radical future.

Podcast RSS feed. Subscribe via iTunes.

Tracks used in this episode:
  • Oneohtrix Point Never: "Replica"
  • The Advisory Circle: "Now Ends the Beginning"
  • Russian Circles: "309"
  • Hooray For Earth: "Pulling Back"

November 17, 2011

Creatine for mind, body, and longer life

Creatine molecule
Creatine is quickly becoming one of my favorite supplements, and not just because of the way it helps me in the gym. It's been shown that creatine can also be used as a nootropic and as a way to stave off potential neurodegeneration. Because earlier reports of damage to the kidneys and liver by creatine supplementation have now been scientifically refuted, creatine is becoming increasingly accepted as a powerful and multi-faceted daily supplement.

So what is it? Creatine is a nitrogenous organic acid that occurs naturally in vertebrates and helps to supply energy to all cells in the body, primarily muscle. It's also been shown to assist in the growth of muscle fibres. Creatine achieves this by increasing the formation of adenosine triphosphate (ATP): phosphocreatine stored in muscle donates a phosphate group to ADP, rapidly regenerating ATP during short bursts of intense effort. It is an osmotically active substance, so it pulls water into muscle cells. Creatine is naturally produced in the human body from amino acids, primarily in the kidney and liver, and is transported in the blood for use by muscles.

Back in the early 1990s it became common for bodybuilders, wrestlers, sprinters and other athletes to take creatine as word got out that it contributed to increased muscle mass and energy. Athletes began to consume two to three times the amount that could be obtained from a high protein diet. Creatine, which is typically bought in pills or flavored powders and mixed with liquid, increases the body's ability to produce energy rapidly. With more energy, athletes can train harder and more often, producing better results.

In fact, research shows that creatine is most effective in high-intensity training and explosive activities. This includes weight training and sports that require short bursts of effort, such as sprinting, football, and baseball. As a CrossFitter and an occasional user of creatine, I can certainly vouch for these effects. I believe that creatine is responsible for adding as much as five to twenty pounds to my lifts (depending on the kind of lift) along with an added boost of muscular endurance—two very desirable qualities for CrossFit athletes.

Recently I switched from being an occasional user of creatine (3000 mg per day, cycling monthly) to a daily low-dosage user (750 to 1500 mg per day, every day) while on a high-protein diet. I've done this for cost reasons while still hoping to take advantage of its benefits, which aren't just limited to the physical realm.
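
The cost logic is simple arithmetic. A quick Python sketch comparing monthly consumption under the two regimens (assuming, hypothetically, that "cycling monthly" means one month on, one month off):

    def grams_per_month(mg_per_day, days=30):
        return mg_per_day * days / 1000.0

    # Old regimen: 3000 mg/day, month on / month off,
    # averaged across the two-month cycle:
    old_avg = grams_per_month(3000) / 2   # 45 g per month
    # New regimen: 750 to 1500 mg/day, every day:
    new_low = grams_per_month(750)        # 22.5 g per month
    new_high = grams_per_month(1500)      # 45 g per month
    print(old_avg, new_low, new_high)

At the low end, the daily approach uses half the powder while keeping a steady level in the system.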

Indeed, creatine has been shown to have a significant impact on brain health. It's been shown to boost brain performance, including positive impacts on working intelligence and memory, both of which depend on cognitive processing speed. Back in 2003, a study showed that people who took creatine for six weeks scored better on tests measuring intelligence and memory than those who did not take it. And interestingly, some of the most significant cognitive benefits are experienced by vegetarians and vegans, probably on account of protein deficiencies (which have an impact on the body's ability to produce creatine naturally).

Moreover, while creatine can be used as a strength enhancer and a cognitive booster, it may also have an important role in the prevention and treatment of neurodegeneration. Creatine may offer protection to Alzheimer's patients, and it's also being used by Parkinson's patients as a way to slow the progression of the disease. These effects, combined with its beneficial impacts on strength and endurance (both important health factors for longevity), lead me to believe that creatine is an indispensable part of any life extension strategy.

Creatine can be found at most supplement stores. And now that it's available in pill format, it's become very easy and convenient to take. So give it a try and see if it works for you.

Jared Diamond: Agriculture the "worst mistake in the history of the human race."

Back in 1987 Jared Diamond wrote a piece for Discover magazine titled "The Worst Mistake in the History of the Human Race." While I don't necessarily agree with all his claims, Diamond brings up some interesting points that serve to reinforce the position held by those interested in the ancestral health movement and the Paleo diet. He contends that the advent of agriculture gave rise not just to diseases and epidemics, deep class divisions, and sexual inequality, but to poorer human health as well.

He writes:
One straightforward example of what paleopathologists have learned from skeletons concerns historical changes in height. Skeletons from Greece and Turkey show that the average height of hunter-gatherers toward the end of the ice ages was a generous 5' 9'' for men, 5' 5'' for women. With the adoption of agriculture, height crashed, and by 3000 B. C. had reached a low of only 5' 3'' for men, 5' for women. By classical times heights were very slowly on the rise again, but modern Greeks and Turks have still not regained the average height of their distant ancestors.

Another example of paleopathology at work is the study of Indian skeletons from burial mounds in the Illinois and Ohio river valleys. At Dickson Mounds, located near the confluence of the Spoon and Illinois rivers, archaeologists have excavated some 800 skeletons that paint a picture of the health changes that occurred when a hunter-gatherer culture gave way to intensive maize farming around A. D. 1150. Studies by George Armelagos and his colleagues then at the University of Massachusetts show these early farmers paid a price for their new-found livelihood. Compared to the hunter-gatherers who preceded them, the farmers had a nearly 50 per cent increase in enamel defects indicative of malnutrition, a fourfold increase in iron-deficiency anemia (evidenced by a bone condition called porotic hyperostosis), a threefold rise in bone lesions reflecting infectious disease in general, and an increase in degenerative conditions of the spine, probably reflecting a lot of hard physical labor. "Life expectancy at birth in the pre-agricultural community was about twenty-six years," says Armelagos, "but in the post-agricultural community it was nineteen years. So these episodes of nutritional stress and infectious disease were seriously affecting their ability to survive."

The evidence suggests that the Indians at Dickson Mounds, like many other primitive peoples, took up farming not by choice but from necessity in order to feed their constantly growing numbers. "I don't think most hunter-gatherers farmed until they had to, and when they switched to farming they traded quality for quantity," says Mark Cohen of the State University of New York at Plattsburgh, co-editor, with Armelagos, of one of the seminal books in the field, Paleopathology at the Origins of Agriculture. "When I first started making that argument ten years ago, not many people agreed with me. Now it's become a respectable, albeit controversial, side of the debate."

There are at least three sets of reasons to explain the findings that agriculture was bad for health. First, hunter-gatherers enjoyed a varied diet, while early farmers obtained most of their food from one or a few starchy crops. The farmers gained cheap calories at the cost of poor nutrition. (Today just three high-carbohydrate plants -- wheat, rice, and corn -- provide the bulk of the calories consumed by the human species, yet each one is deficient in certain vitamins or amino acids essential to life.) Second, because of dependence on a limited number of crops, farmers ran the risk of starvation if one crop failed. Finally, the mere fact that agriculture encouraged people to clump together in crowded societies, many of which then carried on trade with other crowded societies, led to the spread of parasites and infectious disease. (Some archaeologists think it was the crowding, rather than agriculture, that promoted disease, but this is a chicken-and-egg argument, because crowding encourages agriculture and vice versa.) Epidemics couldn't take hold when populations were scattered in small bands that constantly shifted camp. Tuberculosis and diarrheal disease had to await the rise of farming, measles and bubonic plague the appearance of large cities.
Quite obviously one can make a strong case that the agricultural revolution, while awful for human health, was profoundly beneficial to human civilization. Sure, agriculture and civilization have caused some problems, but we are consciously working to lessen the impacts of class divisions, sexual inequality, the ravages of disease, and our environmental impact on the planet. It's because of human civilization and its institutions that we have been able to develop science and technologies that are progressively helping us make this world a better place to live. Now, we have a considerable amount of work to do in this regard—and we may not succeed—but the overall humanitarian benefits that these technologies could bring in the future are substantial.

I'm not one to think that we should have stayed as hunters and gatherers. The human trajectory seems destined for something greater than both our Paleolithic infancy and our current post-industrial adolescence. Here's to hoping that humanity can reach an enlightened adulthood.

Oh, and in the meantime, eat Paleo.

Cynthia Kenyon: Experiments that hint of longer lives | TED


What controls aging? Biochemist Cynthia Kenyon has found a simple genetic mutation that can double the lifespan of a simple worm, C. elegans. The lessons from that discovery, and others, are pointing to how we might one day significantly extend youthful human life.

November 14, 2011

Sentient Developments Podcast for the week of November 14, 2011

The latest episode of the Sentient Developments podcast has been posted.

iTunes people can subscribe here. Or you can just subscribe to the RSS. You can download the episode directly here (mp3).

This week I talk about octopus intelligence, the rise of wrongful birth suits in Israel and elsewhere, and the latest news and findings on autism. I also reprise the talk I gave on designer psychologies at the H+ conference at Parsons The New School for Design in NYC earlier this year. Lastly, I discuss how religion works as a reproduction control system.

Music used in this episode:
  • "At Last" by Plaid
  • "Hours" by Tycho
  • "Ballad of Gloria Featherbottom" by Mux Mool

November 12, 2011

NS: The dope on mental enhancement


Susan Watts has penned an article for New Scientist in which she asks, "Yet another survey has revealed surprisingly large numbers of people using drugs to boost their mental powers. What should be done?" She writes:
So-called cognitive-enhancing drugs are usually prescribed to treat medical conditions, but they are also known for their ability to improve memory or focus. Many people buy them over the internet, which is risky because they don't know what they are getting. We also know next to nothing about their long-term effects on the brains of healthy people, particularly the young. But some scientists believe they could have a beneficial role to play in society, if properly regulated.

So who's taking what? The BBC's flagship current affairs show Newsnight and New Scientist ran an anonymous online questionnaire to find out. I also decided to try a cognitive enhancer for myself.

The questionnaire was completed by 761 people, with 38 per cent saying they had taken a cognitive-enhancing drug at least once. Of these, nearly 40 per cent said they had bought the drug online and 92 per cent said they would try it again.
In the article, Watts goes on to describe her experience on modafinil:
On the second day I felt more focused and in control and thought I performed better in the tests. That was the day I had been given modafinil. Rowe summed up my performance: "What we've seen today is some very striking improvements… in memory and, for example, your planning abilities and on impulsivity."
She also talked to transhumanist Anders Sandberg:
It's not just students who claim to find the drug beneficial. Anders Sandberg of the Future of Humanity Institute at the University of Oxford talks openly about using cognitive-enhancing drugs. He is about to start a study in Germany to compare the effects of a range of cognitive enhancers, including two hormones – ghrelin, which promotes hunger, and oxytocin, which is associated with empathy – to test their powers at what he calls "moral enhancement".

"Once we have figured out how morality works as an emotional and mental system there might be ways of improving it," he told me.
Read more.

November 7, 2011

Sentient Developments podcast for the week of November 7, 2011

The latest episode of the Sentient Developments podcast has been posted.

This week I talk about my recent trip to Burning Man.

iTunes people can subscribe here. Or you can just subscribe to the RSS. You can download the episode directly here (mp3).

November 6, 2011

Orion: Deep inside the mind of the octopus

It's long been suspected that some cephalopods, like the octopus, are highly intelligent and sophisticated animals. In fact, it's because of this suspicion that I've short-listed them, alongside the African grey parrot, as a candidate species for the IEET's Rights of Non-Human Persons program. Now the good news is that the science is increasingly backing up these assumptions. Check out this fascinating article in Orion: Deep Intellect: Inside the mind of the octopus. Highlights:
Only recently have scientists accorded chimpanzees, so closely related to humans we can share blood transfusions, the dignity of having a mind. But now, increasingly, researchers who study octopuses are convinced that these boneless, alien animals—creatures whose ancestors diverged from the lineage that would lead to ours roughly 500 to 700 million years ago—have developed intelligence, emotions, and individual personalities. Their findings are challenging our understanding of consciousness itself.
And:
Measuring the minds of other creatures is a perplexing problem. One yardstick scientists use is brain size, since humans have big brains. But size doesn’t always match smarts. As is well known in electronics, anything can be miniaturized. Small brain size was the evidence once used to argue that birds were stupid—before some birds were proven intelligent enough to compose music, invent dance steps, ask questions, and do math.

Octopuses have the largest brains of any invertebrate. Athena’s is the size of a walnut—as big as the brain of the famous African gray parrot, Alex, who learned to use more than one hundred spoken words meaningfully. That’s proportionally bigger than the brains of most of the largest dinosaurs.

Another measure of intelligence: you can count neurons. The common octopus has about 130 million of them in its brain. A human has 100 billion. But this is where things get weird. Three-fifths of an octopus’s neurons are not in the brain; they’re in its arms.

“It is as if each arm has a mind of its own,” says Peter Godfrey-Smith, a diver, professor of philosophy at the Graduate Center of the City University of New York, and an admirer of octopuses. For example, researchers who cut off an octopus’s arm (which the octopus can regrow) discovered that not only does the arm crawl away on its own, but if the arm meets a food item, it seizes it—and tries to pass it to where the mouth would be if the arm were still connected to its body.

“Meeting an octopus,” writes Godfrey-Smith, “is like meeting an intelligent alien.” Their intelligence sometimes even involves changing colors and shapes. One video online shows a mimic octopus alternately morphing into a flatfish, several sea snakes, and a lionfish by changing color, altering the texture of its skin, and shifting the position of its body. Another video shows an octopus materializing from a clump of algae. Its skin exactly matches the algae from which it seems to bloom—until it swims away.
Read the entire article.



Economist: Difference engine and the Luddite legacy

There's a good article in the Economist about the automation revolution that's in full swing: Difference Engine: Luddite legacy. I find it strange that more people aren't talking about this, instead choosing to gripe about economic globalization, or blaming current unemployment rates on either inept governments or greedy corporations. It's becoming increasingly obvious that technologies are replacing jobs in a way that's been predicted for nearly two centuries. The difference now, however, is that both blue-collar and white-collar jobs (i.e. knowledge workers) are in danger of being replaced by automation systems.

Some snippets from the article worth reading:
The conventional explanation for America's current plight is that, at an annualised 2.5% for the most recent quarter (compared with an historical average of 3.3%), the economy is simply not expanding fast enough to put all the people who lost their jobs back to work. Consumer demand, say economists like Dr Tyson, is evidently not there for companies to start hiring again. Clearly, too many chastened Americans are continuing to pay off their debts and save for rainy days, rather than splurging on things they may fancy but can easily manage without.

There is a good deal of truth in that. But it misses a crucial change that economists are loth to accept, though technologists have been concerned about it for several years. This is the disturbing thought that, sluggish business cycles aside, America's current employment woes stem from a precipitous and permanent change caused by not too little technological progress, but too much. The evidence is irrefutable that computerised automation, networks and artificial intelligence (AI)—including machine-learning, language-translation, and speech- and pattern-recognition software—are beginning to render many jobs simply obsolete.

This is unlike the job destruction and creation that has taken place continuously since the beginning of the Industrial Revolution, as machines gradually replaced the muscle-power of human labourers and horses. Today, automation is having an impact not just on routine work, but on cognitive and even creative tasks as well. A tipping point seems to have been reached, at which AI-based automation threatens to supplant the brain-power of large swathes of middle-income employees.

That makes a huge, disruptive difference. Not only is AI software much cheaper than mechanical automation to install and operate, there is a far greater incentive to adopt it—given the significantly higher cost of knowledge workers compared with their blue-collar brothers and sisters in the workshop, on the production line, at the check-out and in the field.
And this:
The process has clearly begun. And it is not just white-collar knowledge workers and middle managers who are being automated out of existence. As data-analytics, business-intelligence and decision-making software do a better and cheaper job, even professionals are not immune to the job-destruction trend now underway. Pattern-recognition technologies are making numerous highly paid skills redundant.

Radiologists, who can earn over $300,000 a year in America, after 13 years of college education and internship, are among the first to feel the heat. It is not just that the task of scanning tumour slides and X-ray pictures is being outsourced to Indian laboratories, where the job is done for a tenth of the cost. The real threat is that the latest automated pattern-recognition software can do much of the work for less than a hundredth of it.

Lawyers are in a similar boat now that smart algorithms can search case law, evaluate the issues at hand and summarise the results. Machines have already shown they can perform legal discovery for a fraction of the cost of human professionals—and do so with far greater thoroughness than lawyers and paralegals usually manage.
Read the entire article.