Tracks used in this episode:
- Opeth: "Heritage"
- Cynic: "Amidst the Coals"
- Cynic: "Hieroglyph"
- Steven Wilson: "Grace for Drowning"
Subscribe via iTunes
Many different types of crowd disturbance have bubbled up during 2011, but perhaps the oddest category has been the “flash mob robbery,” or “flash rob.”
It’s a fad that started in Washington, D.C. back in April, when around 20 people filed into a high-end jeans store in Dupont Circle and quickly made off with $20,000 in stock. Since then, the practice has spread — Dallas, Las Vegas, Ottawa, and Upper Darby, Pa. have all reported incidents — though the targets have gotten a bit more downscale, with most of the thefts taking place in convenience stores.
The latest crowd theft took place Saturday night at a 7-Eleven in Silver Spring, Md., and it fit the familiar pattern. Kids pour into the store, calmly help themselves to merchandise, and then stream out again:
A medical ethicist explains the dark implications of corporate medical patents and the nightmarish scenario of our medical-industrial complex.
When Nakamoto’s paper came out in 2008, trust in the ability of governments and banks to manage the economy and the money supply was at its nadir. The US government was throwing dollars at Wall Street and the Detroit car companies. The Federal Reserve was introducing “quantitative easing,” essentially printing money in order to stimulate the economy. The price of gold was rising. Bitcoin required no faith in the politicians or financiers who had wrecked the economy—just in Nakamoto’s elegant algorithms. Not only did bitcoin’s public ledger seem to protect against fraud, but the predetermined release of the digital currency kept the bitcoin money supply growing at a predictable rate, immune to printing-press-happy central bankers and Weimar Republic-style hyperinflation.
Entire article.
Nakamoto himself mined the first 50 bitcoins—which came to be called the genesis block—on January 3, 2009. For a year or so, his creation remained the province of a tiny group of early adopters. But slowly, word of bitcoin spread beyond the insular world of cryptography. It has won accolades from some of digital currency’s greatest minds. Wei Dai, inventor of b-money, calls it “very significant”; Nick Szabo, who created bit gold, hails bitcoin as “a great contribution to the world”; and Hal Finney, the eminent cryptographer behind RPOW, says it’s “potentially world-changing.” The Electronic Frontier Foundation, an advocate for digital privacy, eventually started accepting donations in the alternative currency.
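The "predictable rate" mentioned above comes straight from the protocol's fixed issuance schedule: a 50-coin block reward that halves every 210,000 blocks, which is why the total supply converges on roughly 21 million bitcoins. Here's a minimal back-of-envelope sketch in Python (not the actual client code) of that geometric series:

```python
# A rough sketch (not Bitcoin's actual source) of the fixed issuance schedule:
# a 50-coin reward per block, halving every 210,000 blocks.
def total_supply() -> float:
    reward = 50.0              # initial block reward, in bitcoins
    supply = 0.0
    while reward >= 1e-8:      # stop once the reward falls below one satoshi
        supply += reward * 210_000   # coins created at this reward level
        reward /= 2                  # the scheduled "halving"
    return supply

print(f"Supply converges on ~{total_supply():,.0f} BTC")   # ~21,000,000
```

The real protocol counts in whole satoshis rather than floats, but the series is the same; no central banker gets to change it.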
Is the search for extraterrestrial intelligence, or SETI, solid science, pseudoscience, or something else, as Massimo [Pigliucci] argues in his book "Nonsense on Stilts"? What are the theoretical foundations and empirical evidence that justify a multi-decade research program, and what are its chances of succeeding? Have we learned anything thanks to SETI? Also, if the universe is infinite, what problems does this pose for utilitarian ethics?
The best of the addiction-based business models is the "addiction-proof" addictive drug, and the Adderall story is at its core the saga of a nearly century-long quest for this unattainable ideal. Amphetamine salt—Adderall’s active ingredient—has been the subject of heady dispute within the medical profession since the drug company Smith, Kline and French began peddling the stuff in 1935, but for decades just about the only thing the medical community generally agreed about was that it was not addictive. The SKF sales department did, however, have a term for the loyalty it engendered among consumers: “stick.”
Now, I'm sure that much of what Tkacik is claiming is true. But I'm very surprised that he failed to mention a rather important part of this issue: rampant off-label use. It seems obvious to me that a significant portion of the demand (and resultant shortage) of Adderall has to do with all those people who are taking it not because they suffer from any condition, but because they're taking advantage of its nootropic qualities; it's increasingly being used off-label as a cognitive enhancer.
The dawn of the Drug War in the early 1970s eventually brought an end to the widespread use of those first-generation amphetamines, but naturally they "stuck" around in some circles. And then in the nineties, when upper-middle class America was stricken with a modern epidemic of ADD (and its “hyperactive” variant ADHD) necessitating widespread amphetamine use while simultaneously the nation’s truck stops and trailer parks began falling prey to the scourge of illegal amphetamines—and yet no one ever seemed to link the two—it appeared as though an element of cognitive dissonance about the stuff had also “stuck.” For the same reason crystal meth never found much "stick" as an ADD drug—although it's out there, under the brand name Desoxyn—Adderall users for the most part never identified as "addicts" before the nightmare shortages of this year.
You can map the spread of this rude (albeit unbearably drowsy) awakening on the message boards at ADDForums.com, whose administrators have painstakingly aggregated all Amphetamine Famine-relevant posts into a single "sticky" thread, starting with its early rumblings across flyover country in March, when the first unlucky ADD sufferers in pockets of Texas, Georgia and a few other states began to chronicle tales of the panicked multi-state manhunts and exorbitant ransoms to which they'd been subjected following the inevitable 15-minute pharmacy trip that wasn't. The real panic set in around mid-August, when a supply shock attributed to "back-to-school" season ravaged the suburbs.
Alex recalled one week during his junior year when he had four term papers due. Minutes after waking on Monday morning, around seven-thirty, he swallowed some “immediate release” Adderall. The drug, along with a steady stream of caffeine, helped him to concentrate during classes and meetings, but he noticed some odd effects; at a morning tutorial, he explained to me in an e-mail, “I alternated between speaking too quickly and thoroughly on some subjects and feeling awkwardly quiet during other points of the discussion.” Lunch was a blur: “It’s always hard to eat much when on Adderall.” That afternoon, he went to the library, where he spent “too much time researching a paper rather than actually writing it—a problem, I can assure you, that is common to all intellectually curious students on stimulants.” At eight, he attended a two-hour meeting “with a group focussed on student mental-health issues.” Alex then “took an extended-release Adderall” and worked productively on the paper all night. At eight the next morning, he attended a meeting of his organization; he felt like “a zombie,” but “was there to insure that the semester’s work didn’t go to waste.” After that, Alex explained, “I went back to my room to take advantage of my tired body.” He fell asleep until noon, waking “in time to polish my first paper and hand it in.”
What's particularly fascinating, and to me not very surprising, is that the use of Adderall in this setting is not strictly for gaining an edge over other students. Rather, as Alex notes, it's about using neuropharma to tap into one's full potential. It's enhancement for the sake of being at one's best:
In fact, he said, “it’s often people”—mainly guys—“who are looking in some way to compensate for activities that are detrimental to their performance.” He explained, “At Harvard, at least, most people are to some degree realistic about it. . . . I don’t think people who take Adderall are aiming to be the top person in the class. I think they’re aiming to be among the best. Or maybe not even among the best. At the most basic level, they aim to do better than they would have otherwise.” He went on, “Everyone is aware of the fact that if you were up at 3 A.M. writing this paper it isn’t going to be as good as it could have been. The fact that you were partying all weekend, or spent the last week being high, watching ‘Lost’—that’s going to take a toll.”
Now, that said, Adderall is, like so many other seemingly "perfect drugs," not without side-effects. Some impairment is experienced:
Alex remains enthusiastic about Adderall, but he also has a slightly jaundiced critique of it. “It only works as a cognitive enhancer insofar as you are dedicated to accomplishing the task at hand,” he said. “The number of times I’ve taken Adderall late at night and decided that, rather than starting my paper, hey, I’ll organize my entire music library! I’ve seen people obsessively cleaning their rooms on it.” Alex thought that generally the drug helped him to bear down on his work, but it also tended to produce writing with a characteristic flaw. “Often, I’ve looked back at papers I’ve written on Adderall, and they’re verbose. They’re belaboring a point, trying to create this airtight argument, when if you just got to your point in a more direct manner it would be stronger. But with Adderall I’d produce two pages on something that could be said in a couple of sentences.” Nevertheless, his Adderall-assisted papers usually earned him at least a B. They got the job done. As Alex put it, “Productivity is a good thing.”
In addition, as noted by Tkacik, it also has a certain "stickiness" to it. It's a borderline addictive drug.
Propelled by an increase in prescription narcotic overdoses, drug deaths now outnumber traffic fatalities in the United States, a Times analysis of government data has found.
Source.
Drugs exceeded motor vehicle accidents as a cause of death in 2009, killing at least 37,485 people nationwide, according to preliminary data from the U.S. Centers for Disease Control and Prevention.
One straightforward example of what paleopathologists have learned from skeletons concerns historical changes in height. Skeletons from Greece and Turkey show that the average height of hunter-gatherers toward the end of the ice ages was a generous 5' 9'' for men, 5' 5'' for women. With the adoption of agriculture, height crashed, and by 3000 B. C. had reached a low of only 5' 3'' for men, 5' for women. By classical times heights were very slowly on the rise again, but modern Greeks and Turks have still not regained the average height of their distant ancestors.
Quite obviously one can make a strong case that the agricultural revolution, while awful for human health, was profoundly beneficial to human civilization. Sure, agriculture and civilization have caused some problems, but we are consciously working to lessen the impacts of class divisions, sexual inequality, the ravages of disease, and our environmental impact on the planet. It's because of human civilization and its institutions that we have been able to develop science and technologies that are progressively helping us make this world a better place to live. Now, we have a considerable amount of work to do in this regard—and we may not succeed—but the overall humanitarian benefits that these technologies could bring in the future are substantial.
Another example of paleopathology at work is the study of Indian skeletons from burial mounds in the Illinois and Ohio river valleys. At Dickson Mounds, located near the confluence of the Spoon and Illinois rivers, archaeologists have excavated some 800 skeletons that paint a picture of the health changes that occurred when a hunter-gatherer culture gave way to intensive maize farming around A. D. 1150. Studies by George Armelagos and his colleagues then at the University of Massachusetts show these early farmers paid a price for their new-found livelihood. Compared to the hunter-gatherers who preceded them, the farmers had a nearly 50 per cent increase in enamel defects indicative of malnutrition, a fourfold increase in iron-deficiency anemia (evidenced by a bone condition called porotic hyperostosis), a threefold rise in bone lesions reflecting infectious disease in general, and an increase in degenerative conditions of the spine, probably reflecting a lot of hard physical labor. "Life expectancy at birth in the pre-agricultural community was about twenty-six years," says Armelagos, "but in the post-agricultural community it was nineteen years. So these episodes of nutritional stress and infectious disease were seriously affecting their ability to survive."
The evidence suggests that the Indians at Dickson Mounds, like many other primitive peoples, took up farming not by choice but from necessity in order to feed their constantly growing numbers. "I don't think most hunter-gatherers farmed until they had to, and when they switched to farming they traded quality for quantity," says Mark Cohen of the State University of New York at Plattsburgh, co-editor, with Armelagos, of one of the seminal books in the field, Paleopathology at the Origins of Agriculture. "When I first started making that argument ten years ago, not many people agreed with me. Now it's become a respectable, albeit controversial, side of the debate."
There are at least three sets of reasons to explain the findings that agriculture was bad for health. First, hunter-gatherers enjoyed a varied diet, while early farmers obtained most of their food from one or a few starchy crops. The farmers gained cheap calories at the cost of poor nutrition. (Today just three high-carbohydrate plants -- wheat, rice, and corn -- provide the bulk of the calories consumed by the human species, yet each one is deficient in certain vitamins or amino acids essential to life.) Second, because of dependence on a limited number of crops, farmers ran the risk of starvation if one crop failed. Finally, the mere fact that agriculture encouraged people to clump together in crowded societies, many of which then carried on trade with other crowded societies, led to the spread of parasites and infectious disease. (Some archaeologists think it was the crowding, rather than agriculture, that promoted disease, but this is a chicken-and-egg argument, because crowding encourages agriculture and vice versa.) Epidemics couldn't take hold when populations were scattered in small bands that constantly shifted camp. Tuberculosis and diarrheal disease had to await the rise of farming, measles and bubonic plague the appearance of large cities.
So-called cognitive-enhancing drugs are usually prescribed to treat medical conditions, but they are also known for their ability to improve memory or focus. Many people buy them over the internet, which is risky because they don't know what they are getting. We also know next to nothing about their long-term effects on the brains of healthy people, particularly the young. But some scientists believe they could have a beneficial role to play in society, if properly regulated.
In the article, Watts goes on to describe her experience on modafinil:
So who's taking what? The BBC's flagship current affairs show Newsnight and New Scientist ran an anonymous online questionnaire to find out. I also decided to try a cognitive enhancer for myself.
The questionnaire was completed by 761 people, with 38 per cent saying they had taken a cognitive-enhancing drug at least once. Of these, nearly 40 per cent said they had bought the drug online and 92 per cent said they would try it again.
On the second day I felt more focused and in control and thought I performed better in the tests. That was the day I had been given modafinil. Rowe summed up my performance: "What we've seen today is some very striking improvements… in memory and, for example, your planning abilities and on impulsivity."
She also talked to transhumanist Anders Sandberg:
It's not just students who claim to find the drug beneficial. Anders Sandberg of the Future of Humanity Institute at the University of Oxford talks openly about using cognitive-enhancing drugs. He is about to start a study in Germany to compare the effects of a range of cognitive enhancers, including two hormones – ghrelin, which promotes hunger, and oxytocin, which is associated with empathy – to test their powers at what he calls "moral enhancement".
Read more.
"Once we have figured out how morality works as an emotional and mental system there might be ways of improving it," he told me.
Only recently have scientists accorded chimpanzees, so closely related to humans we can share blood transfusions, the dignity of having a mind. But now, increasingly, researchers who study octopuses are convinced that these boneless, alien animals—creatures whose ancestors diverged from the lineage that would lead to ours roughly 500 to 700 million years ago—have developed intelligence, emotions, and individual personalities. Their findings are challenging our understanding of consciousness itself.
And:
Measuring the minds of other creatures is a perplexing problem. One yardstick scientists use is brain size, since humans have big brains. But size doesn’t always match smarts. As is well known in electronics, anything can be miniaturized. Small brain size was the evidence once used to argue that birds were stupid—before some birds were proven intelligent enough to compose music, invent dance steps, ask questions, and do math.
Read the entire article.
Octopuses have the largest brains of any invertebrate. Athena’s is the size of a walnut—as big as the brain of the famous African gray parrot, Alex, who learned to use more than one hundred spoken words meaningfully. That’s proportionally bigger than the brains of most of the largest dinosaurs.
Another measure of intelligence: you can count neurons. The common octopus has about 130 million of them in its brain. A human has 100 billion. But this is where things get weird. Three-fifths of an octopus’s neurons are not in the brain; they’re in its arms.
“It is as if each arm has a mind of its own,” says Peter Godfrey-Smith, a diver, professor of philosophy at the Graduate Center of the City University of New York, and an admirer of octopuses. For example, researchers who cut off an octopus’s arm (which the octopus can regrow) discovered that not only does the arm crawl away on its own, but if the arm meets a food item, it seizes it—and tries to pass it to where the mouth would be if the arm were still connected to its body.
“Meeting an octopus,” writes Godfrey-Smith, “is like meeting an intelligent alien.” Their intelligence sometimes even involves changing colors and shapes. One video online shows a mimic octopus alternately morphing into a flatfish, several sea snakes, and a lionfish by changing color, altering the texture of its skin, and shifting the position of its body. Another video shows an octopus materializing from a clump of algae. Its skin exactly matches the algae from which it seems to bloom—until it swims away.
The conventional explanation for America's current plight is that, at an annualised 2.5% for the most recent quarter (compared with an historical average of 3.3%), the economy is simply not expanding fast enough to put all the people who lost their jobs back to work. Consumer demand, say economists like Dr Tyson, is evidently not there for companies to start hiring again. Clearly, too many chastened Americans are continuing to pay off their debts and save for rainy days, rather than splurging on things they may fancy but can easily manage without.
And this:
There is a good deal of truth in that. But it misses a crucial change that economists are loth to accept, though technologists have been concerned about it for several years. This is the disturbing thought that, sluggish business cycles aside, America's current employment woes stem from a precipitous and permanent change caused not by too little technological progress, but by too much. The evidence is irrefutable that computerised automation, networks and artificial intelligence (AI)—including machine-learning, language-translation, and speech- and pattern-recognition software—are beginning to render many jobs simply obsolete.
This is unlike the job destruction and creation that has taken place continuously since the beginning of the Industrial Revolution, as machines gradually replaced the muscle-power of human labourers and horses. Today, automation is having an impact not just on routine work, but on cognitive and even creative tasks as well. A tipping point seems to have been reached, at which AI-based automation threatens to supplant the brain-power of large swathes of middle-income employees.
That makes a huge, disruptive difference. Not only is AI software much cheaper than mechanical automation to install and operate, there is a far greater incentive to adopt it—given the significantly higher cost of knowledge workers compared with their blue-collar brothers and sisters in the workshop, on the production line, at the check-out and in the field.
The process has clearly begun. And it is not just white-collar knowledge workers and middle managers who are being automated out of existence. As data-analytics, business-intelligence and decision-making software do a better and cheaper job, even professionals are not immune to the job-destruction trend now underway. Pattern-recognition technologies are making numerous highly paid skills redundant.
Read the entire article.
Radiologists, who can earn over $300,000 a year in America, after 13 years of college education and internship, are among the first to feel the heat. It is not just that the task of scanning tumour slides and X-ray pictures is being outsourced to Indian laboratories, where the job is done for a tenth of the cost. The real threat is that the latest automated pattern-recognition software can do much of the work for less than a hundredth of it.
Lawyers are in a similar boat now that smart algorithms can search case law, evaluate the issues at hand and summarise the results. Machines have already shown they can perform legal discovery for a fraction of the cost of human professionals—and do so with far greater thoroughness than lawyers and paralegals usually manage.