[via Charlie's Diary] Charlie Stross has a great post
about a possible computational limit in the universe. According to a paper by Lawrence M. Krauss and Glenn D. Starkman, there is a hard upper bound on the amount of computation that can ever be performed if -- as currently observed -- the universe is expanding at an accelerating rate:
The duo calculated that the total number of computer bits that could be processed in the future would be less than 1.35×10^120. This means that the effective information available to any observer within the event horizon of an expanding universe will be significantly less than the total so-called Bekenstein-Hawking entropy -- the entropy that is associated with a black hole -- in the universe.
As Stross correctly points out, there's no need to panic about the prospect of an upper bound:
Let's not rush around screaming just yet: the universe isn't about to halt on us. To put this in human terms, Hans Moravec expounds an estimate for the computational complexity of a human brain of around 10^14 ops/sec. I'm inclined to think he errs on the optimistic side by at least 3, and more likely 6-9, orders of magnitude, but it's hard to see a human brain requiring more than 10^17 MIPS to simulate accurately down to the synaptic level. Elsewhere, speculative posthumanists such as Robert Bradbury discuss the amount of computation you can do with the entire mass of a solar system -- it's only about 10^20-10^25 times higher than my upper (conservative) limit for a human brain. And by the time we move on to discussions of the computational bandwidth of a Kardashev Type III civilization some really big numbers are flying around. But we're still about 10^60 step-units below the upper bound derived by Krauss and Starkman. That figure of 1.35×10^120 corresponds to about 10^40 times the number of elementary particles in the observable universe. So we aren't going to run out of bits any time soon, at least not in human terms.
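The order-of-magnitude comparisons above are easy to sanity-check. Here's a minimal sketch in Python, using only the rough figures quoted in the post plus the standard estimate of ~10^80 elementary particles in the observable universe (that particle count is my assumption, not stated in the post):

```python
import math

# Rough order-of-magnitude figures quoted above.
bit_limit = 1.35e120   # Krauss-Starkman bound on total future bits processed
brain_ops = 1e17       # Stross's conservative upper bound for one human brain
particles = 1e80       # assumed: standard estimate of elementary particles
                       # in the observable universe (not from the post)

# The bound is roughly 10^40 times the particle count, matching the quote.
ratio = math.log10(bit_limit / particles)
print(f"bound / particles ~ 10^{ratio:.0f}")   # ~10^40

# Headroom between simulating a single brain and the cosmological limit.
headroom = math.log10(bit_limit / brain_ops)
print(f"bound / one brain ~ 10^{headroom:.0f}")
```

Working in log10 keeps the arithmetic honest at these scales: a solar system's worth of computation (10^20-10^25 brains' worth) barely dents the ~10^103 orders of headroom above a single brain.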