NKS Midwest

The computational capacity of the universe

In the afternoon session of NKS Midwest 2008, we had several remote talks by video conference. The first was by Seth Lloyd, speaking on his famous calculation, “the computational capacity of the universe”. He gave a nice background to the thermodynamic idea of entropy and how it tied in with Shannon information, and then the stages he went through in coming up with the calculation. Originally he had asked the question of the “ultimate laptop”: how fast could one possibly get a single kilogram of matter in a one-liter volume to compute, subject only to the limits of QM, arriving at the figure of 10^51 ops/second on 10^31 bits. The basic idea is then to extend this calculation to the knowable universe, and derive from it an upper bound on the logical operations (or distinctions possible, perhaps more accurately) in the history of the universe.
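The ops-per-second half of the ultimate-laptop figure can be sanity-checked from the Margolus–Levitin bound, which limits a system of average energy E to at most 2E/(πħ) operations per second. A rough back-of-the-envelope check (the use of the full rest-mass energy of 1 kg, and the constant values, are my own assumptions, not lifted from the talk):

```python
import math

# Physical constants (SI units)
hbar = 1.054571817e-34  # reduced Planck constant, J*s
c = 2.99792458e8        # speed of light, m/s

# Margolus-Levitin bound: a system of average energy E performs
# at most 2E / (pi * hbar) elementary operations per second.
mass = 1.0                    # the "ultimate laptop": 1 kg of matter
energy = mass * c**2          # rest-mass energy, roughly 9e16 J
ops_per_second = 2 * energy / (math.pi * hbar)

print(f"{ops_per_second:.2e} ops/s")  # on the order of 10^50 to 10^51
```

The result lands at a few times 10^50, which rounds up to the quoted 10^51 figure.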

When explaining entropy he got an audience question of a typical epistemological bent, claiming that information is a relation between a system and us, not something about the system itself. He shrugged this off with the statement that entropy is the information we don’t know about the system and information is the information we do know about it, but it is the total that is non-decreasing under the second law. He was, on the other hand, careful to distinguish mere information processing in the sense of logical operations from computation in the sense of universality. Someone made the claim that the universe is capable of universal computation, presenting as evidence our own practical computers: they wouldn’t be able to do it if the universe could not. There is a potential objection from cardinality, that universality needs an unbounded store, or at least the ability to add memory as needed. This relates to the cosmological question of whether the universe is infinitely extendable.

I thought he could have done a better job with this, as the point of his calculation is to show the universe to date is only capable of a finite number of operations, yet everything we see was able to result from it, including all of our practical computers and their actual flexibility. To me this shows that infinite cardinality is not actually required for everything we know, empirically, whether as practical computation or as physical complexity. Our mathematical idealizations in computation theory make use of a single countable infinity, which the real-world cases are not seen to require in practice. One can say this shows our computers or the universe are not universal, or one can say the mathematical definition of universality somewhat misses the mark. I prefer the latter, not being wedded to that abstraction.

He next explained the QM origins of the processing speed limit: both the Heisenberg limit on the time to go from one distinguishable state to another, related to the spread in energy, and the Margolus–Levitin limit, related to the absolute energy. Quantum computation has a tendency to operate at the Margolus–Levitin limit, which is the most one could expect, QM being posited of course. This gives around 10^123 ops since the big bang, maximum. The maximum memory calculation, on the other hand, comes from a volume calculation and the amount of information that can reside on the boundary of a given volume (evolution inside said boundary being deterministic, etc.). The finite volume comes with the qualifier “knowable” applied to universe – take a light cone from us into the past as far as the big bang, then forward from that region at light speed. Note that in principle this means a larger present region could access more memory than “here”, and a future one likewise more than “now”.
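A sketch of how a figure of that magnitude arises — this is my own order-of-magnitude reconstruction, not Lloyd's exact derivation — applies the Margolus–Levitin rate to the energy contained in a horizon-sized volume at roughly the critical density (taking the Hubble rate as ~1/t, a crude stand-in):

```python
import math

hbar = 1.054571817e-34  # reduced Planck constant, J*s
c = 2.99792458e8        # speed of light, m/s
G = 6.674e-11           # gravitational constant, m^3 kg^-1 s^-2

t = 4.35e17             # rough age of the universe, s (~13.8 Gyr)

# Critical density rho_c = 3 H^2 / (8 pi G), with H ~ 1/t as a crude estimate.
H = 1.0 / t
rho_c = 3 * H**2 / (8 * math.pi * G)

# Energy within a horizon-sized sphere of radius ~ c*t (order of magnitude only).
volume = (4.0 / 3.0) * math.pi * (c * t)**3
energy = rho_c * volume * c**2

# Total Margolus-Levitin operations over the age of the universe.
total_ops = 2 * energy / (math.pi * hbar) * t
print(f"{total_ops:.1e}")
```

This crude version lands around 10^121, within a couple of orders of magnitude of the quoted maximum — about as close as a back-of-the-envelope reconstruction can be expected to get.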

One then asks, what is the max entropy of this much energy (bounded below by the critical density, since the universe is seen to expand) in that much volume, and one gets 10^92 bits, which he noted are about the amount stored in the black body radiation (most of it, in other words). He speculated that maybe this figure can be pushed higher when dark energy is allowed for, but that was frankly a bit hand-wavy. He wanted to point out that these numbers seem to be similar to the universe's age over the Planck time, raised to various powers (3/2 for 10^92 and 2 for 10^123, one hopes, since that age is about 8×10^60 Planck times), but it isn’t really exact.
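The near-coincidence with powers of the age in Planck units is easy to check numerically (the age and Planck-time values below are rough modern figures, my own, and as noted the match is only approximate):

```python
t = 4.35e17          # rough age of the universe, s
t_planck = 5.39e-44  # Planck time, s

ratio = t / t_planck      # ~8e60 Planck times elapsed
ops_estimate = ratio**2   # compare with the ~10^123 ops figure
bits_estimate = ratio**1.5  # compare with the ~10^92 bits figure

print(f"ratio = {ratio:.1e}")
print(f"ratio^2 = {ops_estimate:.1e}")
print(f"ratio^(3/2) = {bits_estimate:.1e}")
```

The squares and 3/2-powers come out around 10^121–10^122 and a few times 10^91 respectively: the same neighborhood as the quoted figures, but off by an order of magnitude or so, consistent with his caveat that the correspondence isn't exact.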

Overall it was a fun talk, on an argument I had read before. He was clear, and the history-of-physics bits on Maxwell, Boltzmann, Gibbs, Planck, Heisenberg, Shannon, and Margolus were interesting.

Here is what I like most about this sort of argument, from my own philosophic perspective. So often we are presented with QM indeterminacy or continuous-value cardinality as reasons for expecting hypercomputation or a universe that exceeds all finite grasp, but in fact the theory itself has a quite different operational tendency, rigorously limiting the operational distinctions possible and requiring “a distinction” to have a clear physical meaning. Said clear physical meanings always have an actual “spannedness”, or positive measure, in both time and energy terms. In a walnutshell, if QM is true then the universe is rigorously finite in information-theoretic terms.

But two provisos have to be allowed to that statement, for the partisans of actual infinities (I think e.g. of Max Tegmark). One, Lloyd is speaking of a knowable (in principle) region of space-time and not making claims about inaccessible infinities beyond it, pro or con. And two, he nowhere addressed the many-worlds interpretation or any possible “contemporary orthogonals” it might posit. So one might allow a “knowable” between “the” and “universe” in the last sentence of the previous paragraph.

