How big can it be?

Eliezer Yudkowsky presents a thought experiment: is it better for 3^^^3 people to get a dust speck in their eye — a nearly undetectable annoyance for each of them — or is it better that one person be tortured for fifty years? Within a utilitarian framework, this is designed to highlight the question of whether utility aggregates forever. (If you reject a more fundamental premise, for instance that it’s meaningful to compare the suffering of different people, then this isn’t the most useful thought experiment to dwell on.) In other words, would an enormous number of occurrences of an almost infinitesimally bad thing be far worse than one occurrence of the worst thing that can occur on the scale of a single human life?

Someone* wanted to mock rationalists and picked this thought experiment as a target. The blogger “rubegoldbergsaciddreams” presents an illuminating critique of their misrepresentation. I’ve cut out a lot of the content to focus just on the parts pertaining to how big 3^^^3 is.

Paraphrasing the dust speck thought experiment as “torturing one person for five years is indisputably, literally, MATHEMATICALLY better than letting a billion people get a dust speck in their eyes” is misrepresenting the original claim on a literally unthinkable level. 3^^^3 is so much larger than a billion that the words “scope insensitivity” fail to describe the depth of the error being made here. If you were to scale the original question to “a billion people having dust specks in their eyes”, the amount of time the individual would be tortured for is literally undetectable. It’s less than the amount of time it takes light to travel the Planck length. It’s so much less than the Planck time that I actually cannot give any analogy sufficient to explain it. If it were not physically impossible to construct a machine capable of torturing someone for such a period of time, they would never feel it. Their neurons would not be able to fire, their body would not be able to react to the stimuli. The actual comparison would be “One person experiencing literally no adverse effects is better than a billion people getting a dust speck in their eyes”, which isn’t actually fair because WE LIVE IN A WORLD WHERE TIME AND SPACE ARE DISCRETE AND I HAVE TO ROUND EPSILON DOWN TO ZERO.

The volume of the known universe, measured in Planck lengths cubed (8.711375 * 10^187), is less than 3^^^3.

If you recorded everything ever spoken by a human to an audio file (estimated to take up about 42 zettabytes) and then flipped 336,000,000,000,000,000,000,000 coins, one for each bit of data, the probability that your coin flips would perfectly correspond to the data is greater than 1/3^^^3.

Actually, I can do better than that. There are ~10^82 atoms in the universe. If you assigned each of those atoms a random location anywhere in the universe (one of the 8.711375 * 10^187 Planck volumes), the probability that each of those atoms ends up exactly where it started is greater than 1/3^^^3.

If you see a number like 3^^^3, you should conceptualize it as being about the same size as infinity. You’ll still be wrong about it, you’ll still think of it as much, much smaller than it actually is, but you might at least avoid making the kind of mistake that [blogger] did here.

The reason such a huge number was used in this case was because it was needed to get the point across: the counterfactual to EY’s claim — that 3^^^3 dust specks in eyes is better than 1 person being tortured for 50 years — implies that there’s a discontinuity in utility aggregation, and if we really believe that such a thing exists, then it’s very important to discuss where that line should be drawn and why. EY believes that no such discontinuity exists, and that’s a perfectly defensible position: others disagree, and there are merits to their beliefs as well.

Okay, so 3^^^3 is bigger than some puny number like how many ways there are to rearrange every atom in the visible universe, but can’t we do better?

Before I launch into calculations, let’s give a primer on up-arrow notation. We’ll want to start with more familiar arithmetic operators. Multiplication is:

a*b = a+a+...+a (b appearances of a)

Exponentiation is:

a^b = a*a*...*a (b appearances of a)

Double-arrow operation (“tetration”) is:

a^^b = a^a^...^a (b appearances of a in a “power tower”)

Triple-arrow operation (“pentation”) is:

a^^^b = a^^a^^...^^a (b appearances of a)

Hopefully the generalization to more arrows is straightforward, but we won’t need more than three arrows in this post. One important point is that although addition and multiplication are associative and commutative, meaning the right-hand sides of their equations above can be evaluated in any grouping, exponentiation and subsequent operations are not. For example,

3^3^3 = 3^(3^3) = 3^27 = 7,625,597,484,987
≠ (3^3)^3 = 27^3 = 19,683

By convention, we evaluate exponents and subsequent operators from right to left if parentheses don’t explicitly call for another order. Also, numbers like 3^3^3 are more commonly written as 3^{3^{3}}, but since this gets unwieldy with more than one superscript, I’m going to stick to the caret notation.
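The recursive definitions above are easy to transcribe into code, at least for inputs tiny enough to fit in memory. Here is a minimal Python sketch; the function name `up` is my own, not anything standard:

```python
# up(a, n, b) computes a followed by n up-arrows followed by b;
# n = 1 is ordinary exponentiation. Only very small inputs finish.
def up(a: int, n: int, b: int) -> int:
    if n == 1:
        return a ** b
    if b == 1:
        return a  # a ↑^n 1 = a, for any number of arrows
    # Evaluate right to left: a ↑^n b = a ↑^(n-1) (a ↑^n (b-1))
    return up(a, n - 1, up(a, n, b - 1))

print(up(3, 2, 3))    # 3^^3 = 3^3^3 = 7625597484987
print((3 ** 3) ** 3)  # left-to-right grouping gives only 19683
print(up(2, 3, 3))    # 2^^^3 = 2^^4 = 2^2^2^2 = 65536
```

Even `up(3, 2, 4)` already asks for a number with about 3.6 trillion digits, and `up(3, 3, 3)` is the 3^^^3 of this post, so don’t expect this function to do more than illustrate the definitions.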

It’s harder for me to think about triple-arrow notation than double-arrow notation, so let’s start by reducing 3^^^3. By definition, 3^^^3 = 3^^(3^^3), so this is a power tower of 3’s that is 3^^3 = 3^3^3 = 7,625,597,484,987 levels high.

So that’s our target: 3^^7,625,597,484,987.

There are about 10^80 electrons, protons, and neutrons in the observable universe, and about a billion photons (mostly in the cosmic microwave background) for each of those. Let’s shuffle all of them, for about 10^89 particles total. The smallest physically meaningful region is thought to be a Planck volume; there are about 10^185 of these volumes in the visible universe. So the total number of ways to shuffle all these particles among all these positions is

(10^185)^(10^89) = 10^(185 * 10^89) ~ 10^10^91, with exponent 10^91 < 10^10^2.

Compare this with a base-3 double-arrow number: this already beats 3^^4 ~ 10^10^12, but the next step up is hopelessly out of reach. Indeed, we’re not doing so well.

What if we re-shuffle everything at every moment, in increments of the Planck time, since the Big Bang up to the present? There have been about 10^61 Planck times, so the total number of possible outcomes is

((10^185)^(10^89))^(10^61) = 10^(185 * 10^89 * 10^61)
~ 10^10^(2+89+61) = 10^10^152 ~ 10^10^10^2.

The exponent tower grew a little, but 3^^5 is approximately 10^10^10^12, already unimaginably larger than 10^10^152.
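To keep this bookkeeping honest, we can compare these monsters numerically by working with log10(log10(x)), which fits comfortably in a float. A rough sketch, using the estimates from above (10^185 Planck volumes, 10^89 particles, 10^61 Planck times); the variable names are mine:

```python
import math

log3 = math.log10(3)
t3 = 3 ** 27  # 3^^3 = 7,625,597,484,987

# log10(log10(x)) for each contender:
loglog_shuffle      = math.log10(185 * 10**89)   # one shuffle: 10^(185 * 10^89)
loglog_shuffle_time = math.log10(185 * 10**150)  # re-shuffled over 10^61 Planck times
loglog_t4 = math.log10(t3 * log3)                # 3^^4 = 3^(3^^3) = 10^(3^^3 * log10(3))
loglog_t5 = t3 * log3 + math.log10(log3)         # 3^^5 = 3^(3^^4), so log10 = 3^^4 * log10(3)

print(round(loglog_shuffle, 2))       # 91.27
print(round(loglog_shuffle_time, 2))  # 152.27
print(round(loglog_t4, 2))            # 12.56
print(loglog_t5 > 10**12)             # True: 3^^5 dwarfs both shuffle counts
```

Note that `math.log10` happily accepts Python’s arbitrary-precision integers, so the two shuffle counts never need to exist as floats.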

So far, we’ve been proceeding as follows: imagine a bunch of conditions with many independent outcomes, compound them, and look at the total number of outcomes. In that spirit, suppose we have N conditions, each of which has the same huge number of outcomes A — let’s imagine A = 3^^4 ~ 10^10^12, for instance. When we compound them, we get a total number of outcomes equal to ((...((A^A)^A)...)^A), with N appearances of A. Pulling the exponents down, we see that this is A^(A^(N-1)).
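That identity is easy to check on small numbers (a throwaway sketch; the real A = 3^^4 is of course uncomputable here):

```python
# Compounding N conditions, each with A outcomes, as described above:
# ((...((A^A)^A)...)^A) with N appearances of A collapses to A^(A^(N-1)).
def compound(A: int, N: int) -> int:
    total = A
    for _ in range(N - 1):  # N appearances of A in total
        total = total ** A
    return total

A, N = 2, 4
print(compound(A, N))       # ((2^2)^2)^2 = 256
print(A ** (A ** (N - 1)))  # 2^(2^3) = 256, the same
```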

Now consider one final scenario. Suppose there are A = 3^^4 ~ 10^10^12 different combinations of fundamental particle types that might be present at any point in space. (This is far, far larger than we have any reason to think.) Further, suppose a universe spans 10^10^12 Planck volumes and has existed for 10^10^12 Planck times (again, both far larger than in our universe). If we randomly pick a combination of particles to be present at every point and at every time, and repeat this for 10^10^12 universes, then N = 4, because we have conditions over particle types, positions, times, and universes. The total number of possible evolutions of all these universes is then

(3^3^3^3)^((3^3^3^3)^3) ~ 10^10^10^12
~ 3^^5.

If I had managed to compound this process over a preposterous number of conditions — not just particle types, positions, times, and universes, but over 10^10^12 different things — then the result would only have climbed one more level, to about 10^10^10^10^12 ~ 3^^6.
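A compact way to feel how stuck we are is to measure a number by how many times you can take log10 before it drops below 10. A rough sketch (the function and its name are mine):

```python
import math

def log_depth(x: float) -> int:
    """How many times can log10 be applied before x drops below 10?"""
    n = 0
    while x >= 10:
        x = math.log10(x)
        n += 1
    return n

print(log_depth(1e9))    # 1: a billion collapses after one log
print(log_depth(1e152))  # 2: the *exponent* of our compounded shuffle count
```

Each extra level of a power tower adds exactly one to this depth, so 10^10^152 has depth 3, 3^^6 has depth about 5, and 3^^^3, a tower 7,625,597,484,987 levels high, has a depth of roughly 7.6 trillion.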

Stated in terms of probabilities: if you populated every point in space with a random sample of particles drawn from an unimaginably large collection of options, in each of an unimaginably large number of universes, independently at every moment, then the chance that every one of them retraced exactly the history of our universe (with its apparently smooth trajectories of particles and simple laws of physics that don’t at all suggest “every point in space is occupied by something totally random at each moment, independently”) would be unfathomably more favorable than winning a bet at odds of 1:(3^^^3), even if our universe were unimaginably bigger and older than it actually is.

Let’s revisit the recommendation of rubegoldbergsaciddreams: “If you see a number like 3^^^3, you should conceptualize it as being about the same size as infinity. You’ll still be wrong about it, you’ll still think of it as much, much smaller than it actually is, but you might at least avoid making the kind of mistake that [blogger] did here.”

I tried to follow this advice; I really did. I took a number, 10^10^12, that is — let’s face it — about the same size as infinity. I raised it to a power of itself, then raised the result to 10^10^12 again, and repeated 10^10^12 times just for good measure. Even this near-infinite compounding of near-infinities doesn’t come close to reaching 3^^^3.

At this point, I’m out of ideas; I can’t describe a scenario in words, even a completely preposterous one, that gives me a number significantly bigger than 3^^6. And 3^^7,625,597,484,987 is so sickeningly larger than 3^^6 that I don’t know what to make of it. I knew it would be big, but I’m still flabbergasted.

*I don’t know anything about the blogger who made the original critique of Yudkowsky’s thought experiment, so I’ll assume they don’t want to be highlighted negatively by a complete stranger who has not even bothered to read their relevant post. Anyone who cares to find the original author can always do so.
