10 is ten to the power of 1. 100 is ten to the power of 2. This can be written more concisely as 10^2. 1000 is 10^3. And so on. 10^150 is 1 followed by 150 zeroes.
This works the other way, as well. 0.1 – one tenth – is 10^-1. 0.01 is 10^-2. 10^-150 is zero, followed by a decimal point, followed by 149 zeroes, followed by 1.
If you want to multiply two powers of 10 together, you can simply add the exponents. Thus, 100 x 100 – that is, 10^2 x 10^2 – is 10^4, or 10000. This works regardless of the sign of the exponent. 10^80 x 10^-25 is 10^55.
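If you want to check that rule mechanically, here is a minimal Python sketch; it uses exact rational arithmetic so the comparison is not spoiled by floating-point rounding:

    from fractions import Fraction

    # Multiplying powers of ten adds the exponents: 10^a * 10^b = 10^(a+b).
    big = 10 ** 80                  # 10^80
    small = Fraction(1, 10 ** 25)   # 10^-25, kept exact
    assert big * small == 10 ** 55  # 10^80 * 10^-25 = 10^55
    print("10^80 * 10^-25 = 10^55")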
William Dembski says that the universal probability bound is 10^-150. What he means by this is that a specified event less probable than this value will not happen by chance. Unspecified improbable events happen by chance all the time – for example, if I toss a coin 500 times, I will obtain some sequence of 500 heads and tails whose probability of arising is about 10^-150 (one specific sequence out of 2^500 possibilities). This is an unspecified event. But suppose I write down a sequence of 500 heads and tails in advance, starting as follows:
HTHHHTTHTTHTHHTTTHTHHTTTTTTHTTHT... If I then toss a coin 500 times and get precisely this sequence of heads and tails, that is now a specified event.
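That "about 10^-150" figure comes straight from 2^-500; a couple of lines of Python make the conversion to a power of ten explicit:

    from math import log10

    # One specific sequence of 500 fair coin tosses has probability 2^-500.
    tosses = 500
    exponent = -tosses * log10(2)   # log10 of 2^-500
    print(f"P(specific {tosses}-toss sequence) = 10^{exponent:.1f}")
    # prints 10^-150.5, i.e. about 10^-150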
This value of 10^-150 is based on the number of measurable time steps (Planck intervals – the Planck time is the time it takes light to travel one Planck length, roughly 10^-44 seconds) that will occur over the history of the universe, multiplied by the total number of fundamental particles in the universe. In effect, what Dembski is saying is this: take a specified event so improbable that even if everything in the whole universe did nothing, for the entire history of the universe, except try to match it, the event would still be less likely than not to happen at random. Such an event is less probable than the universal probability bound – and in that case, the claim that it happened by chance is less plausible than the claim that it did not.
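The round-number figures usually quoted for the bound (take the exact values as illustrative assumptions rather than precise physics) multiply out like this:

    # Round-number figures usually quoted for Dembski's bound
    # (assumptions for illustration, not precise physics):
    particles = 10 ** 80             # fundamental particles in the observable universe
    planck_steps_per_sec = 10 ** 45  # time steps per second at Planck resolution
    seconds = 10 ** 25               # a generous allowance of time
    trials = particles * planck_steps_per_sec * seconds
    assert trials == 10 ** 150       # hence a probability bound of 10^-150
    print("maximum conceivable trials: 10^150")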
Go back to those coins. Imagine a sequence of 500 heads. The universal probability bound (UPB) says that if I toss a coin 500 times in a row and it comes up heads every time, it is reasonable to rule out “chance” as an explanation. In practice, it is unlikely that we would persist for 500 tosses anyway; most likely we would give up after fewer than 20 and assume the coin was loaded or that something sneaky was going on.
However, suppose that we have very fast computers producing genuinely random numbers. These could generate sequences of simulated heads and tails much faster than humans. If such a computer could do a million simulated “coin tosses” per second, it would expect to produce a run of 20 heads roughly once per second, since the probability of any given 20 tosses all coming up heads is about one in a million (2^-20, roughly 10^-6).
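That claim is easy to sanity-check by simulation. The sketch below counts disjoint runs of 20 heads in ten "seconds" worth of simulated tosses; the exact expected waiting time for such a run is 2^21 - 2, about 2.1 million tosses, so "roughly once per second" is the right order of magnitude:

    import random

    # Count disjoint runs of 20 heads in ten "seconds" of a million tosses each.
    random.seed(0)
    target, run, hits = 20, 0, 0
    for _ in range(10 * 1_000_000):
        if random.getrandbits(1):   # 1 = heads, 0 = tails
            run += 1
            if run == target:
                hits += 1
                run = 0             # start counting a fresh run
        else:
            run = 0
    print(f"runs of {target} heads in 10 million tosses: {hits}")  # expect ~4-5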
But the UPB says that no matter how many people you have tossing coins, and no matter how many simulated coin tosses your computers run or how fast they are, you will never see a sequence of 500 heads by chance. If you do, it is more likely that something in the computer has caused it to generate that specific sequence.
The same applies to monkeys writing Shakespeare. The perception has always been: “Well, if enough monkeys type quickly enough, then surely it's only a matter of time before Hamlet's soliloquy appears.” Okay, then, let's consider a brief excerpt, this one from Macbeth.
TOMORROW AND TOMORROW AND TOMORROW CREEPS IN THIS PETTY PACE FROM DAY TO DAY AND ALL OUR YESTERDAYS HAVE LIGHTED FOOLS THE WAY TO DUSTY DEATH
I make that 141 characters, from a range of 27 (A-Z plus space, to give the monkeys as much of a break as possible). How many possible sets of 141 characters are there? The answer is 27^141. That's around 10^200. So the probability of this sequence of letters being typed at random is the reciprocal of this – that is, 10^-200. No matter how many monkeys you have, how fast they type, or even how many times you get incredibly fast computers to generate 141-character strings at random, you will not see this specified phrase generated at random, because it lies beyond the universal probability bound of 10^-150.
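The arithmetic, for anyone who wants to reproduce it:

    from math import log10

    # Probability that 141 specific characters from a 27-symbol alphabet
    # (A-Z plus space) are typed uniformly at random: 27^-141.
    length, alphabet = 141, 27
    exponent = -length * log10(alphabet)
    print(f"P = 27^-{length} = 10^{exponent:.1f}")  # about 10^-201.8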
Now, the subtle thing here is the relationship between 10^-150 and 10^-200. People look at these two numbers and forget what those exponents mean. It doesn't sound that far from one to the other: people have in mind the difference of 50 between the exponents, and subconsciously think that, amid all that improbability, something “only 50 times less probable” is neither here nor there. But the gap is not a factor of 50; it is a factor of 10^50.
But suppose a specified event has a probability of 10^-151. That means it is ten times less probable than an event with a probability of 10^-150. It means you would need ten universes, all doing nothing but trying out random events, to reach the improbability boundary of the event.
Take Macbeth's words above. The probability of this string being generated at random is such that you would need 10^50 universes, all doing nothing but generating random strings throughout their histories, for this string to be likely to appear.
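The gap between two powers of ten is itself a power of ten, which is easy to make explicit:

    # The shortfall against the bound is a difference of exponents:
    # an event of probability 10^-p needs 10^(p-150) universes' worth of trials.
    upb = -150
    for p in (-151, -200):
        universes = 10 ** (upb - p)
        print(f"10^{p}: {universes} universes of trials")  # 10, then 10^50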
So what's the point of this? Well, I suppose what I'm getting at is that specified information is special. You don't need much of it to leave any random process that could be conceived of in the universe floundering. And yet, there's a lot of it about.