## Writing Big Numbers, Practically

I was reading a book yesterday called *The Emergence of Number* by J. N. Crossley, and it got me wondering: Is there a way to write any natural number, no matter how large, inside a fixed, finite writing space (say a 1 by 1 cm square) using only a single notational convention?

For example, you could write the number $100$ as $10 \times 10$, and that would be one convention. Or you could write it as $10^2$, and that would be another. Clearly the first wouldn’t work: it would eventually lead me to a number that runs over the side of my 1 x 1 box. For example, I might get ${10}\times{10}\times{10}\times{10}\times{10}\times{10}\times{10}\times{10}\times{10}$. What about the second? For example, I might get something like $10^{10^{10^{10^{10^{10^{10^{10}}}}}}}$. Depending on the ratio of the exponent’s size to the size of the base numeral, it may be possible to answer yes to my question: if each level of the tower shrinks by a fixed factor, the tower’s total height is a convergent geometric series, so it never outgrows the box. There is a practical problem, though. Once you get smaller than the atomic scale, how would you represent the succeeding exponents? Even assuming that you could reduce the size indefinitely, limiting toward zero, this answer seems unsatisfactory. It’s like proposing to make an automobile more aerodynamic by simply shrinking it to the size of a marble. Well, it’s not really useful for driving anymore, so what’s the point?
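The shrinking-exponent idea can be made concrete: if each level of the tower is drawn at a fixed fraction of the level below it, the heights form a geometric series with a finite limit. A toy sketch in Python (the base height and shrink ratio are made-up illustration values):

```python
# Toy sketch: if each exponent level is drawn at a fixed fraction r < 1 of
# the level below it, the total height of a k-level tower is a partial
# geometric sum, bounded by h / (1 - r) no matter how many levels you stack.

def tower_height(h: float, r: float, levels: int) -> float:
    """Total height of `levels` stacked numerals: h + h*r + h*r**2 + ..."""
    return sum(h * r**i for i in range(levels))

base_height = 1.0  # height of the base numeral in cm (hypothetical)
ratio = 0.5        # each exponent half the size of the numeral below it

for k in (1, 5, 10, 50):
    print(k, tower_height(base_height, ratio, k))
# heights approach, but never exceed, 1 / (1 - 0.5) = 2 cm
```

So in this idealized model an arbitrarily tall tower fits in a 2 cm strip, which is exactly why the atomic-scale objection below is the real obstacle rather than the geometry.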

I wonder, is there a convention which automatically rescales as the value of the number in question increases? Besides the obvious, of course, which is to simply invent a new convention whenever the number hits the side of the box. My gut feeling is no, such a convention couldn’t exist. Maybe an evolutionary algorithm of some kind would work, but that seems to me to be stretching the rules of the problem: it’s not really deterministic in the everyday sense. I couldn’t rederive such a convention, for instance.

What if we alter the question so that an arbitrary, finite number of conventions can be used? Is it possible then? I don’t think so. We’d have to cycle through the conventions eventually. Even if we take the set of conventions and derive new conventions combinatorially (i.e., cycle once through the $n$ conventions one by one, and when you return to the one you started with, combine it in some way with a second convention to form convention $n+1$), we’d eventually run out of combinations.
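For what it’s worth, the pessimism here can be put in counting terms: whatever the conventions are, each one can only produce finitely many distinguishable marks in the box, so finitely many conventions still cover only finitely many numbers. A toy count (the values of $m$ and $N$ are made up for illustration):

```python
# Toy count: m conventions, each able to render at most 2**N distinguishable
# pixel patterns in the box, cover at most m * 2**N numbers in total --
# still finite, so almost every natural number is left out.

m = 5    # hypothetical number of conventions
N = 16   # hypothetical pixel budget of the 1 x 1 cm box
total_representable = m * 2**N
print(total_representable)  # 327680
```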

This is giving me a headache, honestly. :p

### 2 responses to this post.

1. Hey John,

So you want to write arbitrarily large numbers in a finite amount of space? Kolmogorov complexity theory might have something to say about that.

First of all, there’s this counting argument: Suppose your square contains a finite number (call it N) of pixels, which can be “on” or “off” in any representation of a number. Then you can only ever represent 2^N distinct numbers in that square.
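That argument can be checked directly for a toy square (N = 4 pixels here is just an illustration value):

```python
from itertools import product

# Enumerate every on/off pattern of an N-pixel square: there are exactly
# 2**N of them, so no convention can represent more than 2**N distinct
# numbers in that square.

N = 4  # a toy 2 x 2 square of pixels
patterns = set(product((0, 1), repeat=N))
print(len(patterns))  # 16 == 2**4
```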

More interestingly, Kolmogorov complexity says that very few numbers have a “compact” representation such as 10^10^…^10. For most numbers, the simplest way to write them is to write out all their digits. If you change your number representation system, you change which numbers are simple, but the number of simple numbers doesn’t change: at most 2^k numbers can have a description shorter than k bits, whatever the system. This is again proven by a counting argument.
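The counting argument behind that claim is short: there are at most 2^k - 1 binary descriptions shorter than k bits, but 2^n strings of length n, so most strings can’t be compressed by much. A sketch (the values of n and c below are arbitrary illustration choices):

```python
# Sketch of the incompressibility count: descriptions of length < k bits
# number 2**0 + 2**1 + ... + 2**(k-1) = 2**k - 1. So among the 2**n strings
# of length n, fewer than a 2**(1-c) fraction can have a description that
# saves c or more bits -- the rest are (nearly) incompressible.

def descriptions_shorter_than(k: int) -> int:
    """Count of binary strings with length < k."""
    return 2**k - 1

n, c = 20, 8
# Descriptions of length <= n - c, i.e. of length < n - c + 1:
compressible = descriptions_shorter_than(n - c + 1)
fraction = compressible / 2**n
print(fraction)  # strictly less than 2**(1 - c)
```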

If you find this interesting, "An Introduction to Kolmogorov Complexity and Its Applications" by Li and Vitányi is one of the most interesting math/CS books I’ve read.