Hacker News

While I was able to follow the Aaronson article with no particular trouble, I can't seem to grok this. I'd be very grateful to anyone who could give a slightly lower level explanation.


So, this is about trees labelled with positive integers at their vertices. (To be formal: let's call them "i-trees", a name I just made up; then an i-tree consists of a positive integer "root" together with a possibly-empty list of i-tree "children", and is of finite size.)

Here's a way of classifying some i-trees as "smaller" than others. Say that the following operations make an i-tree smaller: (1) Reducing one of the integers that appear in it. (2) Deleting any subtree. (3) Replacing a node with one of its subtrees. And say that one i-tree is smaller than another if you can transform the latter into the former by a sequence of these basic steps, and not otherwise.
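A concrete sketch in Python may help (the names ITree and embeds are mine, not standard; this is the "homeomorphic embedding" order from Kruskal's tree theorem, assuming the children form an ordered list):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ITree:
    root: int              # positive integer label at this vertex
    children: tuple = ()   # tuple of ITree

def embeds(t, s):
    """True iff t is smaller-than-or-equal-to s: t can be obtained from s
    by label reductions, subtree deletions, and node-to-subtree replacements."""
    # t may sit entirely inside one child of s (replacing s by that subtree)
    if any(embeds(t, c) for c in s.children):
        return True
    # otherwise t's root must map onto s's root, with the label only reduced
    if t.root > s.root:
        return False
    # and t's children must embed, in order, into a subsequence of s's
    # children (greedy earliest-match is sound for subsequence matching)
    i = 0
    for tc in t.children:
        while i < len(s.children) and not embeds(tc, s.children[i]):
            i += 1
        if i == len(s.children):
            return False
        i += 1
    return True
```

For example, a single node labelled 1 embeds into any i-tree, since you can delete everything else and reduce the remaining label.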

Finally, for any positive integer n, say that an n-tree is an i-tree in which all those positive integers are at most n.

It turns out that the smaller-than relation that I just described is a "well-quasi-order" (don't blame me, I didn't make up the terminology) on i-trees, which means: If you take any infinite sequence of i-trees -- T1, T2, T3, ... -- then there must be i,j with i<j and Ti smaller-than-or-equal-to Tj. This probably sounds like a bizarre and arbitrary notion, but it's useful because when a set is well-quasi-ordered you can do a kind of mathematical induction on it. (Mathematical induction is the thing that's usually described as "if something's true for 1, and true for n+1 whenever it's true for n, then it's true for all positive integers" but generalizes to something like "if something's true about any object provided it's true about all simpler objects, then it's true for all objects"; what's needed for this to work is approximately for "simpler" to be a well-quasi-order. I am skipping many details, but this is just a motivational aside anyway.)

There's a sort of finite version of that theorem, which goes like this: For any n, if you consider n-trees rather than unrestricted i-trees, then there's a fixed N such that whenever you have T1, ..., TN with each Ti having at most i vertices, there are two trees for which the earlier one is smaller-than-or-equal-to the later one. (The size restriction is essential: without it, a star with N children, then a star with N-1 children, and so on, gives an arbitrarily long sequence with no such pair.)

How big is N, given n? That's what the TREE function measures: TREE(n) is the length of the longest such sequence with no smaller-than-or-equal pair, so N = TREE(n) + 1. It's nice and small for n=1 and n=2 (TREE(1) = 1 and TREE(2) = 3) but becomes absolutely enormous for n=3.

The very rapid growth of the TREE function is related to another cool fact: In a certain sense that I'm not going to bother making formal here, you cannot prove the theorem without making essential use of infinite quantities. (One can even quantify, kinda, just how much infinity is required.)

If you happen to like this stuff, here's another theorem (Goodstein's theorem, if you want to look it up) with the same sort of flavour. You already know, I take it, how to write a number in base n: one way to say it is that you write it as a sum of terms of the form [smaller than n] times n^k. Well, suppose you do that and then do the same thing to all the exponents, and to all the exponents that that produces, and so on, until you've expressed your number in terms of (1) numbers smaller than n and (2) the operation "a times n ^ b". Let's say that you've then written your number "in superbase n".
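For concreteness, here's a little Python that writes a number "in superbase n" as a string (the function name follows the comment's terminology; the rendering format is my own choice):

```python
def superbase(m, n):
    """Write m in 'superbase' (hereditary base) n: only digits smaller
    than n, the base n itself, and the operation a * n ^ b, where every
    exponent b is itself written in superbase n."""
    if m < n:                  # a bare digit needs no further expansion
        return str(m)
    terms, k = [], 0
    while m:                   # peel off base-n digits, low to high
        d = m % n
        if d:
            terms.append(f"{d}*{n}^({superbase(k, n)})")
        m //= n
        k += 1
    return " + ".join(reversed(terms))
```

For instance, superbase(19, 2) expresses 19 = 2^4 + 2 + 1, with the exponent 4 itself expanded as 2^2, and that 2 expanded as 2^1.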

OK. Now, consider the following process. Start with any number. Write it in superbase 2. Replace all the 2s with 3s (so you now have something in superbase 3). Subtract 1; if necessary, fix up the superbase-3-ness. Replace all the 3s with 4s. Subtract 1 and fix up. Replace all the 4s with 5s. And so on.
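The process above can be sketched directly (bump_base and goodstein are names I made up; the base change works digit by digit, rewriting exponents recursively, which amounts to "replace all the n's with n+1's"):

```python
def bump_base(m, b):
    """Rewrite m from hereditary base b to base b+1, i.e. replace
    every b in the superbase-b expression of m with b+1."""
    if m < b:                          # bare digits are unchanged
        return m
    total, k = 0, 0
    while m:
        d = m % b
        if d:
            total += d * (b + 1) ** bump_base(k, b)
        m //= b
        k += 1
    return total

def goodstein(m, steps):
    """First values of the Goodstein sequence starting at m:
    bump the base, subtract 1, repeat (stopping at 0)."""
    seq, b = [m], 2
    for _ in range(steps):
        if m == 0:
            break
        m = bump_base(m, b) - 1
        b += 1
        seq.append(m)
    return seq
```

Starting from 3 the sequence is short: 3, 3, 3, 2, 1, 0. Starting from 4 it already runs for vastly more steps than you could ever iterate, so don't try to watch it terminate.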

Typically, the "replace n with n+1" operation increases the number enormously, and you might think that your numbers will increase without limit. But, as it happens, you always get down to 0 eventually -- but (1) the number of steps it takes to get there grows outrageously fast and (2) again, you can't prove this without (in some sense) using infinite quantities.

(For any readers who know about infinite ordinal numbers, here's the one-line proof: At each step, when you have the number written in superbase k, replace all the k's with omegas. Then the "add 1 to the superbase" steps do nothing, and it's easy to see that the "subtract 1" steps always strictly decrease your number. But the ordinals are well-founded, so you can't strictly decrease infinitely many times. QED.)
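That ordinal assignment can even be simulated mechanically. Here's a sketch, assuming Cantor normal form is represented as nested tuples of (exponent, coefficient) pairs, largest exponent first; with that representation, Python's lexicographic tuple comparison happens to agree with the ordinal order:

```python
def ordinal(m, b):
    """Read m in hereditary base b and replace every b with omega.
    Returns the Cantor normal form as a tuple of (exponent, coefficient)
    pairs, largest exponent first; exponents are again such tuples,
    and the ordinal 0 is the empty tuple ()."""
    if m == 0:
        return ()
    terms, k = [], 0
    while m:
        d = m % b
        if d:
            terms.append((ordinal(k, b), d))
        m //= b
        k += 1
    return tuple(reversed(terms))
```

The Goodstein sequence starting at 4 begins 4, 26, 41 (read in bases 2, 3, 4), and the assigned ordinals strictly decrease, e.g. ordinal(4, 2) = omega^omega beats ordinal(26, 3) = omega^2*2 + omega*2 + 2.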


yes, can someone explain? please and thank you!



