
No need to be sorry! It just means there's either a mathematical error on my part or an error in my ability to explain :) I just appreciate anyone taking the time to listen!

So basically, think about creating variables on a computer with some "infinite" amount of storage.

We can set a variable in a program as i = 10^10^100. Awesome. Next we can say j = i^i^(100i), then k = j^j^(100j), and so on, until we run out of symbols.
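Here's a tiny Python sketch of that naming game, just to make the pattern concrete. The actual values are far too large to ever evaluate, so each variable is kept as an expression string; the names step, names, and definitions are mine, purely for illustration:

    # Each new name is defined from the previous one by the same pattern,
    # mirroring i -> j -> k -> ... above.
    def step(prev_name, new_name):
        # new = prev ^ prev ^ (100 * prev), written out symbolically
        return f"{new_name} = {prev_name}^{prev_name}^(100*{prev_name})"

    definitions = ["i = 10^10^100"]   # the starting value from the comment
    names = "ijklmnopqrstuvwxyz"      # a deliberately small pool of symbols
    for prev, new in zip(names, names[1:]):
        definitions.append(step(prev, new))

    for d in definitions[:4]:
        print(d)
    # i = 10^10^100
    # j = i^i^(100*i)
    # k = j^j^(100*j)
    # l = k^k^(100*k)

Once the loop reaches the end of names there are simply no new symbols left to assign, which is the "run out of symbols" stopping point.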

So in this problem, basically think computer = universe, so we have all these bits that can each be 0 or 1. There is a finite (though practically incalculable) number of symbols that can be formed from those bits, like ASCII or Unicode characters. Basically, we repeat the above process until we run out of those symbols. (This might be similar in spirit to Rayo's number, but I'm not positive; I have to confess I don't quite understand the proofs I've seen for it so far.)
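To see why the number of symbols is finite, here's a quick back-of-the-envelope bound: if the universe-as-computer holds N bits, it has at most 2^N distinct states, so at most 2^N distinct symbols or definitions can ever be written down. The bit count below is just a placeholder of mine (a commonly quoted ~10^90 ballpark), not a figure from the problem itself:

    from math import log10

    BITS_IN_UNIVERSE = 10 ** 90   # hypothetical placeholder, not from the comment

    # With N bits there are at most 2^N distinct bit strings, so at most 2^N
    # distinct "symbols" -- enormous, but finite, which is why the naming
    # process above has to terminate eventually.
    max_symbols_log10 = BITS_IN_UNIVERSE * log10(2)
    print(f"at most roughly 10^{max_symbols_log10:.3g} distinct symbols")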