User blog:P進大好きbot/New Issue on Traditional Analyses

I recently noticed a problem in traditional analyses using OCFs (ordinal collapsing functions). I do not understand why, but many analysts somehow believe that OCFs are something like computable functions. If you recall the definition of a computable function, you will quickly see that OCFs are not computable: a computable function acts on natural numbers or finitary expressions, while an OCF takes uncountable ordinals as arguments, which cannot even serve as inputs to an algorithm.

Is this problematic? Yes, actually, it is. In order to analyse a computable function in terms of ordinals in FGH (the fast-growing hierarchy), we need to fix a recursive system of fundamental sequences for ordinals below a sufficiently large ordinal. Otherwise the analysis is nonsense, because FGH heavily depends on the choice of fundamental sequences.
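To recall why the choice matters, here is the standard definition of FGH, where \(\lambda[n]\) denotes the \(n\)-th term of the chosen fundamental sequence of a limit ordinal \(\lambda\):

```latex
\begin{align*}
  f_0(n) &= n+1\\
  f_{\alpha+1}(n) &= f_{\alpha}^n(n) \quad \text{($n$-fold iteration of $f_{\alpha}$)}\\
  f_{\lambda}(n) &= f_{\lambda[n]}(n) \quad \text{($\lambda$ a limit ordinal)}
\end{align*}
```

Already at \(\omega\), the two legitimate choices \(\omega[n] = n\) and \(\omega[n] = n^2\) yield the distinct functions \(f_n(n)\) and \(f_{n^2}(n)\), so the hierarchy is only determined once the system of fundamental sequences is fixed.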

Usually, we use FGH based on a "reasonable" recursive system of fundamental sequences. For example, we use the Wainer hierarchy when we deal with ordinals below \(\varepsilon_0\). In this case, we can harmlessly omit declaring the choice of a recursive system of fundamental sequences.
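For reference, one common presentation of the Wainer assignment for ordinals below \(\varepsilon_0\) written in Cantor normal form \(\gamma + \omega^{\beta}\) is:

```latex
\begin{align*}
  (\gamma + \omega^{\beta+1})[n] &= \gamma + \omega^{\beta} \cdot n\\
  (\gamma + \omega^{\lambda})[n] &= \gamma + \omega^{\lambda[n]} \quad \text{($\lambda$ a limit ordinal)}
\end{align*}
```

with \(\varepsilon_0\) itself usually assigned the tower \(\varepsilon_0[n] = \omega \uparrow\uparrow n\). The point is that this whole system is given by a terminating algorithm on expressions, which is exactly what "recursive system of fundamental sequences" means.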

On the other hand, if we deal with ordinals at or above \(\psi_0(\Omega_{\Omega_{\cdot_{\cdot_{\cdot}}}})\) with respect to extended Buchholz's OCF, there is no agreed-upon common choice of a recursive system of fundamental sequences. Since FGH is heavily based on the choice of fundamental sequences, analyses at that level without declaring a choice of fundamental sequences are nonsense.

Nevertheless, people tend to omit the declaration. It might be because they believe one of the following statements:
 * 1) Using UNOCF, Pi notation, catching function, BEAF, and so on can solve this problem.
 * 2) For any countable ordinals \(\alpha\) and \(\beta\), \(\alpha < \beta\) implies \(f_{\alpha} \ll f_{\beta}\) with respect to any system of fundamental sequences.
 * 3) There is a canonical choice of a recursive system of fundamental sequences if we work with a fixed OCF.

Statement 1 is obviously wrong, because those notations are ill-defined. For example, UNOCF is not an OCF at all. Even if we regard it as an arithmetic notation, it has no agreed-upon common system of fundamental sequences, no agreed-upon common recursive subset of standard forms, and no agreed-upon common well-defined correspondence to ordinals. Even if you personally define a system of fundamental sequences for it, the resulting hierarchy does not necessarily terminate, because it is not an ordinal notation.

Statement 2 is also wrong. Namely, larger ordinals do not necessarily correspond to faster-growing functions with respect to eventual domination. See the example here. You might think that you would never choose such a "strange" system of fundamental sequences, but you actually might if you are working with large ordinals, which are beyond your comprehension.

Statement 3 is also wrong. If you are working with an OCF beyond Buchholz's OCF, you usually need the comparison \(<\) of ordinals in order to define fundamental sequences. Since we need fundamental sequences for analyses, the system of fundamental sequences should be recursive. For this purpose, we need an algorithm to compute the comparison in terms of expressions of ordinals, and for that we need an ordinal notation associated to the OCF.
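To make the requirement concrete, here is a minimal sketch of such a comparison algorithm for the simplest case, ordinals below \(\varepsilon_0\). An ordinal in Cantor normal form \(\omega^{\beta_1} + \cdots + \omega^{\beta_k}\) with \(\beta_1 \geq \cdots \geq \beta_k\) is encoded as the list of its exponents, each encoded the same way (the encoding and the function name `cmp_ord` are my own, purely for illustration):

```python
def cmp_ord(a, b):
    """Compare two ordinals below epsilon_0 given in Cantor normal form.

    An ordinal w^b1 + ... + w^bk (b1 >= ... >= bk) is encoded as the
    list [b1, ..., bk] of its exponents; 0 is the empty list.
    Returns -1, 0, or 1.
    """
    for x, y in zip(a, b):
        c = cmp_ord(x, y)
        if c != 0:
            return c
    # One expression is a prefix of the other: the longer one is larger.
    return (len(a) > len(b)) - (len(a) < len(b))

ZERO = []       # 0
ONE = [ZERO]    # w^0 = 1
OMEGA = [ONE]   # w^1 = w
```

For example, `cmp_ord(OMEGA, [ONE, ZERO])` returns -1 since \(\omega < \omega + 1\). The whole point of this post is that above Buchholz's level, no analogue of this short function is publicly known for the "simplified" OCFs people actually use.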

If you do not understand the difference between an ordinal notation and an OCF, see the explanation here. An OCF does not necessarily yield an ordinal notation, because it is genuinely hard to encode the \(\in\)-relation on ordinals into a recursive binary relation on expressions.

For example, I guess that almost all authors of analyses with OCFs based on large cardinal axioms in this community do not have explicit algorithms to compute the comparison. This implies that they do not have explicit algorithms to compute fundamental sequences either. Without fundamental sequences, how could we use FGH? Should we "guess" fundamental sequences based on wrong estimations such as \(\psi_0(\psi_1(\psi_2(\psi_3(0)))) = \psi_0(\Omega_3)\) in Buchholz's OCF and \(\psi_{\Omega}(\psi_I(0)) = \psi_{\Omega_1}(\Omega_{\Omega_{\cdot_{\cdot_{\cdot}}}})\) in Rathjen's standard OCF based on the least weakly Mahlo cardinal?

I note that several OCFs actually admit associated ordinal notations. For example, Rathjen's standard OCF based on the least weakly Mahlo cardinal admits an associated ordinal notation. The construction heavily uses the complicated conditions in the definition of the OCF, and hence if we "simplify" the OCF, we lose the algorithm to compute the comparison. Since people tend to use "simplified" OCFs instead of standard ones in order to avoid the complexity, this remains problematic.

At least, hyp cos told me that he does not have an explicit algorithm to compute fundamental sequences for a simplified OCF based on the least weakly compact cardinal, even though he regards it as easy to understand. I respect him greatly, and regard him as one of the greatest analysts in this community. If even he, a great analyst, does not have an explicit algorithm to compute the comparison for a notation associated to a subjectively easy OCF, then it is reasonable to guess that few analysts in this community know an explicit algorithm to compute comparisons at that level.

Then how could we trust traditional analyses at that level? Here, I have a simple proposal: how about declaring a fixed recursive system of fundamental sequences whenever we deal with ordinals beyond \(\psi_0(\Omega_{\Omega_{\cdot_{\cdot_{\cdot}}}})\) with respect to extended Buchholz's OCF?
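As a proof of concept that such a declaration is cheap at low levels, the following sketch computes the Wainer fundamental sequences below \(\varepsilon_0\). An ordinal in Cantor normal form \(\omega^{\beta_1} + \cdots + \omega^{\beta_k}\) with \(\beta_1 \geq \cdots \geq \beta_k\) is encoded as the list of its exponents, each encoded the same way (the encoding and the function name `fund_seq` are my own, purely for illustration):

```python
def fund_seq(a, n):
    """Return the n-th term of the Wainer fundamental sequence of a
    limit ordinal below epsilon_0.

    An ordinal w^b1 + ... + w^bk (b1 >= ... >= bk) is encoded as the
    list [b1, ..., bk] of exponents; 0 is [].  The argument must be
    a limit ordinal, i.e. its last exponent must be nonzero.
    """
    head, b = a[:-1], a[-1]
    if b[-1] == []:
        # b = d + 1 is a successor: (g + w^(d+1))[n] = g + w^d * n
        return head + [b[:-1]] * n
    else:
        # b is a limit: (g + w^b)[n] = g + w^(b[n])
        return head + [fund_seq(b, n)]
```

For instance, encoding \(\omega\) as `[[[]]]`, `fund_seq([[[]]], 3)` yields `[[], [], []]`, i.e. the natural number \(3\). Declaring such an algorithm explicitly is exactly what I am asking analysts to do at higher levels.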

Of course, this includes a proposal not to use ill-defined stuff such as UNOCF, Pi notation, catching function, BEAF, and so on in analyses. This might be painful for several analysts, but I believe that it will make the community better. At least, stating something like "your OCF is weaker than UNOCF" is no different from stating "your array notation is weaker than BEAF" or "your number is smaller than ∞", and hence makes the community worse.