Ok, I'm glad I was clear enough and you were able to master the idea so quickly... an idea which is pretty simple in the end. It's just the usual limit trick, but on a higher level, with all of its pros and cons.

1) Now the problem I see coming: \(f(y,s)\) seems not really easy to invert... also you note the domain of \(f^{-1}\). Rightly so. In my toy model we are working in groups, everything is invertible and we need not worry about domains. But in this case I feel that this becomes a pretty big problem. I experimented a lot back in the day with iterating the sub-function operator. Obviously, the more you iterate it on something non-surjective, the more the domain shrinks...

1.5) I haven't tried to invert your construction nor to compute the algebra, but inverting in the \(y\) sounds like we need the Lambert function \(W\) somewhere. Also... I tried to apply the limit formula to the easier Bennett family... and I totally don't see what should look polynomial... I still can't see anything behaving as a polynomial. Polynomial in the \(n\)? I'm truly lost xD

2) The second point... I get worried when you say it converges quickly... when iterating the operator \(\Sigma_s(f)=fsf^{-1}\) we have that \(s\) is a fixed point; everyone here is aware of that, and Tommy also made some comments on it as a carrier of potential problems. It is a fixed point even when \(s\) is not chosen to be the successor. In fact... we do not know how strongly it is "attracting".

Some functions quickly converge (and I mean after a few iterations), on a shrinking-but-stabilizing domain, to the seed function \(s\). I believe that the least number of iterations that sends the resulting function \(\Sigma^n_s(f)\) locally into a small neighborhood of \(s\) - I'm thinking here informally of some kind of pointwise convergence over a function space - should be called the local rank. It means that locally a function looks like the \(n\)-th solution to the equation \(xs=hx\), where \(h\) is an \((n-1)\)-th solution.
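To make the "falls onto the seed after a few iterations" phenomenon concrete, here is a minimal numeric sketch (the helper names `invert` and `Sigma` are mine, not from the thread): with \(s\) the successor and \(f(x)=2x\), two applications of \(\Sigma_s\) already give back \(s\) on the whole line.

```python
def invert(f, y, lo=-1e6, hi=1e6, tol=1e-9):
    """Numerically invert an increasing function f by bisection on a fixed bracket."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if f(mid) < y:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def Sigma(f, s):
    """The conjugation operator Sigma_s(f) = f ∘ s ∘ f^{-1}."""
    return lambda x: f(s(invert(f, x)))

s = lambda x: x + 1          # seed: the successor
f = lambda x: 2 * x          # an arbitrary increasing function
g = Sigma(f, s)              # g(x) = 2*(x/2 + 1) = x + 2
h = Sigma(g, s)              # h(x) = x + 1: back at the seed after two steps
```

Of course this \(f\) is very tame (bijective on the reals), so it says nothing about the domain-shrinking issue above; it only illustrates how small the "local rank" can be.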

To sum it up: the operation of "shifting and sub-functioning" does indeed have as its fixed points solutions to the Goodstein equation - maybe only locally, if we extend to non-bijective functions. The problem is that \((s,s,s,s,\dots)\) is also a solution, where \(s\) is the seed of the family. How strongly does it attract arbitrary non-Goodstein sequences?

I ask that because I tried on Desmos to compute the limit of

\[\lim (\Sigma_S \circ S^*)^n({\bf bennet})=\lim (\Sigma_S \circ S^*)^n(\oplus)\]

I know, it's not what you are using, but it seems to converge solidly to the successor after just \(3\) iterations. By solidly I mean that I added a slider for the base \(b\) in \(b\oplus_s x= \exp^s (\ln^s b +\ln^s x)\). After the third iteration, the slider does not perturb the shape of the graph of \( \left( (\Sigma_S \circ S^*)^n (\oplus) \right)_3\) anymore, and the function lies in the strip \(x+1+\rho\) for \(x>N\) and \(\rho \in [-\epsilon,\epsilon]\).
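For reference, the Bennett-style operations \(b\oplus_s x=\exp^s(\ln^s b+\ln^s x)\) used in that experiment can be sketched in a few lines (function names are mine; the domain shrinks with the rank, since the iterated logs must stay defined):

```python
import math

def exp_iter(x, n):
    """Apply exp n times."""
    for _ in range(n):
        x = math.exp(x)
    return x

def log_iter(x, n):
    """Apply ln n times (inputs must be large enough for higher ranks)."""
    for _ in range(n):
        x = math.log(x)
    return x

def bennett(b, x, s):
    """b ⊕_s x = exp^s(ln^s b + ln^s x)."""
    return exp_iter(log_iter(b, s) + log_iter(x, s), s)
```

Rank \(0\) is addition, rank \(1\) is multiplication (\(e^{\ln b+\ln x}=bx\)), and rank \(2\) is \(e^{\ln b\cdot\ln x}\), matching the operation on the 0-th row of the grid below.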

So... how strong is the basin of attraction of the trivial Goodstein sequence? How to escape it? I know, my corollary was meant to prove that if we have convergence of a non-constant sequence of operations, then the limit is a non-constant solution. Tbh I need to check the argument again... I believe it is true in the theory over groups... but outside it... idk.

A VISUAL EXPLANATION OF THE METHOD

If someone finds it hard to visualize what is happening, here is the picture: we start with a continuous family of binary operations, e.g. Bennett's or James's variant \(\exp^s_{y^{1/y}}(\ln^s_{y^{1/y}}(x)+y )\).

Remember that it is a continuous one this time.

Given an operator \(b* y\) we can ask for its super-operator \(*^+\) and its sub-operator \(*^-\). Call \(*^+\) the solution of the equation \(b*^+(y+1)=b*(b*^+ y)\), and call \(*^-\) the solution of \(b*(y+1)=b*^-(b*y)\). Assuming we can iterate this procedure, we obtain a new family \(*^0=*\), \(*^{n+1}=(*^n)^+\) and \(*^{n-1}=(*^n)^-\).
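As a concrete anchor (just a toy check, nothing deeper): with \(*\) being multiplication, the super-operator \(*^+\) is exponentiation and the sub-operator \(*^-\) is addition, and both defining equations can be verified directly:

```python
# With * = multiplication, *^+ = exponentiation, *^- = addition;
# check both defining equations on a few sample points.
def op(b, y):      return b * y        # b * y
def op_sup(b, y):  return b ** y       # b *^+ y
def op_sub(b, y):  return b + y        # b *^- y

for b in (2, 3, 5):
    for y in (1, 2, 4):
        # super: b *^+ (y+1) = b * (b *^+ y)
        assert op_sup(b, y + 1) == op(b, op_sup(b, y))
        # sub:   b * (y+1) = b *^- (b * y)
        assert op(b, y + 1) == op_sub(b, op(b, y))
```

This is one rung of the familiar ladder addition/multiplication/exponentiation; the grid below is just this ladder grown in both directions from every operation of the continuous family.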

Here the vertical links are the superfunction/subfunction relations. Let's apply this to our initial continuous family, e.g. Bennett's one.

We obtain a grid of operations. Traveling along the columns is the "iterate"/"take subfunction" process.

But traveling in the horizontal direction is NOT taking the "Bennett process" \(b*y \mapsto e^{\ln b * \ln y}\) if we are not moving along the 0-th row. This means that the grid is not commutative (horizontal moves don't commute with vertical moves). Every column is a discrete sequence that satisfies the Goodstein equation, but only the continuous family at the 0-th row satisfies Bennett's equation.

The grid itself seems very interesting in its own right; the non-commutativity implies that it can be extended in an infinite number of directions. A first study of this kind was initiated in December 2020/May 2021 by Jaramillo (Hyperoperations in exponential fields). Giving names to the nodes of the grid makes it look like this

Now we try to visualize the limit formula. We want to find a continuous family of binary operations \(g_{s}(b,y)\) s.t. \[g_{s+1}(b,y+1)=g_{s}(b,g_{s+1}(b,y))\]

We want to obtain that as a fixed point of a family \(\lim_{n\to \infty}g_{n,s}(b,y)=g_s(b,y)\). To do that we need a first approximation \(g_{0,s}(b,y)\).

We want \[g_{n,s+1}(b,y+1)=g_{n+1,s}(b, g_{n,s+1}(b,y))\]

Clearly \(g_{n+1,s}(b,y)\) is the subfunction of \(g_{n,s+1}(b,y)\) \[\boxed{g_{n+1,s}=g_{n,s+1}^-}\]

If we set \(g_{0,s}=\oplus_s^0\), i.e. as the first approximation we use Bennett, we get \(g_{1,s}=\oplus^{-1}_{s+1}\), \(g_{2,s}=\oplus^{-2}_{s+2}\) and, in general, \(g_{n,s}=\oplus^{-n}_{s+n}\).
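One subfunction step can also be checked numerically (helper names `invert`, `sub_op` are mine): taking \(b\oplus_2 y=e^{\ln b\ln y}\), the subfunction is \(f_b\circ S\circ f_b^{-1}\) with \(f_b(y)=b\oplus_2 y\), and the defining equation \(b\oplus_2(y+1)=b\oplus_2^-(b\oplus_2 y)\) holds up to the inversion tolerance:

```python
import math

def invert(f, y, lo=1e-6, hi=1e6, tol=1e-9):
    """Numerically invert an increasing function f by bisection."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if f(mid) < y:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def b2(b, y):
    """b ⊕_2 y = exp(ln b · ln y), rank 2 of Bennett's family."""
    return math.exp(math.log(b) * math.log(y))

def sub_op(b, u):
    """b ⊕_2^- u, computed as f_b(f_b^{-1}(u) + 1)."""
    f = lambda y: b2(b, y)
    return f(invert(f, u) + 1)

# check b ⊕_2 (y+1) == b ⊕_2^- (b ⊕_2 y) at a sample point
b, y = 3.0, 2.0
lhs = b2(b, y + 1)
rhs = sub_op(b, b2(b, y))
```

This is only a pointwise numeric check on a convenient domain (\(b,y>1\), where \(f_b\) is increasing); it does not address how the domains degrade as \(n\) grows, which is exactly problem 1) above.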

As we can see, we are iterating "shift and take the subfunction". We expect the limit to be a fixed point, hence a continuous solution of the Goodstein equation.

\[\boxed{g_{s}=\lim g_{n,s}=\lim_{n\to\infty}(\Sigma_S\circ S^*)^n(\oplus)_s=\lim \oplus_{s+n}^{-n}}\]


MSE MphLee

Mother Law \((\sigma+1)0=\sigma (\sigma+1)\)

S Law \(\bigcirc_f^{\lambda}\square_f^{\lambda^+}(g)=\square_g^{\lambda}\bigcirc_g^{\lambda^+}(f)\)