I'm going to give a rundown of my theory so far. It points toward a solution, but I do not have the solution yet. I've shared some of my observations, but I haven't explained them all. I was going to attach a PDF, which is essentially the working part of my theory, but I thought I'd hold off until I can make it more concrete. I'd estimate it represents half of the work necessary for this construction. Still, it seems very promising.

We're trying to show, in no uncertain terms, that the modified Bennett operators can be corrected into holomorphic semi-operators.

To be clear, we give a dictionary of our variables:

\[

\begin{align}

x &> e\\

y &> e\\

&\exp^{\circ s}_{y^{1/y}}(w) \,\,\text{is the repelling Schröder iteration about the repelling fixed point}\,\,y\text{, valid for}\,\,w>e\\

x[s]y &= \exp^{\circ s}_{y^{1/y}}\left(\log^{\circ s}_{y^{1/y}}(x) + y\right)\\

x\langle s\rangle_{\varphi}y &= \exp^{\circ s}_{y^{1/y}}\left(\log^{\circ s}_{y^{1/y}}(x) + y + \varphi\right)\\

\end{align}

\]
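For readers who want to experiment, here is a minimal numerical sketch of the repelling Schröder iteration, assuming the standard pull-in/push-out approximation: iterate \(\log_b\) toward the fixed point, apply the multiplier \(\lambda^s\) with \(\lambda = \log(y)\), then iterate \(\exp_b\) back out. The function name and parameters are mine, not from the paper.

```python
import math

def exp_frac(s, w, y, n=40):
    """Approximate exp_b^{circ s}(w) for b = y**(1/y), using the repelling
    Schroder linearization about the fixed point y (hypothetical sketch).
    Pull w toward y with n iterated base-b logs, apply the multiplier
    lambda**s, then push back out with n base-b exponentials."""
    lnb = math.log(y) / y        # log(b) for b = y**(1/y)
    lam = math.log(y)            # multiplier at the fixed point (> 1 for y > e)
    z = w
    for _ in range(n):           # contract toward the fixed point
        z = math.log(z) / lnb
    z = y + lam**s * (z - y)     # fractional step in Schroder coordinates
    for _ in range(n):           # expand back out
        z = math.exp(lnb * z)
    return z
```

With this, \(x[s]y\) would heuristically be `exp_frac(s, exp_frac(-s, x, y) + y, y)`, valid only where the pulled-back arguments stay in the region \(w>e\).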

Where the ultimate goal is to describe the implicit function \(\varphi(s,x,y)\) such that:

\[

x \langle s \rangle y = x\langle s\rangle_{\varphi}y\\

\]

And this operator satisfies the Goodstein equations:

\[

x \langle s \rangle \left(x \langle s+1\rangle y\right) = x \langle s+1\rangle (y+1)\\

\]
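At the integer ranks, where \(\langle 0\rangle\) is addition, \(\langle 1\rangle\) is multiplication, and \(\langle 2\rangle\) is exponentiation, the Goodstein equation reduces to familiar identities, which are easy to sanity-check numerically:

```python
# Goodstein's equation x<s>(x<s+1>y) = x<s+1>(y+1) at integer ranks:
# s = 0 ties addition to multiplication, s = 1 ties multiplication
# to exponentiation.
x, y = 3.0, 4.0
assert x + (x * y) == x * (y + 1)    # s = 0
assert x * (x ** y) == x ** (y + 1)  # s = 1
```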

Obviously this is a very difficult problem. I have talked with MphLee through PMs a bit about this, and he has helped tremendously. But I'm still not there. What I am very close to, though, is a rough draft of a solution. I have many ways of describing what the solution should look like, and how \(\varphi\) looks. From this I've made a few observations crucial to the study.

There's an Abel identity integral to this solution.

Let's write

\[

f(s,y) = x \langle s+1\rangle_\varphi^{-1}y\\

\]

Which is the inverse of \(x \langle s+1\rangle_\varphi y\) in \(y\). The original modified Bennett operators satisfy:

\[

f(s,x[s]y) = f(s,y) + 1 + o(y^{\epsilon}) \,\,\text{for all}\,\,\epsilon>0\\

\]

The error grows logarithmically, if you are curious. The exact solution is given as:

\[

f(s,x\langle s\rangle y) = f(s,y) + 1\\

\]

For this \(f\) and some exact function \(\varphi\).

MphLee's family is normal.

There's a very specific family of functions I defined based on MphLee's comments and descriptions. In many ways, it's a family of functions \(x \langle s \rangle_\varphi y\) which are in the neighborhood of solving the above equations. This reduces to functions \(g(s,y) = x \langle s \rangle_\varphi y\) which satisfy the crucial identity:

\[

g^{-1}(s+1,g(s,y)) = g^{-1}(s+1,y) + 1 + o(y^\epsilon)\\

\]

While additionally interpolating addition, multiplication and exponentiation. The most central result I've been able to show is that the modified Bennett operators \(g(s,y) = x[s]y\) satisfy this identity, and by proxy, elements in a neighborhood of them satisfy it as well. I've come to call this MphLee's family, as it relates closely to his study of rank operators, and to semi-operators from his lens (at least functorially).

This family is normal if you restrict to \(y>Y\): it is locally bounded everywhere, so by Montel's theorem every sequence of functions in MphLee's family has a locally uniformly convergent subsequence. And here is where things kick into high gear. The family is compact.

From here we introduce the dark element. The solution exists somewhere in this family. We just need to find a working iteration within this theory.... Here it is.

We are going to make a change of variables, so that:

\[

x \langle s+1 \rangle_\varphi \alpha_\varphi(s,y) = y\\

\]

And we are going to look at the value \(\varphi\) such that:

\[

\alpha_\varphi(s,x\langle s\rangle_\varphi y) - \alpha_\varphi(s,y) - 1 = 0\\

\]

This value \(\varphi\) always exists. From this, we know that \(\alpha_\varphi\) is the Abel function of \(x \langle s \rangle_\varphi y\). What this tells us next is the very interesting part. Recalling that \(\varphi\) moves with \(s\) and \(y\), we can safely conclude that:

\[

\varphi(s,x\langle s \rangle_\varphi y) = \varphi(s,y)\\

\]

That is, \(\varphi\) is invariant under the operation: applying \(x \langle s \rangle_\varphi\) to \(y\) does not change the correction term. This means, if I write out our expressions in more detail:

\[

\begin{align}

A(s,y) &= \alpha_{\varphi(s,y)}(s,y)\\

x \langle s \rangle y &= x \langle s \rangle_{\varphi(s,y)} y\\

\end{align}

\]

Then:

\[

\begin{align}

A(s,x \langle s \rangle y) &= A(s,y) + 1\\

A(0,y) &= \frac{y}{x}\\

A(1,y) &= \frac{\log(y)}{\log(x)}\\

x \langle 0 \rangle y &= x+y\\

x \langle 1 \rangle y &= x \cdot y\\

\end{align}

\]
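At the boundary ranks \(s=0,1\) the Abel equation \(A(s, x\langle s\rangle y) = A(s,y)+1\) can be verified directly:

```python
import math

# Boundary-rank Abel functions: A(0,t) = t/x for addition,
# A(1,t) = log(t)/log(x) for multiplication.
x, y = 3.0, 7.0
A0 = lambda t: t / x
A1 = lambda t: math.log(t) / math.log(x)
assert abs(A0(x + y) - (A0(y) + 1)) < 1e-12  # A(0, x<0>y) = A(0,y) + 1
assert abs(A1(x * y) - (A1(y) + 1)) < 1e-12  # A(1, x<1>y) = A(1,y) + 1
```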

These also satisfy Goodstein's equation...

I am not going to release the code or the paper yet; this is just a rough run-through. But it provides the solution to our problem, full stop. It is analytic at the boundary values \(s=0,1,2\), and converges relatively fast. I am grinding away at trying to prove all of this rigorously now. But the code is converging!! Albeit much slower than I hoped. This is definitely because I am using a Newton root finder to find the value of \(\varphi\), which is cheating, I know. But it confirms that such a \(\varphi\) exists.
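For concreteness, the kind of Newton step I mean looks like the following, with a finite-difference derivative. The residual \(F\) here is a stand-in for the Abel defect \(\alpha_\varphi(s,x\langle s\rangle_\varphi y)-\alpha_\varphi(s,y)-1\), which I am not reproducing here; the function name and tolerances are mine.

```python
def newton_phi(F, phi0, h=1e-7, tol=1e-12, max_iter=100):
    """Hypothetical sketch of a Newton root finder on the Abel defect F:
    plain Newton iteration with a central finite-difference derivative.
    Each evaluation of the real F is expensive, which is why this is slow."""
    phi = phi0
    for _ in range(max_iter):
        val = F(phi)
        if abs(val) < tol:
            return phi
        dF = (F(phi + h) - F(phi - h)) / (2 * h)  # central difference
        phi -= val / dF                            # Newton step
    return phi
```

As a toy check, `newton_phi(lambda t: t * t - 2.0, 1.0)` converges to \(\sqrt 2\).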

My code so far is producing correct values, but it needs a good amount of tweaking. I haven't been able to produce any graphs yet, but I will in the future. The code is just too slow at the moment to be practical, so I have to find a more efficient way to produce \(\varphi\). We shall see soon!!

I just thought I'd give a progress update. I should have a working draft of the paper and the code in a month's time (hopefully not later). This is absolutely fascinating.

Let the old thread die, and keep this thread as the center of discussion. I have answers to any and all questions and I'd be happy to explain more. This does get a little funky and complicated. I know I'm not there yet; I just have some strong numerical evidence, and a rough explanation of the theory. I hope to remedy this by mid-July. We shall see!!

Regards, James.

I thought I'd add that we are solving a change of variables in \(\varphi_1,\varphi_2,\varphi_3\) such that all we need to do is solve for one variable. This boils down to finding a constant \(\varphi\) such that:

\[

x \langle s \rangle_\varphi y = x \langle s+1\rangle_{\varphi} \left(\alpha_\varphi(s,y) + 1\right)\\

\]

Where the solution to this equation satisfies \(\varphi(s,x \langle s \rangle_\varphi y) = \varphi(s,y)\)... I might not have made this clear: we are solving an inherently iterative procedure. \(\varphi\) becomes the fixed point of an iteration.
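Since \(\varphi\) is the fixed point of an iteration, the natural alternative to root finding is direct fixed-point iteration, which converges whenever the update map is a contraction near the fixed point. A minimal sketch, with `Phi` standing in for the actual update \(\varphi \mapsto \varphi'\), which I haven't specified here:

```python
def iterate_phi(Phi, phi0, tol=1e-12, max_iter=500):
    """Hypothetical sketch: iterate phi -> Phi(phi) until the step size
    drops below tol.  Converges when Phi is a contraction near the fixed
    point; otherwise a root finder on Phi(phi) - phi is still needed."""
    phi = phi0
    for _ in range(max_iter):
        nxt = Phi(phi)
        if abs(nxt - phi) < tol:
            return nxt
        phi = nxt
    raise RuntimeError("fixed-point iteration did not converge")
```

As a toy check, `iterate_phi(lambda t: 0.5 * (t + 2.0 / t), 1.0)` (the Babylonian map, a contraction near its fixed point) converges to \(\sqrt 2\).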
