(05/03/2022, 12:16 PM) tommy1729 Wrote:
(03/23/2022, 03:19 AM) JmsNxn Wrote: Hey everyone! Some more info dumps!
I haven't talked too much about holomorphic semi-operators for a long time. For this brief exposition I'm going to denote the following:
\[
\begin{align}
x\,<0>\,y &= x+y\\
x\,<1>\,y &= x\cdot y\\
x\,<2>\,y &= x^y\\
\end{align}
\]
Where we have the identity \(x<k>(x<k+1>y) = x<k+1>(y+1)\). Good ol' fashioned hyper-operators.
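For the base cases \(k=0,1\) listed above, the identity \(x<k>(x<k+1>y) = x<k+1>(y+1)\) is just ordinary arithmetic, and can be sanity-checked numerically (my own illustration, not from the thread):

```python
# Check x <k> (x <k+1> y) = x <k+1> (y+1) for k = 0, 1,
# using the base cases <0> = +, <1> = *, <2> = ^.
x, y = 3.0, 5.0

# k = 0:  x + (x * y) == x * (y + 1)
assert abs((x + x * y) - x * (y + 1)) < 1e-12

# k = 1:  x * (x ** y) == x ** (y + 1)
assert abs((x * x ** y) - x ** (y + 1)) < 1e-9

print("identity holds for k = 0, 1")
```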
...
First note :
In my notebook - and maybe posted here too - I found that the identity \((x<k+1>y)<k>x = x<k+1>(y+1)\) is consistent with
\[
\begin{align}
x\,<0>\,y &= x+y\\
x\,<1>\,y &= x\cdot y\\
x\,<2>\,y &= x^y\\
\end{align}
\]
This seems like a nicer choice, no?
Why not use this one? Because it is slower?
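For \(k=0,1\), Tommy's variant \((x<k+1>y)<k>x = x<k+1>(y+1)\) reduces to the same arithmetic facts as the other identity, since addition and multiplication are commutative; a quick check (my own illustration):

```python
# Check the left-style variant (x <k+1> y) <k> x = x <k+1> (y+1) for k = 0, 1.
x, y = 3.0, 5.0

# k = 0:  (x * y) + x == x * (y + 1)
assert abs((x * y + x) - x * (y + 1)) < 1e-12

# k = 1:  (x ** y) * x == x ** (y + 1)
assert abs((x ** y * x) - x ** (y + 1)) < 1e-9

print("variant identity holds for k = 0, 1")
```

The two identities only diverge at \(k \ge 2\): the original recursion builds right-associated power towers, while this variant builds left-associated ones, which is the slower-growing hierarchy.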
Second note: I'm going to ignore holomorphy for now because I do not believe that claim. I might explain later (when I have more time), and maybe I already did in the past.
Third note :
Which fractional iteration for exp and log? There are many, and they do not agree at the 2 real fixpoints, or (in the case of base \(e^{1/e}\)) at the single real fixpoint, where the iteration is not analytic!
These problems and choices are not simultaneously addressed, picked, and motivated.
Fourth note : why non-commutative ?
Fifth note :
you are basically looking for a function \(f_1(a,b,s)\) and "find" the solution \(f_2(a,b,s,f_3(a,b,s))\), where \(f_3(a,b,s)\) is unknown, undefined, and not proven analytic.
that feels like solving the quintic polynomial as \(\exp(a_0) + f(a_0,a_2,a_3,a_4,a_5)\) for some unknown \(f\)...
forgive my parody.
I could continue, but I respect you.
regards
tommy1729
1) I don't want left-associative; who wants left-associative...
2) It is certainly holomorphic. I have infinite choices; I'm just looking for the correct choice at the moment. I have a holomorphic expansion in my hands, of which you can calculate the Taylor series...
3) I explained which choice of iteration I'm using, but I did play pretty fast and loose. We don't have to use beta (though it's certainly preferable; since I'm sticking to a rough outline at the moment, I'll give the rough layout). I initially thought it'd be easier--but it's computationally exhausting, so instead I chose the alternate. For example, take \(\sqrt{2}\); we have two tetration functions: one which tends to \(2\) at infinity, and one which tends to infinity. For simplicity I'll stick to the real line, which is really the only place this comparison is viable at the moment.
\[
\exp^{\circ s}(x)\\
\]
For \(x \in (-\infty,4)\), use the iteration about \(2\). Similarly, for \(x \in (2,\infty)\), use the iteration about \(4\). These functions agree on the interval \((2,4)\) (up to small discrepancies), and therefore there's an iteration \(\exp^{\circ s}(x)\) for \(x \in \mathbb{R}\) (of course, depending on \(s\), we may have a branching problem). There. There's my iteration. It works perfectly fine, for now. The same principle holds for all \(y > 1\). Even when \(\eta = e^{1/e}\), you use the unbounded or the bounded solution; it doesn't matter, it still produces a holomorphic iteration... up to the branching problem at both fixed points, which I'm not worried about momentarily.
The trouble is when you start talking about \(s\): both of these iterations have different periods. I'm not too worried about that at this moment. This is also why the \(\beta\) method is superior in my eyes: you can fix the period for both iterations.
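The two fixed points in the \(\sqrt{2}\) example can be verified directly: for base \(b = y^{1/y}\) with \(y = 4\), we get \(b = \sqrt{2}\), and the map \(x \mapsto b^x\) fixes both \(2\) and \(4\). A small numeric sketch (mine, not from the post):

```python
import math

# For base b = sqrt(2) (i.e. y = 4, b = y**(1/y)), the map x -> b**x
# has two real fixed points: x = 2 (attracting) and x = 4 (repelling).
b = math.sqrt(2)
assert abs(b ** 2 - 2) < 1e-12
assert abs(b ** 4 - 4) < 1e-12

# Iterating from x0 < 4 converges to the attracting fixed point 2;
# this is the tetration-like solution that tends to 2 at infinity.
x = 3.9
for _ in range(200):
    x = b ** x
print(x)  # close to 2.0
```

The multiplier at \(2\) is \(\log(\sqrt{2}) \cdot 2 = \log 2 \approx 0.693 < 1\) (attracting), while at \(4\) it is \(2\log 2 > 1\) (repelling), which is why the two local iterations behave so differently.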
I understand your worry about how haphazardly I'm writing the iteration of the exponential, but I'm not too worried at this point in time. I need to write the details out much better, but the core is still there. I'm concerning myself with \(\varphi\) first, and then I'll concern myself with a perfectly constructed holomorphic function:
\[
\exp^{\circ s}(x)\\
\]
I'm not too concerned about the branching problem at \(2\) and at \(4\) right now; this would be the easy part compared to \(\varphi\), as it would just imply that there are branches in \(x,y,s\) in the expression \(x <s> y\), which I'm fully aware of. I would not be surprised if they disappear though. Honestly, if you stick to the repelling case primarily, and focus solely on \(x > \eta\) and \(y \ge e\), you will never encounter this problem--so long as you stick to the repelling iteration.
This is what I mean when I say I am not looking forward to the case \(y = e\) and neighborhoods of it. I wouldn't be surprised if we end up with one solution \(x <s> y\) for \(y \ge e\) and another solution \(x <s> y\) for \(1/e \le y \le e\), and for \(y < 1/e\) we'd get some unholy complex mess of a solution.
But if it makes you feel better.
Assume that \(y > e\), and assume \(\exp^{\circ s}_{y^{1/y}}(x)\) is the repelling iteration about the repelling fixed point \(y\), which is holomorphic for \(x > y\), with a branching problem at \(y\). Then we're only concerned with checking the equation when \(x > y+1\). I think this really castrates the problem though, as we're going to have to talk about branching at some point. Might as well get a head start... So I'm doing that by only referring to local holomorphy, which is always true. My iteration \(\exp^{\circ s}(x)\) is locally holomorphic everywhere, except at \(x = 4\).
You can make this better too: instead of \(x > y\), all you need is \(\log^{\circ s}(x) > 0\) for \(0 \le \Re(s) \le 2\), which essentially reduces to \(x > y^{1/y}\). So if we're setting \(y \ge e\), let's stick to \(x > \eta\) and \(y > e\). Then the expression:
\[
\exp^{\circ s}_{y^{1/y}}\left(\log^{\circ s}_{y^{1/y}}(x) + y\right)\\
\]
is analytic. If it makes you happy, let's stick to that. And now we're solving for \(\varphi \approx 0\), which lets us solve Goodstein's equation.
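For integer \(s\) this expression can be checked by hand: with \(b = y^{1/y}\), the case \(s=0\) gives \(x+y\), \(s=1\) gives \(x \cdot b^y = x \cdot y\), and \(s=2\) unwinds to \(x^y\), so the formula really does interpolate the first three hyper-operations. A numeric sketch of the integer cases (my own illustration; the fractional-\(s\) iteration is of course the hard part):

```python
import math

def goodstein_expr(x, y, s):
    """exp_b^(circ s)(log_b^(circ s)(x) + y) for integer s >= 0, b = y**(1/y)."""
    b = y ** (1.0 / y)
    z = x
    for _ in range(s):          # apply log base b, s times
        z = math.log(z, b)
    z += y                      # the "+ y" in the middle
    for _ in range(s):          # apply exp base b, s times
        z = b ** z
    return z

x, y = 3.0, 4.0                 # x > eta and y > e, as in the post
assert abs(goodstein_expr(x, y, 0) - (x + y)) < 1e-9   # s = 0: addition
assert abs(goodstein_expr(x, y, 1) - (x * y)) < 1e-9   # s = 1: multiplication
assert abs(goodstein_expr(x, y, 2) - (x ** y)) < 1e-6  # s = 2: exponentiation
```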
4)
Of course it's non-commutative. I think you'd be hard pressed to genuinely believe that there'd be a commutative operator between addition and multiplication that is still holomorphic. I mean come on...
\[
a <s> b = b <s> a\\
\]
Well then, at \(s = 2\) we'd need \(a^b = b^a\), which fails in general; done, contradiction. It can't be commutative. Sure, an odd operator may be commutative, but good luck finding one.
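The failure of commutativity at \(s = 2\) is elementary; a trivial check (mine), which also shows the classical exceptional pair \((2,4)\), the only pair of distinct positive integers with \(a^b = b^a\):

```python
# Exponentiation is not commutative in general...
assert 2 ** 3 != 3 ** 2          # 8 != 9

# ...though isolated commuting pairs exist: 2^4 = 4^2 = 16.
assert 2 ** 4 == 4 ** 2

print("a^b = b^a fails in general; (2, 4) is the exceptional integer pair")
```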
5)
Well, I'm not sure what this means, so I won't take it as a slight.
I'm still roughing out how to do this, but locally everything is kosher. The trouble I'm having is making sure the pieces can be pasted together properly using the monodromy theorem.
6)
Also, your parody is pretty much exactly what I'm doing. But I'm not trying to solve the roots of a quintic polynomial in radicals. I'm just trying to describe the surface and achieve a Taylor expansion, which is perfectly possible (up to branching, of course).


